
Group Decision Support System for Assessment of

Problem-based Learning

Jian Ma

Department of Information Systems

City University of Hong Kong

Kowloon Tong, Hong Kong

Tel: (852) 27888514; Fax: (852) 27888694

Email: isjian@msmail.is.cphk.hk

Abstract

Assessment is an important task in problem-based learning (PBL). It has a strong influence on students' approaches to learning and on their outcomes. In order to provide an open and fair method of assessing the outcomes of PBL, a group decision support system (GDSS) has been developed. The system uses mathematical decision methods for collecting assessment criteria and their appropriate weights, and databases for storing students' details and data on the assessment criteria. It integrates the databases with model bases in a cooperative decision-making environment and is implemented on a client-server platform using a relational database management system. The GDSS has been used to improve the efficiency and quality of assessing students' outcomes in PBL.

Introduction

Methods of problem-based learning (PBL) have recently been introduced in the teaching of engineering and science courses [1,2,3]. In PBL, students are first exposed to a practical problem in the form of a project. They are required to work in groups and to acquire other skills and knowledge not taught in the classroom in order to solve the problem. PBL encourages students to participate in the whole learning process and stimulates their curiosity. It trains students to become independent learners and encourages them to take a deep approach to learning [1, 4]. In a deep approach, students focus their attention on the overall meaning or message in a learning context such as a class session, text or situation. They attempt to relate ideas and construct their own meanings. Such an approach exemplifies the type of learning that employers and lecturers hope students will demonstrate. The deep approach also embodies the imaginative and adaptive skills and wide sphere of interests that are increasingly demanded in the world of work [2, 4].

The assessment of students' outcomes is a very important task in the whole learning process of PBL. It has a strong influence on students' approaches to learning and their outcomes [5, 6]. Several methods have been used for the assessment of PBL. They include mini-essay formats [7], modified essay questions (MEQs) [8] and laboratory tests [9].

In this paper, MEQs and laboratory tests are used to assess students' outcomes in PBL. It sometimes happens, however, that the assessment criteria and their corresponding weights are determined solely by the lecturer in charge, which reduces students' autonomy in the learning process and lowers the quality of PBL. In order to provide an open and fair method of assessing the outcomes of PBL, a group decision support system (GDSS) has been developed. The system makes use of mathematical decision methods [10, 11] for collecting the assessment criteria and determining the corresponding weights from the students and lecturers. The aim is to meet the requirements of individual students as well as those of the group as a whole.

By definition, a GDSS [12, 13] is an interactive computer-based system which facilitates solutions of unstructured problems by several decision makers working together as a group. The software components of the GDSS include a database, a model base which consists of specialized application programs to be used by the group, and an easy-to-use, flexible user interface. Several GDSS systems have been developed for group decision making.

GROUPSYSTEMS [12] is a general-purpose GDSS developed at the University of Arizona. The system can be used by group members to input ideas, comments, votes, or electronic messages to other group members. GROUPSYSTEMS uses the Delphi technique and the nominal group technique to collect comments from group members. It also gives group members the ability to register their preferences on an issue on a variety of preference scales.

MEDIATOR [14] is a negotiation support system. It is a special type of GDSS emphasizing computerized assistance for situations in which strong disagreement on factual or value judgments exists between group members. Negotiation support systems focus on enhancing the prospect of consensus, with the intent of making compromise possible. The MEDIATOR includes the ability for each group member to formulate and store issues of importance to the group and the ability for each group member to display selected information on a public screen.

GROUPSYSTEMS is a model-oriented GDSS [12, 13]. It may be useful in collecting the assessment criteria and determining their corresponding weights. However, it lacks the capability to store students' details and to support the exchange of knowledge and experience among students throughout the PBL process. MEDIATOR is a special type of GDSS designed for specific applications and is not suitable for the proposed application. Several other GDSS systems have been reported in [13, 15]; however, they were developed for particular application domains.

In PBL, students are the decision makers who contribute assessment criteria as well as the end-users who share information and communicate with other group members in the learning process. For this purpose, a GDSS has been developed using a network-based relational database management system. The GDSS is a client-server application in the sense that it allows lecturers and students to log in and run the program on any client machine on the campus-wide network. The system uses mathematical decision methods for collecting assessment criteria and determining appropriate weights, and uses databases for storing students' details and data on the assessment criteria. It also uses a graphical user interface to integrate the databases with decision models in a cooperative decision-making environment for the assessment of PBL.

Assessment of problem-based learning

Assessment of students' projects is always a difficult task, which has a strong influence on students' approaches to learning and the outcomes of PBL. Ramsden [16] states that perhaps the most important single influence on students' learning is their perception of the assessment requirements. It is in the assessment process that the greatest opportunity arises for students' perceptions of the educational context and their understanding of concepts to diverge.

Recent developments in PBL have shifted the emphasis from teacher-centered learning to student-centered learning [17]. With the latter, students are expected to take an active part in planning, organizing and evaluating their learning. For this purpose, a formal decision method in fuzzy mathematics is used to obtain the assessment criteria and their weights from the students [11]. The procedure can be simplified and divided into four steps:

Step 1: Study existing methods of assessing students' projects and propose a basic set of assessment criteria. The criteria should meet the objectives of the course.

The present practice is for students' project work to be assessed in two parts: a project report and a demonstration. Students present their work of design and implementation in their reports; they also demonstrate the applicability and efficiency of the implemented system.

Our objectives are not only to teach students factual knowledge of principles and the design methodology of information systems, but also to develop their adaptive skills and interests in solving practical problems. A basic set of assessment criteria is therefore proposed which aims at encouraging students to take the deep approach to learning. The criteria include: the correctness, completeness and readability of the software design, and the reliability, flexibility and efficiency of the system implementation.

Step 2: Give the basic set of assessment criteria to three groups of people for evaluation: lecturers in charge of the course, students attending the class and other lecturers teaching similar courses. Their evaluation forms are collected and analyzed. As a result, some new criteria are likely to be added and the existing ones modified.

The basic set of criteria was explained and formally defined in a tutorial class. Students were then asked to read the assessment criteria and to contribute their own ideas of how to evaluate the project. Some of them commented that the correct use of the methods for designing a software system was very important. The teaching staff of the Department made it clear that developing students' ability to write the report and to answer the question was a crucial part of their teaching goal.

Step 3: Analyze the feedback on the assessment criteria and send it back to the above groups for a second evaluation. The final set of assessment criteria is formed by repeating steps one to three until a satisfactory set is reached.

Several students and members of the teaching staff were interviewed after the feedback on the assessment criteria had been collected and analyzed. They all agreed with the assessment criteria; some students also responded favorably concerning the method and said that they were clear about what they needed to achieve.

Step 4: Propose weights for the assessment criteria. These weights are then collected and calculated.

As agreed with students at the beginning of the semester, the project assessment was divided into two main parts: a project report and a demonstration. In our past experience with a database course [1], the weights on the assessment criteria for both the project report and the demonstration were collected and calculated (with some rounding) using the group decision support system, and the resulting weights were entered on the final project marking form shown in Figure 1.
Project Marking Form

Marking Allocation: Project Report 60%; Demonstration 40%
E = Excellent; G = Good; S = Satisfactory; P = Poor.

1. Project Report (60%)                               Your Marks: E  G  S  P
   Completeness of the database design (25%)
   Correctness of the database design (25%)
   Expressiveness of the database design (15%)
   Structure of the project report (15%)
   Correct use of methods (20%)

2. Demonstration (40%)                                Your Marks: E  G  S  P
   Reliability of the system implementation (40%)
   Efficiency & flexibility of the implementation (30%)
   Responses to questions (30%)

Figure 1: A sample project marking form in PBL

It can be seen from Figure 1 that a weight of 60 percent was given to the project report and 40 percent to the demonstration. The assessment criteria for the project report were further divided into: completeness of the database design, correctness of the database design, expressiveness of the database design, structure of the project report and correct use of methods, with weights of 25%, 25%, 15%, 15% and 20% respectively. The assessment criteria for the demonstration were: reliability of the system implementation, efficiency and flexibility of the system implementation and responses to questions, with weights of 40%, 30% and 30% respectively.
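Under these weights, a group's overall project mark is a weighted sum of its marks on the individual criteria. The following is only an illustrative sketch, assuming each criterion has already been given a numerical mark (for example on a 0-100 scale, after the E/G/S/P grades are converted); the symbols R1-R5 and D1-D3 are hypothetical names for the report and demonstration criterion marks, not notation used elsewhere in this paper:

Total = 0.6 x (0.25 R1 + 0.25 R2 + 0.15 R3 + 0.15 R4 + 0.20 R5)
      + 0.4 x (0.40 D1 + 0.30 D2 + 0.30 D3)

For instance, a group marked 80 on every report criterion and 70 on every demonstration criterion would obtain 0.6 x 80 + 0.4 x 70 = 76.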

The project marking form has been used to assess students' project work. The quantitative data on marks were processed using the GDSS specially designed for the processing of data. In comparison with the old methods of assessing projects, the proposed method encourages students to participate in the whole learning process and uses objective criteria to assess the outcomes of students' projects in PBL. It creates a non-threatening environment of learning and provides a fair and competent basis of assessment.

Design and implementation of the system

In order to improve the quality and efficiency of PBL, a group decision support system has been developed. Because of the nature of the problem and the fact that it involves processing a large amount of data, the network-based relational database management system INGRES [18] was chosen for the implementation. INGRES provides the tools and application utilities INGRES/MENU, INGRES/QUERY, INGRES/FORMS, INGRES/REPORTS, INGRES/GRAPHICS and INGRES/APPLICATIONS. It therefore offers the capability of managing data, programming in a fourth-generation language (4GL) and building user interfaces using FORMS and REPORTS. INGRES makes use of UNIX facilities for distributed computing and supports client-server applications. The proposed GDSS is thus implemented on a client-server platform, as shown in Figure 2.

Figure 2: A client-server GDSS architecture for assessment of PBL.

In Figure 2, students and lecturers can log in to any UNIX computer (or any terminal emulating the UNIX environment) and telnet to one of the client machines on the campus-wide network. They can then nominate the assessment criteria and contribute the corresponding weights at any time within the first five weeks of the semester. Students' project work is also submitted to a class account on the server machine. The lecturer can therefore log in to the server machine from a remote terminal and run the GDSS to obtain the final assessment form. Lecturers mark students' projects in real time.

In PBL, students and the lecturers in charge first nominate a set of assessment criteria according to the objectives of the course; they then contribute weights for each of the criteria. The weights are evaluated and a final set of criteria is formed. The criteria are used to assess students' projects. Using the extended entity-relationship model [19] for the conceptual design of the database structure, the resulting conceptual schema is shown in Figure 3.

Figure 3: Conceptual database schema for assessment of PBL.

In Figure 3, PERSON, PROJECT, WEIGHT and CRITERIA are entities, and STUDENT and LECTURER are specialized entities of PERSON. The PROJECT is the task of PBL; thus the relationship between PROJECT and STUDENT is one-to-many (1:N). The conceptual schema in Figure 3 can be transformed into the following logical schema:

PERSON(PID, Name);

LECTURER(PID, Appointment, Duty);

STUDENT(PID, Proj-No, Contribution, LearnAppr);

PROJECT(Proj-No, Total-Mark);

WEIGHT(WNO, PID, Type, W1, W2, ..., Wn);

AVGWEIGHT(W1, W2, ..., Wn);

CRITERIA(C1, C2, ..., Cn);

where the AVGWEIGHT is an intermediate table for storing the average weights of the corresponding criteria.
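For illustration, the logical schema above could be declared in INGRES SQL roughly as follows. This is only a minimal sketch, assuming three assessment criteria (hence three weight columns W1-W3) and underscore-separated identifiers in place of the hyphenated names, since hyphens are not valid in SQL identifiers; the LECTURER and CRITERIA tables would follow the same pattern:

CREATE TABLE PERSON (
    PID   VARCHAR(16) NOT NULL,      -- login ID, key attribute
    Name  VARCHAR(40)
);

CREATE TABLE PROJECT (
    Proj_No     INTEGER NOT NULL,    -- project group number
    Total_Mark  FLOAT                -- cumulated mark for the whole group
);

CREATE TABLE STUDENT (
    PID           VARCHAR(16) NOT NULL,  -- references PERSON
    Proj_No       INTEGER,               -- references PROJECT (1:N)
    Contribution  FLOAT,                 -- percentage of contribution to the project
    LearnAppr     VARCHAR(12)            -- learning approach
);

CREATE TABLE WEIGHT (
    WNO   INTEGER NOT NULL,          -- number of the submitted weight set
    PID   VARCHAR(16) NOT NULL,      -- contributor, references PERSON
    Type  CHAR(1),                   -- 'S' = student, 'L' = lecturer
    W1    FLOAT,
    W2    FLOAT,
    W3    FLOAT                      -- one column per assessment criterion
);

CREATE TABLE AVGWEIGHT (
    W1  FLOAT,
    W2  FLOAT,
    W3  FLOAT                        -- final combined weight per criterion
);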

The weights for the assessment criteria are collected from the two major groups participating in the rating of the criteria, i.e., students and the lecturers in charge of the course, and are combined according to the decision method in fuzzy mathematics. A sample format of the collected data on the weights of the criteria is shown in Table 1:

Table 1: Weights for assessment criteria collected from both students and lecturers in charge

PID         Type   W1    W2    ...   Wn
s2011891    S      W11   W12   ...   W1n
jianm       L      W21   W22   ...   W2n
...         ...    ...   ...   ...   ...
s2113344    S      Wm1   Wm2   ...   Wmn

In Table 1, 'S' and 'L' in Type field stand for students and lecturers respectively.

Now let r1 and r2 be the ratings for students' weights and lecturers' weights respectively; the final weight for the corresponding criterion is then determined by:

Wj = r1 W'j + r2 W''j

where W'j and W''j are the average weights contributed by students and lecturers respectively.
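As a purely illustrative example (the figures below are assumed, not taken from the study): with ratings r1 = 0.6 and r2 = 0.4, an average student weight W'j = 0.20 and an average lecturer weight W''j = 0.30 for some criterion j, the combined weight would be Wj = 0.6 x 0.20 + 0.4 x 0.30 = 0.24.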

The weights for the final criteria can be computed using INGRES SQL commands and the resulting weights stored in the table AVGWEIGHT, as shown below:

INSERT INTO AVGWEIGHT
SELECT r1*AVG(F.W1) + r2*AVG(S.W1),
       ...,
       r1*AVG(F.Wn) + r2*AVG(S.Wn)
FROM   WEIGHT F, WEIGHT S
WHERE  F.Type = 'S' AND S.Type = 'L';
-- F ranges over the students' weight rows and S over the lecturers' rows;
-- the averages are taken independently over each group, so no join
-- condition between F and S is required.

Students' projects are evaluated against the assessment criteria; the total mark for a project group is the sum of the weighted marks from the assessment form. Each group member is required to record his or her percentage of contribution to the project, so the mark for an individual student is determined by:

SELECT F.PID, Name, Contribution*Total-Mark

FROM STUDENT F, PROJECT S, PERSON L

WHERE F.Proj-No=S.Proj-No AND F.PID=L.PID;

where Total-Mark is the cumulated mark for the whole group, and the mark for an individual student is the product of his or her percentage of contribution and the group's cumulated mark.
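For example (with hypothetical figures), in a group whose cumulated mark is 80, a member who records a 90% contribution would receive 0.9 x 80 = 72 under this rule.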

Security control on the use of the GDSS is very important. For this purpose, GDSS users are classified into four different groups according to the tasks they perform. These groups have different access permissions as shown in Figure 4.

Figure 4: Permission hierarchy for GDSS users

In Figure 4, the system administrator (SA) is responsible for the installation of the relational database management system INGRES and the GDSS application software. The SA also creates user accounts and grants users access permissions on the server and selected client machines. The database administrator (DBA) is responsible for creating the GDSS user accounts, granting access permissions to the database, and assigning the user groups in the GDSS application. The database owner (DBO) is normally the lecturer in charge of the PBL. He or she is responsible for granting students access permissions on selected objects such as tables, views and procedures. The DBO also maintains the database content for the GDSS application. A database user (DBU) is a student or a lecturer assigned to use the GDSS in the process of PBL. He or she may have restricted permissions to access certain objects such as tables and views. The database users are therefore further divided into two groups, as shown in Figure 5.

Figure 5: User groups for the GDSS application

In Figure 5, student and lecturer are the groups of students and lecturers participating in the process of PBL. DBO is the group of lecturers in charge of the course; it is responsible for making the final decision on the selection of the assessment criteria.
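For illustration, the DBA could set up these groups and their table-level permissions in INGRES SQL roughly as follows. This is only a sketch under assumed account names (the user IDs are those appearing in Table 1), and the group-management syntax may vary with the INGRES version in use:

CREATE GROUP student WITH USERS = (s2011891, s2113344);  -- all students in the class
CREATE GROUP lecturer WITH USERS = (jianm);              -- lecturers participating in PBL

-- both groups may read the agreed criteria and submit their own weight rows
GRANT SELECT ON CRITERIA TO GROUP student;
GRANT SELECT ON CRITERIA TO GROUP lecturer;
GRANT INSERT ON WEIGHT TO GROUP student;
GRANT INSERT ON WEIGHT TO GROUP lecturer;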

To ensure that each student can see only his or her own marks for the project, the DBO creates a student view for the student group participating in the process of PBL, as shown below:

CREATE VIEW StudentMark (Student_ID, Name, Marks)
AS SELECT F.PID, Name, Contribution*Total-Mark
   FROM  PERSON F, STUDENT S, PROJECT L
   WHERE F.PID = S.PID AND S.Proj-No = L.Proj-No AND F.PID = &loginID
   GROUP BY S.Proj-No

where &loginID is a macro that is replaced by the current user's login ID. The lecturer in charge of the PBL can then grant read permission on the view StudentMark to the student group as follows:

GRANT SELECT ON StudentMark TO student

where "student" is a group consisting of all the students participating in the process of PBL.

Students' approaches and their outcomes of PBL

Quantitative data that might disclose differences in the students' approaches to learning were obtained using Biggs's [20] Study Process Questionnaire (SPQ). It contains 21 items classified into six categories, each item being a self-report statement of a motive or a strategy. The categories are constructed to measure:

- deep motive, e.g. "While I am studying, I often think of real life situations to which the material that I am learning would be useful";
- achieving motive, e.g. "I think browsing around is a waste of time, so I only study seriously what's given out in class or in the course outline";
- surface motive, e.g. "I summarized suggested readings and include these as part of my notes on a topic";
- deep strategy, e.g. "In reading new materials, I often find that I am continually reminded of material I already know and see the latter in a new light";
- achieving strategy, e.g. "I generally restrict my study to what is specifically set as I think it is unnecessary to do anything extra"; and
- surface strategy, e.g. "I try to work consistently throughout the term and review regularly when the exams are close".

The respondents rate themselves on each statement on a 5-point scale, from 5 ('this item is always or almost always true of me') to 1 ('this item is never or only rarely true of me'). The 5-point scale is designed to allow students' scores to be rated in broad terms: well below average, below average, average, above average and well above average. Since each approach combines the corresponding motive and strategy items (seven items in all, each rated from 1 to 5), the range of possible scores for an approach is from 7 to 35, with a mid-point of 21.

120 of the 164 third-year undergraduate students enrolled in a semester-long database course participated in the study; the others were absent from class on the day the form was circulated and collected, or chose not to complete it. The collected data were analyzed using the statistical software package SAS [21] and the results are shown in Table 2.

Table 2: Students' approaches to learning

                      Deep    Achieving   Surface
Mean                  22.88   24.45       21.45
Standard Deviation     4.81    3.88        5.61

It can be seen from Table 2 that the mean value for students with an achieving approach (a combination of the achieving motive and achieving strategy) is well above the mid-point value, the mean value for students with a deep approach is above the mid-point value, and the mean value for students with a surface approach is close to the mid-point value. Data on students' approaches have been plotted against their outcomes in PBL. The statistical results show a significant linear relationship between students taking a deep or an achieving approach to learning and their outcomes in PBL.

Summary

A group decision support system has been developed and used for assessing the outcomes of PBL. The system uses a decision method from fuzzy mathematics for the assessment; it invites students as well as lecturers to nominate the assessment criteria and to contribute the corresponding weights. It also stores students' details in the database and assists lecturers in assessing students' projects against the agreed criteria. The system improves the quality of PBL as well as the efficiency of assessing students' projects. It also creates a non-threatening environment for learning and provides a fair method of assessment.

Acknowledgment

The project work was funded by the 1993 National Teaching Development Grant of Australia (project no: E250.261), the 1994 Strategic Research Grant of City University of Hong Kong (project no: 700410) and the 1994/95 Hong Kong Action Learning Project Grant (project code: CITYU23). Mr. Qi-dong LOU, a research associate at the School of Computer Science and Engineering, University of New South Wales, participated in the design and implementation of the system.

References

[1]. J. Ma, "Problem-based Learning with Database Systems," Computers and Education, Vol. 22, No. 3, 257-263, 1994.

[2]. D. Boud (Ed), Problem-based Learning in Education for the Professions, Higher Education Research and Development Society of Australia, Sydney, 1985.

[3]. J. Ma, "Problem-Based Learning and Students' Approaches in Information Systems", in M. Ostwald and A. Kingsland (Eds), Research and Development in Problem-based Learning, Charles Sturt University Press, Australia, 101-110, 1994.

[4]. F. Marton and R. Saljo, "Approaches to Learning," in F. Marton, D. J. Hounsell and N. J. Entwistle (Eds), The Experience of Learning, Edinburgh: Scottish Academic Press, 1984.

[5]. F. Marton, D. J. Hounsell and N. J. Entwistle (Eds), The Experience of Learning, Edinburgh: Scottish Academic Press, 1984.

[6]. L. Dahlgren, "Outcomes of Learning," in F. Marton, D. J. Hounsell and N. J. Entwistle (Eds), The Experience of Learning, Edinburgh: Scottish Academic Press, 1-18, 1984.

[7]. G. Gibbs, S. Habeshaw and T. Habeshaw, 53 Interesting Ways to Assess Your Students, Bristol: Technical and Educational Services, 1986.

[8]. G. Feletti and E. K. Smith, "Modified Essay Questions: are they worth the effort?" Medical Education 20, 126-132, 1986.

[9]. D. Boud, J. Dunn and E. Hegarty-Hazel, Teaching in Laboratories, Guildford, Surrey: STHE & NFER-Nelson, 1986.

[10]. A. Kaufman and M. M. Gupta, Fuzzy Mathematical Models in Engineering and Management Science, North-Holland, 1988.

[11]. J. Ma, "Use of Fuzzy Mathematics for Assessing Problem-Based Learning," Proc. of 3rd International Conference on Fuzzy Logic, Neural Nets and Soft Computing, 277-279, Fukuoka, Japan, August, 1994.

[12]. J. F. Nunamaker, Jr., L. M. Applegate and B. R. Konsynski, "Facilitating Group Creativity with GDSS," Management Information Systems 3 (4): 5-19, 1987.

[13]. D. L. Olson and J. F. Courtney, Decision Support Models and Expert Systems, Maxwell Macmillan, 1992.

[14]. M. Jelassi and A. Foroughi, "Negotiation Support Systems: An Overview of Design Issues and Existing Software," Decision Support Systems 5: 167-81, 1989.

[15]. M. K. O. Lee and J. Pow, "Information Access Behavior and Expectation of Quality: two factors affecting the satisfaction of users of clinical hospital information systems," Journal of Information Science, Vol 22 (3), 1995.

[16]. P. Ramsden, "Studying Learning: Improving Teaching," in P. Ramsden (Ed.), Improving Learning, New Perspectives, Kogan Page, London, 13-31, 1988.

[17]. D. Boud, "Problem-based Learning in Perspective," in D. Boud (Ed.), Problem-Based Learning in Education for the Professions, Higher Education Research Development Society of Australia, Sydney, 13-19, 1985.

[18]. C. J. Hursch and J. L. Hursch, INGRES SQL Developer's Guide, Windcrest/McGraw-Hill, 1992.

[19]. C. Batini, S. Ceri and S. B. Navathe, Conceptual Database Design: An Entity-Relationship Approach, Benjamin/Cummings Publishing Company, 1992.

[20]. J. B. Biggs, Student Approaches to Learning, Australian Council for Education Research, Melbourne, 1987.

[21]. D. G. Kleinbaum, L. L. Kupper and K. E. Muller, Applied Regression Analysis and Other Multivariable Methods, 2nd edn. PWS-KENT, Boston, 1988.