© 1997 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE. By choosing to view this document, you agree to all provisions of the copyright laws protecting it.








The Interactive Methodology


        Learning tools based on the interactive methodology may look similar to those based on animation and local simulation; they are, however, characterized by a very important methodological difference. Expositive and demonstrative learning tools are designed to show or demonstrate concepts or system behaviors: the learner is in charge of controlling presentation or simulation parameters only. Interactive tools, instead, directly require the learner to choose, describe or calculate something. They therefore imply an evaluation of the learner's answers.
        The need for stimulating the learner's activity has already been discussed. Our experience indicates that, in our field, no serious learning is possible without a continuous self-training activity. A weakness of animation and local simulation is that a student might go through them without paying the level of attention necessary to assimilate the topics presented. The necessity of performing a series of actions in response to requests made by the courseware, typical of the interactive methodology, demands instead a direct learner activity and, consequently, a deeper involvement in the study. The challenge of accomplishing what the tool asks for generally works in favor of the learner's motivation [19].
        From a programming point of view, the interaction just described is usually implemented in a very straightforward way: the student is asked to choose, among several available options, the right one, which is determined by the nature of the problem under consideration and "known" by the system. A typical example of such a problem with a unique solution is the step-by-step tracing of the timing diagram of an existing logical network, which the courseware compares with a stored one. Attempts are made to tackle problems that admit more than one approach, such as the conception of software algorithms or state machine diagrams, by pre-defining several solution paths instead of one. It must be stressed, though, that approaches based on expert system techniques are not taken into consideration.
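        The following short sketch illustrates, under our assumptions, the "unique solution" check just described: the learner's cycle-by-cycle trace of a timing diagram is compared with a solution stored in the courseware. The names (stored_trace, check_step) and the sample values are illustrative, not taken from the actual tool.

# Reference trace "known" by the system: one tuple of output values per clock cycle.
stored_trace = [
    (0, 0, 0),   # cycle 0: Q1, Q2, Q3
    (1, 0, 0),   # cycle 1
    (1, 1, 0),   # cycle 2
    (0, 1, 1),   # cycle 3
]

def check_step(cycle, learner_values):
    """Return True if the learner's values for this clock cycle match the stored solution."""
    return stored_trace[cycle] == tuple(learner_values)

# Example: the learner proposes output values for two cycles.
print(check_step(2, [1, 1, 0]))   # True  -> step accepted
print(check_step(3, [1, 0, 1]))   # False -> mistake pointed out by the courseware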
        Most of the interactive tools described in this chapter have been designed and developed taking into account the intuition and experience of our lecturers, who have translated into courseware tools the practical, operational answer to the most important needs of the learners [20]. The interactive methodology fulfills several pedagogical needs of the learning environment, described in the following paragraphs: it is used to test the proficiency level regularly, to develop tutoring exercises, and to introduce design training [21].



Test

        In our course, short, rapid multiple-choice and visual tests are one of the implementations of the interactive methodology. An attempt has been made to structure the material upon which the tests are based. The basic unit is the "question". The system is composed of standard multiple-choice questionnaires (MCQ) and interactive questions where the learner is requested to perform a task. For each action, the system provides feedback in terms of assessment of the action, help and explanation.
        Several questions, unified by a common subject and a quasi-uniform level of difficulty, make up a "level". Access to the next level is granted to the student who has reached a score above a certain threshold established by the teacher. At the end of the verification, the system evaluates the candidate with a final score, also providing detailed information and the time spent on each question. The auto-evaluation system, which has been built for all the relevant subjects of the course, therefore allows students to evaluate their knowledge level and, at the same time, to consolidate their understanding of digital electronics.
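        A minimal sketch of the structure just described is given below: questions grouped into levels, a teacher-defined threshold controlling access to the next level, and the time spent on each question recorded for the final report. The class and field names are illustrative assumptions, not taken from the original system.

import time
from dataclasses import dataclass

@dataclass
class Question:
    text: str
    choices: list
    correct: int          # index of the right choice, "known" by the system
    feedback: str = ""    # explanation shown after the answer

@dataclass
class Level:
    subject: str
    questions: list
    threshold: float      # fraction of right answers required, set by the teacher

def run_level(level, answer_fn):
    """Ask every question of a level; return the score and the per-question times."""
    right, times = 0, []
    for q in level.questions:
        start = time.time()
        answer = answer_fn(q)            # in the real tool: the learner's choice
        times.append(time.time() - start)
        if answer == q.correct:
            right += 1
    return right / len(level.questions), times

def next_level_allowed(level, score):
    """Access to the next level is granted above the teacher's threshold."""
    return score >= level.threshold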
        The example shown in Fig. 11 is a classical MCQ whose main purpose is the establishment of a common language to describe digital systems. The interface is common to the whole evaluation system: the buttons belonging to the bottom bar control the navigation within the test, and the score is indicated in red at the top right of the screen.
        The test of Fig. 12 is more demanding: it asks the learner to identify the situation where the clock frequency is too high for given values of t1 and t2. The obvious MCQ approach has been replaced by an operational procedure. The learner has to set the values of t1 and t2 with the proper buttons (center of the screen). The timing diagram changes accordingly, and the condition that is the object of the test has to be identified visually. In comparison with MCQ, this kind of test forces the learner to interact with the timing diagram and to rely more on non-verbal skills. A working example of the auto-evaluation system described here is available.
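        The condition behind the test of Fig. 12 can be sketched as follows, assuming that t1 and t2 are two delays that must both fit inside one clock period (for instance a propagation delay and a setup time); the exact delay model of the original tool is an assumption here, only the variable names follow the figure.

def clock_too_fast(t_clock, t1, t2):
    """True when the clock period t_clock cannot accommodate t1 + t2."""
    return t_clock < t1 + t2

# The learner sets t1 and t2 with the buttons; the tool redraws the timing
# diagram, and the learner must spot visually where this condition occurs.
print(clock_too_fast(t_clock=10, t1=4, t2=3))   # False: a 10 ns period is enough
print(clock_too_fast(t_clock=10, t1=6, t2=5))   # True: the clock frequency is too high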



Interactive Tutoring Exercises

        This series of tools is the result of an effort to foster understanding of fundamental issues that our experience has shown to be particularly difficult for the beginner to master [22]. Most of these problems appear in the chapter of the course dedicated to the design of digital systems controlled by FSM. The tools developed so far deal with the following issues:

        All the tools use a standard interface and, even if they deal with different issues, they are characterized not only by the same interactive methodology but also by a similar pedagogical sequence, sketched below. They all start with a statement of the problem, followed by an animated explanation that at the beginning works as an "instruction manual" for the tool and then provides, step by step, the full solution of the exercise. The learner has full control of the animation and may leave it at any time to start solving the exercise alone, using the interactive part of the tool. In this second part, she/he does not get any more explanations, but the work is checked step by step and mistakes are pointed out and counted against the score. Switching between the two parts of the tool is always possible. We provide in the following a few examples of interactive tutoring exercises, supported by working demonstrations.
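        The two-mode sequence shared by the tutoring exercises can be summarized in code as follows: an animation mode that replays the stored solution step by step, and a solution mode that checks each learner action against the same stored steps, counting mistakes against the score. The structure is an assumption drawn from the description above, not the actual courseware code.

# Worked solution stored by the author of the exercise, one entry per step.
solution_steps = ["step 1 answer", "step 2 answer", "step 3 answer"]

def animation_mode():
    """Replay the full worked solution, one step at a time, under learner control."""
    for step in solution_steps:
        print("ANIMATION:", step)

def solution_mode(learner_answers):
    """Check the learner's work step by step; mistakes are counted, not explained."""
    mistakes = 0
    for expected, given in zip(solution_steps, learner_answers):
        if given != expected:
            mistakes += 1
            print("Mistake at this step")   # pointed out and counted against the score
    return mistakes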
        The tool of Fig. 13 is dedicated to training the learner in the analysis of an existing logical network, an important skill for a digital designer. It guides the learner to understand the behavior of a sequential network composed of three D-type flip-flops and an EXOR gate. The target of the tool is the definition of the network timing diagram, given the input signals. In the animation mode, after an introduction to the use of the tool, a practical method to perform the analysis is presented and applied in full. In the solution mode (interactive), the learner practices the construction of the timing diagram, using the buttons at the top right of the screen to define the values of the three outputs for each clock cycle. When the learner finds it difficult to proceed in the diagram construction, she/he may go back, using the button with the open eye, to a complete animated explanation of each step (working example).
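        A reference timing diagram for such a network could be generated as sketched below, assuming the three D-type flip-flops are clocked together and the EXOR of two outputs feeds the first flip-flop; the wiring of the network actually shown in Fig. 13 is not reproduced here, the connections are an illustrative assumption.

def simulate(cycles, q1=1, q2=0, q3=0):
    """Return the list of (Q1, Q2, Q3) values, one tuple per clock cycle."""
    trace = [(q1, q2, q3)]
    for _ in range(cycles):
        d1 = q2 ^ q3            # EXOR feeding the first flip-flop (assumed wiring)
        # On the clock edge each D flip-flop copies its input to its output.
        q1, q2, q3 = d1, q1, q2
        trace.append((q1, q2, q3))
    return trace

# The learner's cycle-by-cycle answers are compared against this stored trace.
for cycle, outputs in enumerate(simulate(6)):
    print(cycle, outputs)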
        The relation between state sequence and time evolution is sometimes a difficult issue for the beginner to grasp: courseware tools, taking advantage of animation and interaction features, may do a better job than traditional textbooks, which are limited to static images and textual explanations. The relation between an ASM chart and its timing is explored with the tool of Fig. 14, whose purpose is to guide the learner to build the timing diagram (bottom) corresponding to a given ASM (top). The pedagogical approach is the same as in the previous example: the animation mode explains the behavior of the simple synchronous FSM, while the solution mode allows students to practice (working example).
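        The correspondence trained by the tool of Fig. 14 amounts to stepping a synchronous FSM one clock cycle at a time, each cycle producing one column of the timing diagram. The sketch below uses an invented three-state machine with one input X and one output Z; the ASM of the figure is not reproduced.

asm = {
    # state: (output Z, next state if X == 0, next state if X == 1)
    "A": (0, "A", "B"),
    "B": (0, "C", "B"),
    "C": (1, "A", "B"),
}

def timing_diagram(x_values, state="A"):
    """One row per clock cycle: (current state, input X, output Z)."""
    rows = []
    for x in x_values:
        z, next0, next1 = asm[state]
        rows.append((state, x, z))
        state = next1 if x else next0
    return rows

for row in timing_diagram([1, 1, 0, 1, 0, 0]):
    print(row)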



Interactive Construction of Algorithms

        A major problem in digital design is to train a learner in the conception of a system. This usually implies the construction of a proper algorithm, which in our case can be a state machine, a software program or a hardware architecture. Such a task is difficult to teach because it requires, in addition to the knowledge of the design rules, the conception of something that did not exist before. To make a comparison (a little exaggerated, indeed), it is the same difference as between knowing the rules of grammar and writing a meaningful text.
        The basic philosophy used for implementing our interactive algorithm construction is to divide the algorithm into an ordered set of steps, and to let the learner choose among several possibilities for each step.
        These choices are presented to the student in the form of multiple-choice questionnaires (MCQ): identifying the right one means defining one algorithm step. When the interactive tool is developed, the most likely questions that might arise at the different steps of the exercise are identified. In the case of wrong answers, text fields and graphics explaining why the answer was incorrect are shown. A score is generally computed; it is not used for "grading" but rather to determine whether the student needs more work on the specific problem. Interactive exercises are generally built upon a unique solution of a problem. This limitation is acceptable when the purpose of the exercise is to guide the learner toward a specific solution. Nevertheless, training for real design implies the removal of this limitation: in our courseware, unlimited freedom of design is achieved using the general-purpose tools described in the following chapter. It must be understood, though, that this implies renouncing the "guide" function provided by this methodology. In a few cases, like the following example, we have been able to implement interactive exercises, as described above, that allow the learner to choose among a set of alternative solutions. In other words, choices made during the interactive exercise determine the construction of a particular ASM among the ones implemented by the author. To stimulate the learner's attention (to put it differently, to fight the "video game" attitude), the tool sometimes accepts wrong solutions for a while and then warns the learner.
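        A minimal sketch of this step-by-step construction, under the assumptions stated in the text: the algorithm is divided into an ordered set of steps, each step offers several choices, and a step may accept more than one choice when the author has pre-defined alternative solution paths. The step contents and names below are invented for illustration.

steps = [
    {"choices": ["block A", "block B", "block C"], "accepted": {"block A"}},
    {"choices": ["block D", "block E"],            "accepted": {"block D", "block E"}},  # two valid paths
    {"choices": ["block F", "block G"],            "accepted": {"block F"}},
]

def run_exercise(learner_choices):
    """Check each chosen step; return the number of wrong answers, used to decide
    whether more work on the problem is needed, not for grading."""
    wrong = 0
    for step, choice in zip(steps, learner_choices):
        if choice not in step["accepted"]:
            wrong += 1
            print("Wrong choice:", choice)   # in the real tool: text and graphics explain why
    return wrong

print(run_exercise(["block A", "block E", "block G"]))   # one mistake, on the last step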
        Fig. 15 shows a frame of the interactive tool for FSM construction. The process starts from the design specifications and schematics of the system (not shown here). The student builds the ASM chart interactively by choosing the right ASM block from a set of available ones (on the left in the figure). In this particular case, the blocks are dragged with the mouse and dropped in the workspace in the middle of the screen where, if the choice is right, they are automatically attached to the chart under construction, and a window appears with explanations and comments. If the choice is wrong, the learner is notified by a window with a text explaining the mistake. Buttons on the ASM construction screen allow the learner to recall support information such as the timing diagram or the problem text and schematics. Additional information, such as data sheets of the digital components used in the system and an introductory explanation of the problem, is part of the tool (working example).
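        The drag-and-drop check of Fig. 15 reduces to the sketch below: a dropped ASM block is either attached to the chart under construction (right choice, with explanations shown) or rejected with a message (wrong choice). The block names and messages are illustrative assumptions.

chart = []                       # ASM chart under construction
expected_sequence = ["init", "read input", "test flag", "write output"]

def drop_block(block):
    """Handle a block dropped in the workspace."""
    position = len(chart)
    if block == expected_sequence[position]:
        chart.append(block)
        return "Attached: " + block          # plus a window with explanations and comments
    return "Wrong block here: " + block      # window explaining the mistake

print(drop_block("init"))          # attached to the chart
print(drop_block("test flag"))     # rejected: "read input" is expected next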
        The same interactive methodology is applied to the development of computer programs (in our course, mostly in Assembly language). Instead of ASM blocks, instructions are presented as multiple choices.





Figure 11


Fig. 11. A multiple-choice question on digital systems at the most elementary level of difficulty.





Figure 12


Fig. 12. An interactive test evaluates the understanding of the effects of delays on the operation of a FSM.





Figure 13


Fig. 13. Tutoring exercise assisting the tracing of the timing diagram of a sequential network.





Figure 14


Fig. 14. Tutoring exercise for training the learner in establishing a correspondence between state (top) and timing (bottom) of a FSM.





Figure 15


Fig. 15. Interactive construction of a FSM with "drag and drop" functions.





Working Examples