
Web Based Tools For Tertiary Teachers

Angela Carbone, Matthew Drago

angela@cs.monash.edu.au, mattd@cs.monash.edu.au

Department of Computer Science

 

Ian Mitchell

ian.mitchell@education.monash.edu.au

Faculty of Education

Monash University

 

Abstract

This paper describes a web-based application, developed in the Department of Computer Science, that involved the design of a database to capture and analyse teaching skills and review the set tasks. The database includes information, as provided by tutors, on the teaching practice itself, the context in which it occurs, processes of review and improvement, and the academic's general views on teaching and learning. The package supplements the work of a project commissioned by the Committee for the Advancement of University Teaching in June 1995, titled Reflections on University Teaching.

Introduction

In 1994, the Dean of the Faculty of Computing and Information Technology (FCIT) approached the Dean of Education for assistance in tackling a perceived problem in the teaching of computing in the Departments of Computer Science and Software Development. The main concerns were high failure rates, a low flow of students into higher degrees and a perception of wide variation in the teaching skills of the tutors (Mitchell, Macdonald, Gunn, & Carbone 1996).

As a result, a collaborative team known as "Edproj" was established, composed of staff members from the two FCIT departments and the Education Faculty. During 1995, Edproj sought to establish the existing teaching and learning structures and dynamics in first year programming. This involved interviews with lecturers, tutors and students; observations of lectures, discussion classes and laboratories; and close evaluation of the content of the courses.

Observations made by Edproj revealed that the quality of some teaching was below a minimum acceptable standard. Data from student interviews and surveys indicated that the tutorials and labs were their most important learning situations, yet little effort was made to train those in the front line of student contact. This had serious negative consequences for the students' learning.

Edproj's initial findings also showed that students placed paramount emphasis on getting the set tasks done, meeting the requirements and then stopping. Concepts which were not stated for assessment were ignored. Tasks intended by lecturers to be gateways to exploration and reflection were seen by students merely as hurdles to be negotiated.

As a result, Edproj developed a training programme which tutors and demonstrators attended at the beginning of the year (Carbone, Mitchell & Macdonald 1996). The programme comprised an initial three-day workshop and fortnightly meetings aimed at assisting new tutors and demonstrators in their role. At the fortnightly meetings, teaching strategies were discussed, and feedback regarding the lessons and the set tasks was recorded and transcribed into a World Wide Web document (Carbone & Mitchell 1996). The document, although containing useful information, was difficult to maintain, and another method of recording and capturing individuals' teaching strategies was needed. As a result, a web-based database application was designed.

This paper describes the database, known as "TeachTools", designed to capture and analyse teaching skills and review the set tasks. TeachTools supplements the work of a project commissioned by the Committee for the Advancement of University Teaching in June 1995, titled Reflections on University Teaching (CAUT 1996). The database includes information, as provided by tutors, on the teaching practice itself in the context in which it occurs, and includes their review of the set tasks with suggestions for improvements. This information is intended to be used to refine the tasks set in the course and to improve teaching strategies, so as to motivate students and enhance their learning and understanding.

Components and Description of TeachTools

The database design consists of four components: the Configuration and Administration of the Database; two collection areas, the Fortune Line and the Question Evaluator; and the Analysers, which present the collected information.

Configuration and Administration of TeachTools

Figure 1: Self-grading and evaluation component of the Fortune Line

The TeachTools database is configured with details about the administrator, the subject under review, and the laboratory and tutorial tasks for evaluation. The administration process involves requesting to become an administrator and adding the details of users to enable them to record their teaching strategies and review the set tasks. More specifically, the data contains:

Figure 2: Example of a practical exercise sheet.

Tutor Fortune Line.

The Tutor Fortune Line allows staff to self-evaluate their teaching on a weekly basis and to record the teaching strategies used in each lesson. Staff can also view and edit previously entered self-evaluations and comments. Figure 1 illustrates the self-grading and evaluation component of the Fortune Line. Staff select one of their assigned classes and a week (from the session identification box) in which to review their performance. Staff record and reflect on their teaching strategies, enter their own comments about the lesson and give their lesson a rating.
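
To make the stored record concrete, the following hypothetical SQL statement shows how one weekly self-evaluation might be written to the database. The table and column names are assumptions for illustration only; the paper describes the data collected but not the exact schema, and mSQL's dialect may differ in minor details from the standard SQL shown.

-- Hypothetical weekly self-evaluation record for one class session.
-- Table and column names are illustrative, not from the actual system.
INSERT INTO Fortune_csc1030_1997 (UserCode, Class, Week, Rating, Comments)
VALUES ('demo01', 'Lab 3', 2, 8, 'Good class; the analogy worked well')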

Question Evaluator.

The Question Evaluator component of the package is used to collect and store data about the set tutorial and programming exercises from three sources: students, tutors and demonstrators. The information collected can be used to highlight concepts that are difficult to teach, the time spent on individual questions, and which of the set tasks are helpful in aiding the students' learning and understanding. The application requires each user to identify themselves and then select a task to evaluate. Once identification is complete, the task and its questions are displayed for the staff member to answer. Figure 2 provides an example of a practical exercise sheet and Figure 3 the corresponding question evaluation sheet.
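
A response gathered by the Question Evaluator might be stored along similar lines. Again, the table and columns below are hypothetical, sketched from the kinds of data the paper says are collected (time spent, free-text comments on difficulty):

-- Hypothetical record of one staff response to a question on a task sheet.
-- Schema is illustrative only; the paper does not give the actual layout.
INSERT INTO Responses_csc1030_1997 (UserCode, UserType, Week, Question, TimeSpent, Comment)
VALUES ('demo01', 'demonstrator', 5, 'Q1', 25, 'Students confused pointers with the values they reference')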

Figure 3: Question Evaluation sheet.

The Analysers

The two information collection processes described above require separate analysis applications: the Fortune Line Analyser and the Question Analyser.

The Fortune Line Analyser

The Fortune Line Analyser, currently a work in progress, will be used for viewing multiple fortune lines. Users will be able to select a subject taught in a particular year, which produces a list of staff types and names; the user then selects a staff type and one or more staff members to graph their self-evaluation ratings and to view the comments of prior staff.

The Question Analyser

The Question Analyser is used by the administrator to display staff's responses to the tasks. The analyser displays a grid showing which users have responded. There are multiple grids representing the staff types and students, and each grid is selected through a choice list utility.

Responses can be viewed by individual staff entry or by task title (by week). The Question Analyser can display individual responses to a given task: a new window appears which displays the responses in a book-like manner, in which the next response is found over the page. The Question Analyser can also list all staff responses for a particular task. Results are displayed in a window showing the questions that the administrator set and the responses. For the value-based questions, the number of people who selected each value is displayed. For the text-based responses, a button can be clicked to open a book window displaying the responses one at a time.
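
The views described above can be sketched as simple selections over the hypothetical Responses table used earlier. mSQL 2.0 implements only a subset of SQL and does not provide aggregate functions such as COUNT, so tallying how many people selected each value would be done in the analyser code rather than in the query:

-- Fetch every response to the week-5 task sheet. The analyser itself
-- groups and counts the value-based answers and pages through the
-- text comments one at a time in its book-style window.
SELECT UserCode, UserType, Question, TimeSpent, Comment
FROM Responses_csc1030_1997
WHERE Week = 5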

Database Design and Implementation.

Implementation

The database system is implemented in mSQL (v2.0), a lightweight implementation of a subset of the SQL standard, developed by Hughes Technologies. Although it lacks the full functionality of the complete SQL standard, mSQL is still a capable database server, designed especially for use as a web-based database.

The Fortune Line and Question Evaluator use the mSQL database server as the backbone of the entire system. Both components use dynamically generated HTML forms based on HTML template files. The user completes a form, which is then sent to a CGI script that stores the information in the database.

The Question Analyser is an applet written in Java (v1.02), tested on Netscape (v3.0). Due to applet security restrictions, a text-based CGI mSQL proxy was developed, which performs queries on behalf of the Java applet.

The Design of TeachTools

The final database application comprises many tables for each subject. There are two main tables that all the subjects use: the Subjects table and the Teachers table. These hold information about the types of users and the subjects themselves. Apart from these, other tables are maintained for individual subject use.

The Subjects Table.

The Subjects table stores information about the administrator (entered in the request form). It contains the following fields:

Name: The code of the subject (e.g. csc1030).

Year: The year the subject was taught.

UserCode: The user code of the administrator.

Password: The password for the administrator.

email: The electronic mail address of the administrator.

Weeks: The number of task sheets there are.

IDcode: An internally set identifier that is unique across all subjects entered.
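
Expressed as a data definition statement, the Subjects table might be declared roughly as follows. The field names come from the list above; the column types and sizes are assumptions, since the paper does not reproduce the actual mSQL definitions:

-- Sketch of the Subjects table; types and sizes are assumed.
CREATE TABLE Subjects (
    Name      CHAR(16),   -- subject code, e.g. csc1030
    Year      INT,        -- year the subject was taught
    UserCode  CHAR(16),   -- administrator's user code
    Password  CHAR(16),   -- administrator's password
    email     CHAR(64),   -- administrator's email address
    Weeks     INT,        -- number of task sheets
    IDcode    INT         -- internal identifier, unique across subjects
)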

The Teachers Table.

The Teachers table holds further information about the subject, such as the staff types for a particular subject. Its fields include:

Type: The type name of the staff member.

IDcode: The same as that in the Subjects table.
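
A corresponding sketch of the Teachers table, with the same caveat that the types and sizes are assumed:

-- Sketch of the Teachers table; one row per staff type per subject.
CREATE TABLE Teachers (
    Type    CHAR(32),   -- staff type name, e.g. tutor or demonstrator
    IDcode  INT         -- matches the IDcode of a row in Subjects
)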

Other tables

Other tables are used to hold information about the staff using the applications (i.e. user codes and passwords) and the data that both staff and students enter. These tables are duplicated for each subject, uniquely named to reflect the subject, so that the same layout is reused for all subjects.
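
To illustrate the naming scheme, a per-subject staff table might be created as below. The name suffix, columns and types are all hypothetical, chosen only to show how one layout can be stamped out once per subject:

-- One copy of this table would exist per subject, distinguished by
-- a subject-specific name; the layout itself is identical each time.
CREATE TABLE Staff_csc1030_1997 (
    UserCode  CHAR(16),   -- staff member's user code
    Password  CHAR(16),   -- staff member's password
    Type      CHAR(32)    -- staff type, matching the Teachers table
)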

Fortune Line Application Tables

For each staff type and subject, there are three tables that comprise the Fortune Line Application:

The Question Evaluation Tables

This component of the package uses a further four tables, in conjunction with the first two tables described above (in the Fortune Line Application section), which are used for staff identification throughout both applications. The tables specific to the Question Evaluator keep records of the actual tasks, when they are taught, and the responses given by both staff and students.

Integration of TeachTools into the Review Process

TeachTools has been integrated into the review process for first year Computer Science (semester 2). First year Computer Science is divided into two components: Data Structures and Algorithms, and Computer Systems. Tutorials and practical classes run on a weekly basis, and the tasks given in these settings are set by the lecturer. At the commencement of semester 2, only the Data Structures and Algorithms tasks were ready for insertion into the database for review.

On a weekly basis, tutors and demonstrators reflected on their teaching practice. They used the Fortune Line to rate each lesson and to reflect on their teaching: the context in which it occurred, what they did in their lessons, and how the students responded in terms of the learning that was and was not occurring. They also provided feedback on the tasks in the Data Structures and Algorithms component of the course, constantly modelling how thinking about the task and the issues involved in learning could make teaching far more informed and purposeful.

Evaluation of results and evidence of effectiveness

After integrating TeachTools into the second semester of the first year course, insights into its effectiveness were obtained by examining the Fortune Lines and the Question Analyser. Improvements to TeachTools were also suggested.

Use and analysis of Fortune Lines

It was not clear from the start how committed staff would be to recording their teaching activities. In the event, most tutors and demonstrators self-evaluated their teaching on a weekly basis using the Fortune Line facility. Below (Figures 4.1-4.4) are examples of the Fortune Lines from demonstrators teaching Computer Science.

TeachTools provides staff with the opportunity to improve teaching and learning through the quality of their contributions. To date, contributions have been very rich in ideas, and the recorded vignettes are characterised by very high levels of energy, commitment and engagement. Below are staff comments that reflect closely on the lesson as well as on their own teaching.

TeachTools has captured some outstanding teaching practices which have been effective in generating very high levels of student engagement. It holds a rich body of wisdom about the problems that students, tutors and demonstrators faced in semester 2 of 1997. This material will be very useful to inexperienced staff employed as tutors or demonstrators in years to come. Capturing good teaching strategies and ways of dealing with sticky points plays a crucial role in improving the quality of teaching at the tertiary level.

Use of Question Evaluators and Analysis of tasks

Apart from the recording of the more general reflections and analyses of events, another very important type of contribution is feedback about the set tutorial and programming exercises. A snapshot of the Question Analyser (Figure 5) provides evidence of the amount of feedback received at a particular point midway through the semester. It indicates which tutors and demonstrators have reviewed the tasks and which tasks they have reviewed. Their comments can be viewed by clicking on a highlighted cell in the grid.

The purpose of the Question Analyser is to detect problems in the curriculum and task design, as well as to highlight particular constructs that students found difficult, with possible reasons why. The feedback and results accumulated have been successful in highlighting the difficulties that students face and ways in which tasks can be improved. Two examples below report on the feedback given to the tasks on "Dynamic Memory" and "Sorting and Searching methods".

 

Figure 4.1: Fortune Line

Figure 4.2: Fortune Line

Figure 4.3: Fortune Line

Figure 4.4: Fortune Line

Week 2

Rating: 8 (Good)

Further Comments: It was a good class and I think everyone enjoyed it.

Main Concept: Pointers - for about an hour at the start of the class.

How did you attempt to help the students understand the concepts?

I gave the students an analogy. I explained that memory was like a road, then drew a road on the board (and called it "Memory Lane"). Then I showed that the memory itself was empty blocks of land on the road, and that each block of land has its own address. The contents of the blocks of land (the data) were houses. I asked each student to name what sort of house they'd like on their block of land; this helped me learn the students' names too. I explained that pointers were sort of like forwarding addresses.

How would you describe the students' reactions?

On the whole, the students found that the "Memory Lane" analogy worked quite well. Even though it took away from the time the students had to do the prac, it allowed them to finish it more quickly since they had a greater understanding of it.

 

Week 8

Rating: 5 (Indifferent)

Further Comments: Machines broke down so students did their work on paper. Actually, it didn't go too badly, as I got students writing big programs, starting from small bits then combining the small components.

Main Concept: Loops and Arrays in MIPS

How did you attempt to help the students understand the concepts?

I used a template-based approach and got the students to fill in the blanks. I encouraged students to do the C translation to MIPS in stages. For example, in question 1, students initialised the array, printed it out, then implemented the outer loop, then the inner loop, checking their code at each stage.

How would you describe the students' reactions?

After initial reluctance, students were writing modular programs (from the composition of small parts) by the end of the lab.

Figure 5: Question Analyser

 

Dynamic Memory

What concepts did the students find difficult?

How would you improve the prac sheet?

 

Elementary sorting and searching methods

What concepts did the students find difficult?

How would you improve the prac sheet?

 

Improvements to TeachTools

In its first stage of operation, several deficiencies in TeachTools were noted. Suggested improvements include adding functionality to change staff sessions and passwords, modifying the questions, making the request form simpler (perhaps multi-staged) and refining the overall package to the needs of the administrators.

Conclusion

Teaching in universities, and other environments, can be a difficult job. The job is even more difficult when there is little in the way of formal training for teachers.

TeachTools was used in semester 2 of 1997 to collect and analyse data relating to front-line teaching and the set tasks. Self-reporting on teaching is geared to stimulate others to take the kinds of risks that are routine in others' classes. Capturing the teaching experiences of previous tutors and demonstrators, and the types of learning problems students encounter, can be very useful.

The flexibility of the application ensures that, with little modification, it can be used in other departments. This would create an extensive database that could be used by teachers from many disciplines to enhance their teaching techniques and to improve the tasks, so that students see them as gateways to further exploration of the concepts intended by the lecturer.

Acknowledgments

The authors wish to thank all the first year tutors and demonstrators for reflecting on their teaching, reporting their strategies in the Fortune Lines, and reviewing the set tasks. The authors would also like to thank the first year Computer Science lecturers Dr. David Albrecht and Associate Professor Ingrid Zukerman for allowing their tasks to be reviewed.

References

Mitchell, I., Macdonald, I., Gunn, F. & Carbone, A. 1996, "Helpless, Isolated and Underpaid: Turning Computer Science Demonstrators into Teachers", Proceedings of the Australasian Science Education Research Association (ASERA), Canberra, Australia.

Carbone, A., Mitchell, I. & Macdonald, I. 1996, "Improving the Teaching and Learning in First Year Computer Science Tutorials", Poster Proceedings of the Thirteenth Annual Conference of the Australian Society for Computers in Learning in Tertiary Education (ASCILITE'96), Adelaide, Australia.

Carbone, A. & Mitchell, I. 1996, Teaching Suggestions, http://www.cs.monash.edu.au/~angela/teach/index.html

Committee for the Advancement of University Teaching (CAUT) 1996, Reflections on University Teaching, http://uniserve.edu.au/uniserve/dev/coord/teach

 

 

(c) Angela Carbone, Matthew Drago, Ian Mitchell

 

The author(s) assign to ASCILITE and educational and non-profit institutions a non-exclusive licence to use this document for personal use and in courses of instruction provided that the article is used in full and this copyright statement is reproduced. The author(s) also grant a non-exclusive licence to ASCILITE to publish this document in full on the World Wide Web and on CD-ROM and in printed form with the ASCILITE 97 conference papers, and for the documents to be published on mirrors on the World Wide Web. Any other usage is prohibited without the express permission of the authors.

 

