
Online assessment and study

Geoff I. Swan
Physics Program
Edith Cowan University
An online quizzing environment created for a first year physics unit has resulted in high student satisfaction and participation rates. Compulsory and formative assessment has been a key ingredient in this success with students allowed to use detailed feedback to improve their quiz scores. Students believed that the quizzes helped them study more consistently over the semester and learn physics. The quizzes have also affected study habits through preparation, use of detailed feedback and collaboration.


Introduction

In 2004, the author went beyond his comfort zone and introduced online quizzes as a compulsory 20% assessment component in the first semester, first year physics unit SCP1111 Physics of Motion at Edith Cowan University. The carefully constructed online environment offered several advantages over offline environments, some of them unique, together with an opportunity to encourage and direct students' study through regular problem solving practice in a formative and flexible learning environment. McInnis, James and Hartley (2000, p. 3) noted that 53% of surveyed first year university students said they studied only the minimum required by their teachers, even though 83% (of all students) had a strong desire to do well in their subjects. Assessment is clearly a key factor in the study patterns of students, and the time saved through automatic marking and collating of results made regular assessed online homework viable.

Questions and quizzes

Questions for the quizzes were mostly created and trialled in 2003 as voluntary assessment (Swan, 2003). Only the more salient features and major changes for 2004 are given here. A guest quiz is also available to conference delegates through the unit welcome page (Thomson Learning, 2004) using a WebCT login name and password of ecuguest.

Table 1: Online quizzes for SCP1111 Physics of Motion (semester 1, 2004)

Quiz  Topic(s)                                            Week set  Time (min)  Type of mark
1     Kinematics in one dimension, Vectors                2         60          Highest
2     Kinematics in two dimensions                        3         60          Highest
3     Newton's Laws                                       4         60          Highest
4     Circular motion dynamics, Work and Kinetic Energy   5         75          Average
5     Conservation of Energy, Momentum                    7         75          Average
6     Rotational Motion                                   8         60          Highest
7     Static Equilibrium, Elasticity, Gravitation         10        60          Highest
8     Oscillatory Motion, Fluids                          11        60          Highest

Summary information for the individual quizzes is given in Table 1. The questions were organised into eight online quizzes over a twelve week teaching semester, grouped by topic, as a supplement to (unassessed) selected problems from the textbook (Serway & Jewett, 2004) and the student solutions manual and study guide (Gordon, McGrew & Serway, 2004). Students were allowed about one week for up to two attempts at each quiz, with the highest or average mark taken as the assessment. Detailed feedback was included on all questions and was normally aimed at being sufficient to help borderline students work out the correct answer on the second attempt. A random element in the calculated questions (which provided 20 variations), together with alternate multiple choice questions, ensured that students received similar but different questions on each attempt, and different questions from one another.
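The calculated questions described above were built in WebCT, but the underlying idea (a fixed pool of parameter sets, with the response checked against the formula within a tolerance) can be sketched in a few lines of Python. The kinematics question, parameter ranges, and 2% tolerance below are hypothetical illustrations, not taken from SCP1111.

```python
import random

def make_variation(seed):
    """Generate one of 20 variations of a hypothetical kinematics question:
    'A car accelerates from rest at a m/s^2 for t s. How far does it travel?'"""
    rng = random.Random(seed % 20)      # 20 distinct, reproducible parameter sets
    a = rng.choice([1.5, 2.0, 2.5, 3.0, 3.5])
    t = rng.choice([4, 6, 8, 10])
    answer = 0.5 * a * t ** 2           # s = (1/2) a t^2, starting from rest
    return {"a": a, "t": t, "answer": answer}

def grade(variation, response, tolerance=0.02):
    """Accept a numeric response within 2% of the exact answer."""
    return abs(response - variation["answer"]) <= tolerance * variation["answer"]

q = make_variation(seed=7)
print(grade(q, q["answer"]))    # an exactly correct response is accepted
```

Because the seed is reduced modulo 20, two students (or two attempts) drawing different seeds usually see different numbers, while the grading logic stays the same for everyone.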

When online quizzes were trialled in 2003, participation rates were very poor: the 30% of students who attempted none or just one of the online quizzes were over four times more likely to fail or score a borderline pass in the unit (Swan, 2003). Students needed to be further encouraged and rewarded for using these quizzes in learning physics. This was done primarily by changing the nature of the assessment in two ways. Firstly, the author stepped out of his comfort zone and made the quizzes a compulsory 20% component of the unit's assessment. Replacing the single mid-semester test with online quizzes marked a significant shift towards ongoing and formative assessment. Secondly, there was a substantial move away from the average mark option towards the highest mark option for the two attempts, with six quizzes assessed this way in 2004 compared with just one in 2003. This second change was influenced by the success Honey and Marshall (2003) reported for online multiple choice quizzes in a nursing unit, where students were allowed to use the highest mark from three attempts.
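The two marking schemes differ only in how the attempt scores are combined, which can be made concrete with a short sketch (the function name and example marks are hypothetical):

```python
def quiz_mark(attempts, mode="highest"):
    """Combine up to two attempt scores into one assessed mark.
    mode='highest' keeps the best attempt; mode='average' takes the mean."""
    if not attempts:
        return 0.0                      # quiz never attempted
    if mode == "highest":
        return max(attempts)
    return sum(attempts) / len(attempts)

print(quiz_mark([6.0, 9.0], "highest"))   # 9.0 -- second attempt can only help
print(quiz_mark([6.0, 9.0], "average"))   # 7.5 -- a weak first attempt drags the mark down
```

The comments point at the behavioural difference the paper reports: under the highest mark scheme a poorly prepared first attempt costs nothing, whereas under the average mark scheme every attempt counts, which is consistent with students preparing more thoroughly for average mark quizzes.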

Other changes in 2004 included a non-assessed orientation quiz to ensure that students were aware of question types and answer formats before their first assessment. New questions were written, topics were rearranged, and the time allocated for each quiz attempt was reduced. In 2003, students had reported that lack of time was a major factor in not attempting quizzes (Swan, 2003). The author aimed to dramatically increase the participation rate through the changes outlined above, and in particular to capture the borderline students who were most in need. His research goals were to establish overall student satisfaction, measure participation rates and correlate them with assessment outcomes, compare results with the 2003 cohort (with particular attention to borderline students), and investigate the ways in which the quizzes affected students' study patterns. Students were given a written evaluation instrument to complete (anonymously) in a lecture during the penultimate teaching week of semester.

Results and discussion

Participation rates and unit results

Analysis of participation rates and unit results is based on the students who sat the end of semester exam (N=84). As expected, there was a dramatic increase in participation rates for the online quizzes, with 85% of students in 2004 completing at least six of the eight quizzes compared with just 17% in 2003. This could reasonably be attributed to the change of the 20% assessment component from voluntary to compulsory in 2004. Many students saw the quizzes as a way of gaining high marks, with 26% of students gaining a high distinction for this component compared with 17% high distinctions for the unit. However, the quiz pass rate of 70% was lower than the 81% pass rate for the unit. All students who failed the unit gained quiz marks below the class (modal) average, and three quarters of failing students also failed the quiz component. This suggests that low quiz marks during the semester could be used as an indicator for early intervention.
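The early-intervention indicator suggested here amounts to flagging, mid-semester, any student whose quiz total sits below the class modal mark. A minimal sketch of that check, with entirely hypothetical student names and marks:

```python
from collections import Counter

def flag_for_intervention(quiz_totals):
    """Return names of students whose quiz total falls below the class
    modal (most common) mark, the indicator suggested in the text.
    quiz_totals: dict mapping student name -> quiz total (hypothetical data)."""
    modal_mark, _ = Counter(quiz_totals.values()).most_common(1)[0]
    return sorted(name for name, mark in quiz_totals.items() if mark < modal_mark)

marks = {"A": 16, "B": 16, "C": 9, "D": 16, "E": 12}
print(flag_for_intervention(marks))   # ['C', 'E']
```

In practice a lecturer would run such a check after the first few quizzes rather than at semester's end, since the point of the indicator is to reach at-risk students while there is still time to act.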

Student evaluation

The results have been collated from the 34 respondents who indicated on the written evaluation instrument that they had completed at least six of the seven quizzes set to that date. A 4 point scale (strongly agree, agree, disagree, strongly disagree) was used for most questions. The responses in 2004 were extremely positive, with at least 85% of respondents agreeing or strongly agreeing with a series of positive statements about the quizzes, including that they helped them learn physics and to study more consistently over the semester.

Students normally accessed the quizzes from home (70%) and 91% of students thought that the number of quizzes (eight) was about right.

Only 79% of respondents agreed or strongly agreed that assessment through online quizzes was fair and reasonable. Yet when offered a choice for the 20% assessment item between the online quizzes, a mid-semester test, or a combination of both, no student chose the mid-semester test alone: 71% preferred the online quizzes and 29% preferred a combination of both.

The provision of detailed feedback to students was a core component of the online quizzes and it was extremely well received, with 91% of respondents agreeing or strongly agreeing that it was helpful. For their learning needs, two thirds of the students found the amount of feedback (on average) about right, with one third requesting more detail. A small increase in detail will be considered for 2005. Intriguingly, 82% of respondents found that detailed feedback was sometimes useful even for questions that they had answered correctly. The author was not expecting this response; in subsequent conversations with students, it emerged that many like to work through the lecturer's method to compare both the concepts used and their application with their own approach, regardless of whether they answered the question correctly.

Students' approaches to the online quizzes varied somewhat, with 59% of respondents agreeing or strongly agreeing that they were well prepared for their first attempt at a highest mark quiz. For average mark quizzes, 62% of respondents prepared more thoroughly. This was not surprising given that the first attempt at an average mark quiz always counts as part of the assessment. Because students are not supervised when they take the online quizzes, the author was interested in how much collaboration was occurring between students. A large minority of respondents either provided help to (44%) or received help from (36%) other students.

Open response questions

Open response questions were included on the student evaluation instrument to collect information on how the quizzes were being integrated into student study regimes. These questions probed students' preparation, their use of detailed feedback, and their collaboration with other students.

Students listed a variety of ways in which they prepared for highest mark quizzes including attending lectures, reading through lecture notes, reading chapters or chapter summaries in the textbook, doing set problems and even reading the formula sheet. A significant minority were not well prepared, including one student whose preparation was

Nothing, I would waste the first go & work backwards from the solution

The author had strongly advised against such practices (for the online quizzes and selected problems in the textbook for which students had worked solutions) as being unlikely to result in any meaningful learning and also unlikely to improve their exam score.

Students used detailed feedback for individual questions in ways that went beyond just getting the correct answer. Responses from three students that illustrate different uses of detailed feedback are given below:

To change the way I look at the questions
To try and understand the concepts, but sometimes it wasn't enough
I used it as a guide to lookup certain information in the book and notes. I found it very good

Collaboration between students ranged from "nothing" or "verbal" to substantial exchanges between individuals and within a group. The following two responses demonstrate two different ways in which substantial collaboration occurred:

We all worked out an answer, & compared the way we did it & the answers we got. It was through this that we were able to come to grips with theories
Group work. Do the quizzes together and work together to a solution

The author had informed students that he approved of collaboration provided that individual learning occurred and each actual quiz attempt was done by the student logged on to the computer. In general, students were very positive about the quizzes' effect on their study habits:

The quizzes made sure that we were studying without forcing us to study

The quizzes encouraged and influenced both study at and away from the computer. Thelwall (2000) reported similar results with randomly generated open access tests being popular with students and improving motivation for study.

The overall satisfaction rate (which includes agree and strongly agree but not neutral on a 5 point scale) for SCP1111 Physics of Motion obtained from the externally administered University Teaching and Evaluation Instrument increased from 64% (N=28) in 2003 to 91% (N=45) in 2004. The author attributes at least part of this increased satisfaction to the new online quizzing environment outlined in this paper.


Conclusion

The online quizzes provided students with regular problem solving practice and immediate detailed feedback that could be used to improve grades. The quizzes have influenced the study habits of students, who acknowledge that they have helped them work more consistently over the semester. In addition, some students have collaborated substantially, in both positive and negative ways. Changing the way in which the online quizzes are assessed has resulted in a dramatic increase in participation rates, with 94% of students believing that the online quizzes helped them learn physics. The quizzes might also be used to identify weak students for early intervention, and this is a potential area for future research.


References

Gordon, J. R., McGrew, R. & Serway, R. A. (2004). Student Solutions Manual & Study Guide to accompany Physics for Scientists and Engineers (6th ed., Vol. 1). Belmont: Brooks/Cole.

Honey, M. & Marshall, D. (2003). The impact of on-line multi-choice questions on undergraduate student nurses' learning. In Interact, Integrate, Impact: Proceedings 20th ASCILITE Conference. Adelaide, 7-10 December.

McInnis, C., James, R. & Hartley, R. (2000). Trends in the First Year Experience in Australian Universities. Evaluations and Investigations Programme, Higher Education Division, DETYA. Canberra: AGPS.

Serway, R. A. & Jewett, J. W. (2004). Physics for Scientists and Engineers with Modern Physics. (6th ed.). Belmont: Brooks/Cole.

Swan, G. I. (2003). Regular motion. In Interact, Integrate, Impact: Proceedings 20th ASCILITE Conference. Adelaide, 7-10 December.

Thelwall, M. (2000). Computer-based assessment: A versatile educational tool. Computers & Education, 34, 37-49.

Thomson Learning (2004). WebCT server login page. [viewed July 2004]

Authors: Dr Geoff Swan, Physics Program SOEM, Edith Cowan University, 100 Joondalup Drive, Joondalup WA 6027, Australia. Tel: +61 8 6304 5447. Email:

Please cite as: Swan, G.I. (2004). Online assessment and study. In R. Atkinson, C. McBeath, D. Jonas-Dwyer & R. Phillips (Eds), Beyond the comfort zone: Proceedings of the 21st ASCILITE Conference (pp. 891-894). Perth, 5-8 December.

© 2004 Geoff I. Swan
The author assigns to ASCILITE and educational non-profit institutions a non-exclusive licence to use this document for personal use and in courses of instruction provided that the article is used in full and this copyright statement is reproduced. The author also grants a non-exclusive licence to ASCILITE to publish this document on the ASCILITE web site (including any mirror or archival sites that may be developed) and in printed form within the ASCILITE 2004 Conference Proceedings. Any other usage is prohibited without the express permission of the author.
