An online quizzing environment created for a first year physics unit has resulted in high student satisfaction and participation rates. Compulsory and formative assessment has been a key ingredient in this success, with students allowed to use detailed feedback to improve their quiz scores. Students believed that the quizzes helped them study more consistently over the semester and learn physics. The quizzes have also affected study habits through preparation, use of detailed feedback, and collaboration.
| Quiz | Topic(s) set | Week set | Time (min) | Mark used |
|------|--------------|----------|------------|-----------|
| 1 | Kinematics in one dimension, Vectors | 2 | 60 | Highest |
| 2 | Kinematics in two dimensions | 3 | 60 | Highest |
| 4 | Circular motion dynamics, Work and Kinetic Energy | 5 | 75 | Average |
| 5 | Conservation of Energy, Momentum | 7 | 75 | Average |
| 7 | Static Equilibrium, Elasticity, Gravitation | 10 | 60 | Highest |
| 8 | Oscillatory Motion, Fluids | 11 | 60 | Highest |
Summary information for the individual quizzes is given in Table 1. The questions were organised into eight online quizzes over a twelve week teaching semester according to topic, as a supplement to (unassessed) selected problems from the textbook (Serway & Jewett, 2004) and the student solution manual and study guide (Gordon, McGrew & Serway, 2004). Students were allowed about one week for up to two attempts at each quiz, with the highest or average mark taken as the assessment. Detailed feedback was included on all questions and was normally aimed at being sufficient to assist borderline students in working out the correct answer on the second attempt. A random element in the calculated questions (which provided 20 variations) and alternate multiple choice questions ensured that students received similar but different questions on each attempt, and different questions from those of other students.
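The mechanism behind the calculated questions can be illustrated with a short sketch. This is a hypothetical reconstruction, not the actual WebCT quiz engine: the function name, the parameter values, and the question wording are all invented for illustration; only the idea of drawing one of 20 deterministic numeric variants comes from the paper.

```python
import random

def make_kinematics_variant(seed):
    """Generate one of 20 numeric variants of a calculated physics question.

    Hypothetical sketch: a seed is mapped onto one of 20 reproducible
    variants, so each student (and each attempt) can receive a similar
    but numerically different question.
    """
    rng = random.Random(seed % 20)  # 20 distinct, reproducible variants
    v0 = rng.choice([5.0, 6.0, 7.5, 8.0, 10.0])  # initial speed, m/s
    t = rng.choice([2.0, 2.5, 3.0, 4.0])         # elapsed time, s
    a = -9.8                                     # acceleration due to gravity, m/s^2
    answer = v0 * t + 0.5 * a * t ** 2           # displacement, m
    prompt = (f"A ball is thrown straight up at {v0} m/s. "
              f"What is its displacement after {t} s?")
    return prompt, round(answer, 2)
```

Because the variant is selected by `seed % 20`, the same variant can be regenerated for marking, while different seeds spread students across the pool of variants.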
When online quizzes were trialled in 2003, participation rates were very poor: the 30% of students who attempted none or just one of the online quizzes were over four times more likely to fail or score a borderline pass in the unit (Swan, 2003). Students needed to be further encouraged and rewarded for using these quizzes in learning physics. This was primarily done by dramatically changing the nature of the assessment in two ways. Firstly, the author stepped out of his comfort zone and made quizzes a compulsory 20% component of the unit's assessment. Replacing the single mid-semester test with online quizzes marked a significant shift towards ongoing and formative assessment. Secondly, there was a substantial move away from the average mark option towards the highest mark option for the two attempts, with six quizzes assessed in this way in 2004 compared with just one in 2003. This second change was influenced by the success Honey & Marshall (2003) reported for online multiple choice quizzes in a nursing unit where students were allowed to use the highest mark from three attempts.
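The two marking rules described above can be sketched as a single function. This is a minimal illustration of the rule as stated in the text (best of two attempts, or mean of the attempts); the function name and signature are invented, not part of WebCT.

```python
def quiz_mark(attempts, mode="highest"):
    """Combine up to two attempt scores into one assessed mark.

    mode='highest' takes the best attempt; mode='average' takes the
    mean of all attempts. Hypothetical sketch of the marking rule
    described in the paper.
    """
    if not attempts:
        return 0.0  # no attempt submitted
    if mode == "highest":
        return max(attempts)
    return sum(attempts) / len(attempts)
```

Under the highest-mark rule a weak first attempt costs nothing, which is consistent with the paper's observation that students prepared more thoroughly for average-mark quizzes, where the first attempt always counted.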
Other changes in 2004 included a non-assessed orientation quiz to ensure that students were aware of question types and answer formats before their first assessment. New questions were written, topics were rearranged, and the time allocated for each quiz attempt was reduced. In 2003, students had reported that lack of time was a major factor in not attempting quizzes (Swan, 2003). The author aimed to dramatically increase the participation rate through the changes outlined above, and in particular to capture the borderline students who were most in need. His research goals were to establish overall student satisfaction, measure the participation rates and correlate them with assessment outcomes, compare results with the 2003 cohort (with particular attention to borderline students), and investigate the ways in which quizzes impacted on students' study patterns. Students were given a written evaluation instrument to complete (anonymously) in a lecture during the penultimate teaching week of semester.
Students normally accessed the quizzes from home (70%) and 91% of students thought that the number of quizzes (eight) was about right.
Only 79% of respondents agreed or strongly agreed that assessment through online quizzes was fair and reasonable. Yet when given a choice between the 20% assessment item being the online quizzes, a mid-semester test, or a combination of both, no student chose the mid-semester test: 71% preferred the online quizzes and 29% preferred a combination of both online quizzes and a mid-semester test.
The provision of detailed feedback to students was a core component of the online quizzes and it was extremely well received, with 91% of respondents agreeing or strongly agreeing that it was helpful. For their learning needs, two thirds of the students found the amount of feedback (on average) was about right, with one third requesting more detail. A small increase in detail will be considered for 2005. Intriguingly, 82% of respondents found that detailed feedback was sometimes useful even for questions that they had answered correctly. The author was not expecting this response, and subsequent conversations with students suggested that many like to work through the lecturer's method to compare both the concepts used and their application with their own method, regardless of whether they answered the question correctly.
Students' approaches to the online quizzes varied somewhat, with 59% of respondents agreeing or strongly agreeing that they were well prepared for their first attempt at a highest mark quiz. For average mark quizzes, 62% of respondents prepared more thoroughly. This was not surprising given that the first attempt at average mark quizzes always counted as part of the assessment. Given that students are not supervised when they take the online quizzes, the author was interested in how much collaboration was occurring between students. A large minority of respondents either provided help to (44%) or received help from (36%) other students.
Students listed a variety of ways in which they prepared for highest mark quizzes, including attending lectures, reading through lecture notes, reading chapters or chapter summaries in the textbook, doing set problems and even reading the formula sheet. A significant minority were not well prepared, including one student whose preparation was:
Nothing, I would waste the first go & work backwards from the solution
The author had strongly advised against such practices (for the online quizzes and selected problems in the textbook for which students had worked solutions) as being unlikely to result in any meaningful learning and also unlikely to improve their exam score.
Students used detailed feedback for individual questions in ways that went beyond just getting the correct answer. Responses from three students that illustrate different uses of detailed feedback are given below:
To change the way I look at the questions
To try and understand the concepts, but sometimes it wasn't enough
I used it as a guide to lookup certain information in the book and notes. I found it very good
Collaboration between students ranged from "nothing" or "verbal" to substantial exchanges between individuals and within a group. The following two responses demonstrate two different ways in which substantial collaboration occurred:
We all worked out an answer, & compared the way we did it & the answers we got. It was through this that we were able to come to grips with theories
Group work. Do the quizzes together and work together to a solution
The author had informed students that he approved of collaboration where individual learning occurred and each actual quiz attempt was done by the student logged on to the computer. In general, students were very positive about the quizzes' effect on their study habits:
The quizzes made sure that we were studying without forcing us to study
The quizzes encouraged and influenced both study at and away from the computer. Thelwall (2000) reported similar results with randomly generated open access tests being popular with students and improving motivation for study.
The overall satisfaction rate (which includes agree and strongly agree but not neutral on a 5 point scale) for SCP1111 Physics of Motion obtained from the externally administered University Teaching and Evaluation Instrument increased from 64% (N=28) in 2003 to 91% (N=45) in 2004. The author attributes at least part of this increased satisfaction to the new online quizzing environment outlined in this paper.
Honey, M. & Marshall, D. (2003). The impact of on-line multi-choice questions on undergraduate student nurses' learning. In Interact, Integrate, Impact: Proceedings 20th ASCILITE Conference. Adelaide, 7-10 December. http://www.adelaide.edu.au/ascilite2003/docs/pdf/236.pdf
McInnis, C., James, R. & Hartley, R. (2000). Trends in the First Year Experience in Australian Universities. Evaluations and Investigations Programme, Higher Education Division, DETYA. Canberra: AGPS. http://www.detya.gov.au/archive/highered/eippubs/eip00_6/fye.pdf
Serway, R. A. & Jewett, J. W. (2004). Physics for Scientists and Engineers with Modern Physics. (6th ed.). Belmont: Brooks/Cole.
Swan, G. I. (2003). Regular motion. In Interact, Integrate, Impact: Proceedings 20th ASCILITE Conference. Adelaide, 7-10 December. http://www.adelaide.edu.au/ascilite2003/docs/pdf/706.pdf
Thelwall, M. (2000). Computer-based assessment: A versatile educational tool. Computers & Education, 34, 37-49.
Thomson Learning (2004). WebCT server login page. Thomson Learning. [July 2004] http://thomson4.webct.com/public/swanserwaycowan/index.html
Author: Dr Geoff Swan, Physics Program SOEM, Edith Cowan University, 100 Joondalup Drive, Joondalup WA 6027, Australia. Tel: +61 8 6304 5447. Email: firstname.lastname@example.org
Please cite as: Swan, G.I. (2004). Online assessment and study. In R. Atkinson, C. McBeath, D. Jonas-Dwyer & R. Phillips (Eds), Beyond the comfort zone: Proceedings of the 21st ASCILITE Conference (pp. 891-894). Perth, 5-8 December. http://www.ascilite.org.au/conferences/perth04/procs/swan.html
© 2004 Geoff I. Swan
The author assigns to ASCILITE and educational non-profit institutions a non-exclusive licence to use this document for personal use and in courses of instruction provided that the article is used in full and this copyright statement is reproduced. The author also grants a non-exclusive licence to ASCILITE to publish this document on the ASCILITE web site (including any mirror or archival sites that may be developed) and in printed form within the ASCILITE 2004 Conference Proceedings. Any other usage is prohibited without the express permission of the author.