[ ASCILITE ] [ 2004 Proceedings Contents ]

Opening Pandora's box of academic integrity: Using plagiarism detection software

Sue Mulcahy and Christine Goodacre
Flexible Education Unit
University of Tasmania
Academic integrity issues are currently a major focus of concern at most tertiary institutions. This paper details the strategic framework within which the University of Tasmania (UTAS) is managing these issues. It focuses on the introduction of plagiarism detection software, which has served to highlight the wide variety of issues associated with academic integrity and the importance of embedding good practice on the part of both staff and students. The paper reports on the Pandora's box of implementation issues - legal, workload, training and support - that have emerged, and the strategies being used to manage these as part of the project. It recommends a model which focuses on an educative approach to the management of academic integrity, while also including mechanisms for identifying and discouraging plagiarism and, where it occurs, proceeding against it as academic misconduct. Many of the issues raised by the project have challenged the 'comfort zones' of students, staff and university academic administration. These are being managed both through the approaches used in the pilot and through the project governance adopted.


Background

The management of academic integrity and plagiarism issues within universities has been undergoing review in recent years and the strategies used to address these issues extended. UTAS is no exception.

For the purposes of this paper, the term 'academic integrity' is used in a broad sense, referring to mastery of the art of scholarship. Scholarship involves researching, understanding and building upon the work of others, and requires that credit is given where it is due and that the contribution of others to one's own intellectual efforts is acknowledged. At its core, academic integrity requires honesty. This involves taking responsibility for ethical scholarship and for knowing what academic dishonesty is and how to avoid it.

Webster's dictionary (1993, p. 1728) defines plagiarism as "to steal and to pass off as one's own (the idea or words of another); use (a created production) without crediting the source; to commit literary theft; present as new and original an idea or product derived from an existing source".

The Centre for the Study of Higher Education at the University of Melbourne suggests in its 2002 report (James, McInnis & Devlin, 2002) that academic integrity can be managed through the introduction of and commitment to four strategies, all of which are underpinned by the central principle of ensuring fairness:

  1. A collaborative effort to recognise and counter plagiarism at every level from policy, through faculty/division and school/department procedures, to individual staff practices;
  2. Thoroughly educating students about the expected conventions for authorship and the appropriate use and acknowledgment of all forms of intellectual material;
  3. Designing approaches to assessment that minimise the possibility for students to submit plagiarised material, while not reducing the quality and rigour of assessment requirements;
  4. Installing highly visible procedures for monitoring and detecting cheating, including appropriate punishment and re-education measures.

UTAS has found this a useful reference point for the management of these issues at an institutional level, and in 2001 established a working party to review its framework and make recommendations as appropriate.

The outcomes of the working party's efforts during 2001 - 2002 can be summarised as follows:

This working party also supported the introduction of an auditing mechanism in the form of plagiarism detection software, to assist in ensuring that the work submitted by students is their own.

In 2003 the FEU took up this work and applied a project management methodology to its continuation. To ensure continued high level support for the project, the Pro Vice-Chancellor Teaching and Learning (T&L) took on the role of project sponsor. The project steering committee consisted of the Director FEU as Chair, the project leader, Academic Registrar, Dean of Graduate Studies, two student and academic staff representatives, a representative from the central IT unit, plus the Library as observer. While the project focused on the introduction of plagiarism detection software - Turnitin - it also involved a further revision of policy and support issues. We were not sure what other issues might arise, administrative, policy or legal, and the role of the Steering Committee was to provide advice on their management as well as generally oversee the project.

Plagiarism detection software: Why Turnitin?

There is a wide range of 'solutions' to plagiarism currently available, from free web based search engines to PC and internet based applications. Search engines such as Google, Web Wombat and Answers have been no-cost options used by individual teachers to check suspicious essays. While these have provided some results, they come with serious limitations, and in 2003 UTAS decided to implement a dedicated application, Turnitin.

The Cooperative Action by Victorian Academic Libraries (CAVAL) supports Turnitin and provides consultancy, training and help desk services for it. Turnitin is currently used at 28 Australian tertiary institutions and in nearly 50 countries world wide, including extensive use in the UK through the JISC Plagiarism Detection Service and in the US at both the tertiary and secondary level.

What is Turnitin?

Turnitin is a text matching system which compares a submitted document with text located on "an Internet database of over 4.5 billion (web) pages ... millions of published books and journals from ProQuest ... over 10 million papers already submitted to Turnitin" (Turnitin tour, 2004). A report is produced on each document submitted, highlighting sections of text that match an entry in Turnitin's databases. Matched text is highlighted using colours, which also indicate the originating source of each match. There are two formats for viewing the Turnitin reports: the print version (Figure 1) and the side by side version (Figure 2).


Figure 1: Turnitin report, print version format, used with permission
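The kind of text matching that produces these reports can be sketched in a few lines. The sketch below uses simple word n-gram overlap; Turnitin's actual algorithm is proprietary and far more sophisticated, so the function names and the choice of five-word grams here are purely illustrative.

```python
def ngrams(text, n=5):
    """Return the set of word n-grams in a text, lowercased."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def match_score(submission, source, n=5):
    """Fraction of the submission's n-grams also found in the source.
    A high fraction flags text worth a closer look; it does not by
    itself distinguish correct quotation from unacknowledged copying."""
    sub = ngrams(submission, n)
    if not sub:
        return 0.0
    return len(sub & ngrams(source, n)) / len(sub)
```

An identical submission and source score 1.0; texts sharing no five-word run score 0.0, which is why paraphrased copying slips through this style of matching.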

After a document is submitted to Turnitin, it is added to Turnitin's database. This builds a historical archive of submitted documents that is included in later checking. Collecting documents in the database also builds up a store of references taken from printed material, which is not currently available to Turnitin through any other source.

Turnitin does not identify all potential cases of plagiarism, as its database does not contain all web pages, electronic journals, published works or individually produced works and it cannot match paraphrased text. It is only one tool in an overall strategy for managing academic integrity being implemented at the university.

Because Turnitin reports only on the degree of text matching, individual lecturers need to review Turnitin's reports to determine the actual level of plagiarism; Turnitin does not differentiate between correctly cited references and unacknowledged copying. What it does provide is a ranking of assignments according to the level of text matching found with other sources, highlighting those assignments that are most likely to include plagiarism.
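The ranking described above can be sketched as follows, assuming a text-match fraction has already been computed for each assignment. The data and function names are illustrative only and are not part of Turnitin's interface.

```python
def rank_assignments(scores):
    """Sort (assignment, match fraction) pairs so the highest levels of
    text matching - the assignments most worth manual review - come first."""
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

# Hypothetical match fractions for three submitted assignments.
scores = {"essay_a": 0.14, "essay_b": 0.62, "essay_c": 0.03}
ranked = rank_assignments(scores)  # essay_b first: most matched text
```

A lecturer would then read the reports from the top of such a list down, stopping at whatever threshold local marking guidelines set.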

Project strategies

The implementation of Turnitin required the development of strategies in the areas of policy, management, support, communications and evaluation.

UTAS joined the CAVAL Plagiarism Detection Consortium, established across Australia and New Zealand to achieve better educational outcomes in the area of plagiarism reduction. Membership is free, and the consortium provides discounts on services and software, software support through its Turnitin Help Desk, and initial training.


Figure 2: Turnitin report, side by side version, used with permission

Key Performance Indicators for the pilot are as follows:

Critical success factors were identified as the following:

Key strategies can be summarised as follows:

Semester 1 pilot 2004

The semester one pilot ran from 19 April to 28 June 2004. Initially 16 lecturers, 17 units and approximately 1,400 students were involved, with each faculty represented, along with units from 1st to 3rd year and from each major Tasmanian campus. During the pilot, 13 lecturers made use of Turnitin in 15 units, with approximately 1,020 student assignments submitted. Not all faculties had the level of participation we had hoped for, and this is to be addressed in second semester.

The pilot aimed to investigate issues related to:

The pilot was evaluated by:

Preliminary findings and issues - Semester 1 pilot 2004

In two units where the lecturer submitted the students' work to Turnitin, the rate of plagiarism detected was approximately 14%, the rate reported in a 2002 study of six Victorian universities (O'Connor, 2003). There is no indication from the pilot that the level of plagiarism at UTAS differs significantly from that at other Australian universities, or that UTAS students are not using the same resources (Carroll, 2002) as students throughout the world when plagiarising work.

Turnitin did not highlight all occurrences of plagiarism in the three units where the lecturer submitted the students' work: markers identified cases that Turnitin missed, either because the source website was not included in Turnitin's database or because students had paraphrased the copied work. However, in two of these units Turnitin highlighted the majority of plagiarism cases, and also highlighted more cases than the markers had initially detected.

Staff participants were surprised that a recent Australian study by Marsden (as cited in O'Connor, 2003) had detected no significant difference in rates of plagiarism between domestic and international students. They did agree, however, that it was easier for markers to identify plagiarism by students from non-English speaking backgrounds.

Impact on students and staff

Student workload issues

The results of the student questionnaire show that over 70% of respondents took less than 5 minutes to set up for, or submit their assignments to, Turnitin.

Staff workload issues

Staff in the focus group reported that the time required to set up student assignments, or to submit student work to Turnitin, was minimal. Reviewing Turnitin's analysis reports, however, could be extremely time consuming, depending on the number of students in the class and the criteria used to determine which reports were reviewed.

Legal issues

Early in the pilot, students raised several issues:

All these issues were raised with the university legal officer. None was considered an impediment to using Turnitin; however, the need to inform students that the university will use plagiarism detection software is important. The University does this in its Plagiarism statement and educates students about academic integrity issues, such as how to cite sources. The University also has a responsibility to protect the copyright and IP rights in its students' work stored on the Turnitin database, which is addressed through its contractual arrangements.

The Steering Committee also raised a number of issues:

These were also referred to the university legal officer and again did not pose any difficulty to the continued use of Turnitin. Some, such as coercion, were not issues at all, but were followed up in any case to ensure full consultation and ownership of the project.

The legal issues raised required a significant amount of time to document and follow up; however, they are likely to be a significant part of any implementation project until the use of such tools is commonplace. They indicate the discomfort felt by some students as a result of introducing a tool that can identify non-compliance with standards.

Training issues

Staff

It was difficult to arrange convenient times to bring staff together from different schools for sessions, in order to benefit from economies of scale. Most training conducted during the pilot was on a one to one basis, which was resource intensive for the FEU and would not be sustainable in a full roll-out. The software was considered relatively easy to use when starting out. However, configuring Turnitin to provide the appropriate submission model for students, and appreciating what effect these settings would have on student use, was not intuitively obvious. Reading the Turnitin reports caused concern for some staff in trying to determine which papers they should investigate.

Of even greater importance is raising staff awareness of academic integrity issues, such as why students plagiarise, how it is done, and how to design assessment tasks to minimise the opportunity for plagiarism, as well as information about university policies and procedures regarding plagiarism. As pilot participants were new to the use of Turnitin, it is not surprising that their efforts were initially concentrated on mastering the software before considering changes to their teaching practice. This behaviour is described by the model of technology adoption developed by Sandholtz, Ringstaff and Dwyer (as cited in Torrisi & Davis, 2000, p. 169). Pilot participants would be considered to be in either the entry or adoption stage of this model, learning to use the technology and applying it to their current teaching practices. They will require a higher level of confidence with the tool before being ready to look at issues such as changing assessment practices.

Staff need to be comfortable with issues such as how much needs to be copied before it is considered plagiarism. Issues related to academic integrity are frequently not black and white, as highlighted in the excellent article by Devlin (2003). Encouraging discussion on such issues and seeking agreement is an important component of managing academic integrity.

Students

Students experienced few difficulties using Turnitin, as indicated by no requests being received by local support services. Participants in the focus group felt that first year students, and particularly international students, needed to be effectively inducted into the university's policy on plagiarism and referencing standards. These students wanted information relating to plagiarism reinforced throughout the unit, not just in the unit outline and the first week of lectures. The focus group participants also reported that different schools required different referencing standards, which caused considerable confusion for students. Participants in the student focus group were not aware of university procedures to deal with plagiarism, nor of the range of penalties that could be imposed, despite promotion of these by the university. The focus group students felt that the action which had most raised students' awareness of plagiarism issues was an email from a unit coordinator announcing that two students had been found to have plagiarised.

Support issues

As the limited number of support issues encountered by staff and students using Turnitin were not routine, the project leader handled them through the CAVAL Help Desk. Students encountered relatively few problems using Turnitin. Those that were reported included:

  1. Difficulty accessing Turnitin via Macintoshes (a list of recommended browsers and operating systems was provided by Turnitin);
  2. Student file sizes too large for Turnitin due to the inclusion of graphics - the maximum file size for Turnitin is 1.9Mb (resolved by removing the graphics);
  3. Students not able to submit multiple files to Turnitin (resolved by concatenating the files into one file).
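The student-side workarounds above (keeping a submission under the 1.9Mb limit and combining multiple files into one) are easily scripted. The Python sketch below is illustrative only; it assumes plain text assignment files and is not part of Turnitin's own tooling.

```python
import os

MAX_BYTES = int(1.9 * 1024 * 1024)  # Turnitin's reported 1.9Mb upload limit

def fits_upload_limit(path):
    """Return True if the file is small enough to submit."""
    return os.path.getsize(path) <= MAX_BYTES

def concatenate(paths, out_path):
    """Join several plain text files into the single file Turnitin expects."""
    with open(out_path, "w", encoding="utf-8") as out:
        for path in paths:
            with open(path, encoding="utf-8") as part:
                out.write(part.read())
                out.write("\n")  # separate the original files
```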

Staff also encountered relatively few problems using Turnitin. These included:

  1. Issues with accessing Turnitin from a Macintosh;
  2. Text files were only identified as such when the suffix .txt was added to the file name;
  3. Turnitin was not able to handle very early versions of Word documents (these had to be resaved in a newer version of Word to be analysed by Turnitin);
  4. Turnitin failed to identify a piece of copied text from a website (the website URL was sent to Turnitin, who included it in their 'crawl list'; the contents of the site would be added to their database within two weeks);
  5. Two or more sources for a piece of matched text can cycle rather than being excluded as requested (this was a known problem, and an individual request to reanalyse the document would have to be made to Turnitin). This problem was identified in test data used on the site and was not reported in any student work.

Administration

Central administrative requirements were minimal, although these could increase over time with the need to maintain records and provide reports. Staff involved in the pilot did not report any difficulty with the administrative responsibilities that they had.

Attitudes towards use of plagiarism detection software

Students in the focus group felt positive about the use of plagiarism detection software, as it may increase the probability of those plagiarising being caught.

Staff attending the focus group felt that the software could be used to deter academic misconduct; as a 'sharp hook' for discussion about plagiarism; and to assist students to improve their referencing. They felt it was very important that staff and students were clear about what could be expected of Turnitin - that it was just one tool of many used by the university. They felt it could be promoted to staff as a way of tracking down suspect material and as an initial tool for flagging obvious cases, and that it would give staff marking work more confidence that cases of plagiarism would be detected. They felt it could be promoted to students as one way of preserving the integrity of the award they receive, giving them greater confidence that they will receive the mark they deserve relative to the rest of the group.

Responses to the student questionnaire (with a limited 17% response rate) indicated that participants felt Turnitin made the results of assessment fairer (60%), and less than 20% would not recommend Turnitin to other students. However, respondents indicated a very strong preference that students always know when Turnitin is to be used, always have access to Turnitin's analysis of their work, and always be able to resubmit their work after seeing the initial Turnitin report. This again highlights students' discomfort at the possible regulatory use of Turnitin.

Semester 2 pilot 2004

At the conclusion of the first semester pilot a number of issues required resolving before a wider implementation of Turnitin could be planned and these are being addressed in the extended pilot during second semester. These issues include:

Conclusion and summary

Reasonable levels of participation in the pilot and beyond are only possible with high level commitment within the university to addressing academic integrity issues and the use of software as one method to detect plagiarism. A high level commitment is also necessary if the institution is to address the Pandora's box of associated legal and policy issues. Without the support provided by the project sponsor and steering committee, the number and extent of issues encountered during the semester one pilot could have resulted in the project being delayed or abandoned.

Interestingly, we found that students were more aware of the extent and nature of plagiarism activity within the institution than were staff. Students were also very supportive of university initiatives to detect and punish cases of misconduct, though they also considered improved student awareness and assessment practices important. The widespread implementation of plagiarism detection software at UTAS is likely to result, at least initially, in an increased level of detection. Dealing with such an increase will be difficult for all sectors of the university community. It is important that academic staff and relevant committees are aware of this possible phenomenon and its causes so that it is not misinterpreted, and briefings have been undertaken to address this.

On reflection, we can see that we were optimistic in our ambition to both introduce plagiarism detection software and encourage staff to adopt measures, in relation to the assessment practices for instance, which would reduce the possibility of plagiarism. This was too wide a range of changes to be made in one step. Staff involved in the pilot were focused on how to use the software and we are now planning a strong staff development program for 2005 to encourage staff to review teaching practice. In our view, a staged approach over a period of years is required.

Workloads and ease of use are important issues. The first semester pilot highlighted that staff and students will only use plagiarism detection software if it is easy to use and does not add to their workloads. In particular, reviewing Turnitin reports was a workload issue for staff that had to be addressed to enable wider adoption of the software. As a result, the guidelines for the extended semester two pilot include two options for the review of reports, designed to limit the impact on staff workload.

At the end of Semester two we should have a model, with options and guidelines, which will enable the effective and sustainable use of plagiarism detection software, and is also acceptable to staff and students.

Plagiarism detection software is viewed at UTAS as one tool in a broader approach to addressing issues of academic integrity. We believe it is important to focus on the development of an educative and developmental approach with students, embedding good practice in scholarship and academic referencing. In late 2004 and 2005, more work will need to be undertaken with staff on embedding this practice in curriculum and on raising awareness of the importance of compliance in the application of policy and procedures, and of consistency in the application of penalties. This will inevitably move many staff beyond their comfort zones.

Looking to the future from a technological perspective, Turnitin is only the beginning of the application of technology to detect plagiarism. It addresses only text documents; other tools are in development and use to detect plagiarised computer code and to identify authorship. These tools will assist in identifying cases of academic misconduct. However, relying heavily on technology to provide a "purely 'catch and punish' approach ... will simply lead to a never ending 'arms race' between the students and the university" (Carroll, 2002).

Reference list

Carroll, J. (2002). A Handbook for Deterring Plagiarism in Higher Education. Oxford: The Oxford Centre for Staff and Learning Development.

Devlin, M. (2003). The problem with plagiarism. Campus Review, Nov 12-18 2003, 4-5.

James, R., McInnis, C. & Devlin, M. (2002). Assessing Learning in Australian Universities. Centre for the Study of Higher Education, University of Melbourne. [29 Dec 2002, verified 23 Oct 2004] http://www.cshe.unimelb.edu.au/assessinglearning/

O'Connor, S. (2003). Cheating and electronic plagiarism - scope, consequences and detection. Paper presented at the EDUCAUSE conference, May 2003, Adelaide. [28 Jul 2004, verified 23 Oct 2004] http://www.caval.edu.au/about/staffpub/docs/Cheating%20and%20electronic%20plagiarism%20-%20scope,%20consequences%20and%20detection%20EDUCASUE%20May%202003.doc

Torrisi, G. & Davis, G. (2000). Online Learning as a catalyst for reshaping practice - The experiences of some academics developing online learning materials. The International Journal for Academic Development, 5(2), 166-76.

Turnitin Tour. [20 Jul 2004] http://www.turnitin.com/static/tour/tour_master.html

Webster's Third New International Dictionary of the English Language, Unabridged (1993). Cologne: Konemann.

Authors: Sue Mulcahy. Flexible Education Unit, Office of Pro Vice-Chancellor Teaching and Learning, University of Tasmania, Private Bag 133, Hobart 7001. Sue.Mulcahy@utas.edu.au
Christine Goodacre, Director, Flexible Education Unit, Office of Pro Vice-Chancellor Teaching and Learning, University of Tasmania, Private Bag 133, Hobart 7001. Christine.Goodacre@utas.edu.au

Please cite as: Mulcahy, S. & Goodacre, C. (2004). Opening Pandora's box of academic integrity: Using plagiarism detection software. In R. Atkinson, C. McBeath, D. Jonas-Dwyer & R. Phillips (Eds), Beyond the comfort zone: Proceedings of the 21st ASCILITE Conference (pp. 688-696). Perth, 5-8 December. http://www.ascilite.org.au/conferences/perth04/procs/mulcahy.html

© 2004 Sue Mulcahy & Christine Goodacre
The authors assign to ASCILITE and educational non-profit institutions a non-exclusive licence to use this document for personal use and in courses of instruction provided that the article is used in full and this copyright statement is reproduced. The authors also grant a non-exclusive licence to ASCILITE to publish this document on the ASCILITE web site (including any mirror or archival sites that may be developed) and in printed form within the ASCILITE 2004 Conference Proceedings. Any other usage is prohibited without the express permission of the authors.

