
Interactive QuickTime: Developing and evaluating multimedia learning objects to enhance both face to face and distance e-learning environments

Thomas Cochrane
Learning Technologies
Unitec New Zealand
This paper discusses the application of learning objects in supporting the teaching of principles of Audio Engineering - a subject traditionally taught face to face utilising specialised audio equipment. It will demonstrate how creating interactive multimedia learning objects can enhance traditional teaching methods - moving beyond the 'comfort zone', and potentially providing virtual learning environments for online delivery. The paper will also focus upon the application of the multimedia architecture 'QuickTime' - which was chosen because of its cross platform capability, multi platform delivery and scalability (Internet, CDROM, Hard drive...), ease of authoring, high level of interactivity possible, and excellent audio capabilities. The insights gained from developing and evaluating several interactive learning objects will be highlighted.


... a learning object is defined as any entity, digital or non-digital, that may be used for learning, education or training. (LTSC, 2002)

Learning object: Any digital resource that can be reused to support learning. The term learning objects generally applies to educational materials designed and created in small chunks for the purpose of maximizing the number of learning situations in which the resource can be utilized. (Wiley, 2002)

As a relatively new concept within education, learning objects are the subject of ongoing debate regarding their definition. Wiley criticises the LTSC (Learning Technology Standards Committee) definition of learning objects for being so broad that it "encompasses the whole universe" (Wiley, 2000). Most definitions focus upon digital learning objects, due to their ease of distribution and reusability. Douglas (2001) describes learning objects as small independent learning components that have metadata, and can be used individually or combined and modified for multiple purposes - essentially 'building blocks' for instruction. Such broad definitions of learning objects have led to the Lego block analogy: small components that can be combined to create a larger unit. However, many people argue that learning objects must have a context and implicit or explicit pedagogy to be classified as learning objects rather than simply 'information objects'. Wiley prefers an 'atom' analogy (Wiley, 1999) that emphasises the context element of learning objects. Just as only certain atoms can combine with other atoms to form specific molecules, so there must also be design in the combining of learning objects. A learning object's context will determine its level of compatibility with other learning objects.

Aspects of learning object design that have been the subject of recent research include: reusability, metadata standards, learning object repositories, the adaptation of instructional design principles to learning object design, and evaluation criteria for learning objects.

Reusability of learning objects has become a major focus of their development. Several studies emphasise the need to plan for reusability in the earliest stages of learning object design (Boyle, 2002; Currier & Campbell, 2002; McNaught, Burd, Whithear, Prescott & Browning, 2002). Research has focused upon the need to create learning objects that are small enough to enable re-purposing, while retaining enough context information to remain educationally useful. This size/scope relationship has been given the term 'granularity'. South and Monson (2000) provide a useful definition of the granularity of learning objects, defining a 'learning threshold' and a 'context threshold' within which learning objects sit. They describe the trade-off for achieving useful granularity as an increased need for metadata and storage. South and Monson also provide an economic argument for the reusability of learning objects: although the initial development cost of learning objects is high, the ability to reuse these objects across a wide range of courses, and to repackage them for distance delivery courses, will ultimately save money.

Metadata is descriptive information about a learning object: its content, technical requirements, author, copyright, and so on. Metadata standards have been developed by several international organisations, and there is ongoing collaboration to make these standards interoperable. For learning objects to be re-purposed, they need to be categorised according to content and context within a standard 'library' format. This information is needed so teachers and students can search for and find appropriate learning objects. Metadata allows learning objects to be correctly categorised within learning object repositories - libraries of either actual learning objects, or databases of hyperlinks to catalogued learning objects. Metadata standards are not a major focus of this research; however, a standard template was utilised to provide appropriate metadata for all learning objects created as part of the project.

As the research project is focused upon the educational design of learning objects, an investigation into instructional design principles was included. While there is a great body of research on traditional instructional design, there is relatively little research specific to design principles for learning objects: "There are relatively few studies and tools relating to the systematic analysis, design and documentation that should precede construction and delivery..." (Douglas, 2001). The pedagogical usefulness of learning objects is a major concern of researchers. Williams (2000) argues that evaluation of learning objects should parallel their development, advocating a participant oriented approach to evaluation embedded within the ADDIE development model (Assess needs, Design, Develop, Implement, and Evaluate instruction). Williams also argues that evaluation of learning objects should be both 'external' and 'internal'. A similar approach was adopted in this research project.

One of the most mature approaches to evaluating learning objects is that utilised by the MERLOT (1997) repository. The MERLOT peer review process is modelled on the scholarly peer review process of peer reviewed journals (Hanley, 2003). The goal of adopting this approach to the evaluation of learning objects is to encourage their adoption within the higher education community. The MERLOT repository is divided into several subject categories/communities, with an editorial board for each category. Experts in each field are allocated to review the learning objects submitted within each category. The result of the peer review process is a rating from one to five, plus comments, for every learning object reviewed. The rating scale represents the following:

  1. Materials not worth using at all.
  2. Materials do not meet minimal standards but there might be some limited value.
  3. Materials meet or exceed standards but there are some significant concerns.
  4. Materials are very good overall but there are a few minor concerns.
  5. Materials are excellent all around.
The ratings are used to give preferential listing in searches of the MERLOT repository, and provide users with a quick indication of the quality and usefulness of a learning object. Learning objects with ratings of less than 3 are not displayed.
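The listing behaviour described above can be sketched in a few lines. The catalogue names and ratings below are invented examples, and MERLOT's actual implementation is not public; this is only an illustration of the threshold-and-sort idea:

```python
# Sketch of how a MERLOT-style search listing might apply the rating
# threshold: objects rated below 3 are hidden, and the remainder are
# listed best-first. The catalogue entries here are invented examples.

catalogue = [
    ("Interactive audio mixer", 3.95),
    ("Static lecture notes", 2.40),
    ("Microphone chooser", 4.20),
]

def search_listing(objects, threshold=3.0):
    """Hide objects rated below the threshold, then sort best-first."""
    visible = [(name, rating) for name, rating in objects if rating >= threshold]
    return sorted(visible, key=lambda item: item[1], reverse=True)

for name, rating in search_listing(catalogue):
    print(f"{name}: {rating}")
```

In this sketch the 'Static lecture notes' entry, rated below 3, would never appear in a search, while the remaining objects are ordered by rating.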

The research project

This research project involved developing and piloting multimedia learning objects during 2003/2004. It applies research and instructional design principles to enhance learning via multimedia learning objects within the context of Audio Engineering, and employs an action research method implemented over a period of two years.

Current educational thought postulates that learning objects can provide the basis for reusable, pedagogically rich learning materials. The project tested these assumptions and demonstrates that although pedagogically rich learning objects for teaching principles of Audio Engineering are currently scarce within New Zealand, they can be designed and delivered to meet the goals of providing reusable and pedagogically rich learning environments within this traditionally face to face taught discipline. Few multimedia learning resources for audio engineering are available in New Zealand, although some do exist (SAE Institute, 2001). Most resources, whether online (Hambly, 2002) or on CDROM, are text and audio based only, providing little more than replacements for textbooks, although they do usually contain audio examples (Everest, 1997). There are a couple of notable exceptions (Neumann, 1996; Sides, 1995); however, these are large resources that were not designed as learning objects. Some resources are outdated, both technically and in content, while most focus upon text or audio examples only, with little user interactivity. The general mode for learning the principles of audio engineering is still an on campus/site, hands on approach. Some online examples (Gibson, 2000; Mellor, 2001, 2003) do however provide email and discussion group support for students. In general these resources are designed as complete, unique packages or courses, and have not been designed for re-purposing.

The objective of this research was to utilise established instructional design principles to produce interactive learning resources based upon appropriate learning objects. These resources are intended to support both distance and face to face learning. By using learning objects, the context and outcomes can be modified by educators to suit the needs of different users. Two distinct groups of users were targeted to test the reusability of these learning objects: Audio Engineering tertiary students (at MAINZ - the Music & Audio Institute of New Zealand, where the researcher taught during the implementation of this research), and Church Sound Engineers. The first group of users (MAINZ tutors and their students) encompasses a range of Audio Engineering related courses, including: Certificate of Foundation Studies in Music - Level 3, Certificate of Live Sound - Level 4, Certificate of Audio Engineering - Level 5, Diploma of Contemporary Music Performance - Level 5, and Diploma of Audio Engineering - Level 6. Different courses with different curricula and levels thus provided an indication of the reusability of the learning objects. The second group of users (Church Sound Engineers and their teams) consisted of representatives from five medium to large contemporary church congregations throughout Auckland. The churches included:

The principal sound operator for each church was asked to evaluate each of the learning objects throughout the period of the study. This group was chosen because there is a noticeable lack of part time or flexible training opportunities for Church Sound Engineers in New Zealand; appropriate learning objects could help fill this gap. Their goals were oriented towards gaining practical skills for a specific situation (sound for church services) rather than meeting the demands of a tertiary curriculum like the MAINZ students, and they did not have the benefit of expert tuition or student peers. The research involved designing and delivering several appropriate learning objects to these distinct groups of users. The users evaluated each learning object over the period of a semester. Data collected from these evaluations was used to inform the development of new learning objects to be evaluated in the following semester by the same groups of users. There were four iterations of design, delivery and evaluation over the period of the study. The project provides reflective feedback from the target users for modification and development of the resources.

Research questions

  1. Can learning objects be designed that are reusable for learning concepts in Audio Engineering - a traditionally face to face taught discipline?
  2. Can these learning objects support a high level of learner interactivity and interest, and thus provide pedagogically rich learning environments that engage and motivate the learner?

Project methodology

The project is qualitative in nature, and uses a small 'sample' of participants evaluating the developed learning objects. The project was conducted part time over a two year period, and used action research as its methodology, involving one research cycle per semester. The action research cycles provided time for reflection and feedback between researching and developing an appropriate interactive learning resource and trialling the resource with users. This reflection and feedback provided data on the success of the pedagogy embedded within each resource, and on areas needing modification. Action research provides a close fit with the researcher's own view of education (transformative - seeking to produce change) and preference for qualitative rather than quantitative research. Action research also provides a close fit with the underlying philosophy of learning objects - constructivism (Bannan-Ritland, Dabbagh & Murphy, 2000). Wadsworth (1998) identifies the key characteristics of 'participatory action research': the researcher is a participant; the researcher is the main research instrument; it is cyclical in nature, involving action followed by reflection followed by informed action; and it is concerned with producing change. This change is ongoing throughout the process, and the research seeks input from participants/stakeholders. The design and development of learning objects follows the classic cyclical nature of action research, and educational research often puts the researcher in the key role of prime collector of data (Wadsworth, 1998). Qualitative research provides rich data for educational situations (Hoepfl, 1997). Data collection methods included focus groups and keeping a reflective journal.

Table 1: Data gathering

Data gathering process: Description
Initial needs analysis: A short survey deployed to MAINZ tutors to establish the perceived need for interactive learning objects within the context of Audio Engineering, and suggestions for concepts to be covered.
Diploma of Audio Engineering student evaluations: A short evaluation sheet to gain initial feedback on pre-release versions of the learning objects from the researcher's own group of students.
MAINZ tutor and student evaluations: Each learning object was delivered to MAINZ tutors on CDROM with a paper copy of the modified MERLOT evaluation form to fill out. Tutors were asked to nominate a couple of students within their course to evaluate the learning objects as well.
Church sound operator evaluations: Each learning object was delivered to selected church sound operators on CDROM with a paper copy of the modified MERLOT evaluation form to fill out.
Focus groups: Two focus groups were convened, one consisting of representatives from the MAINZ tutors, and the other consisting of representatives from the church sound operators.
Web deployment of learning objects: The learning objects were progressively uploaded to a web server for access via the Internet.
Feedback from international experts: Several international audio experts were contacted to evaluate the learning objects supplied on the web site, using the modified MERLOT evaluation form.
MERLOT evaluations: After completion, the four learning objects were contributed to the MERLOT repository for MERLOT peer evaluation.
Reflective journal: A reflective journal was kept detailing the key events of the project.

Evaluators were given a copy of each learning object on CD (cross platform for both Macintosh and Windows operating systems), and an evaluation form. As discussed earlier, the MERLOT learning object evaluation criteria are well developed, and have been in use for several years. It was therefore decided to use the MERLOT (2000) criteria as the basis for evaluating the learning objects developed for this research. To make the evaluation process simpler for the evaluators, the checklist version of the MERLOT evaluation criteria was used. This checklist was developed as part of a conference workshop (Bennett & Metros, 2001). The evaluation criteria are divided into questions covering the three main criteria: reusability, interactivity and pedagogy. The questions were modified to better reflect the context and research questions of this project. They are structured as Likert scale rating responses to statements regarding the learning object, plus a section for long answer comments for each category. There are eight questions regarding reusability, and six questions in each of the interactivity and pedagogy categories. The evaluation form questions were briefly moderated by a couple of MAINZ tutors before being finalised.

The goal of the project was to develop learning objects that were scalable across a variety of delivery formats. The learning objects were supplied to evaluators on CDROM, but were also made available from the Internet. As each learning object was developed, it was uploaded to a web server to test its accessibility and download times over the Internet. Once all four learning objects were developed, a web site was created to provide access and information that could be updated as feedback was received; any 'bugs' found in the learning objects could be corrected and the updated learning objects made available from the website. The website delivery necessitated the inclusion of metadata about the learning objects. A web based form (Koch & Borell, 1997) was used to create metadata in the Dublin Core format for each learning object. This metadata was included as a text document associated with each learning object, and the meta tags generated by the online form were also pasted into the head section of each web page holding the learning objects. This allows search engines to correctly categorise the learning objects. The website also provided access to the learning objects for the international evaluators.
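As a rough illustration of the kind of Dublin Core metadata involved, the sketch below renders a metadata record as the HTML meta tags that would sit in a page head. Both the helper function and the field values are hypothetical examples, not the project's actual metadata or the output of the Koch & Borell template:

```python
# Sketch: rendering a Dublin Core record as HTML <meta> tags suitable
# for pasting into the head section of a web page. The record values
# below are invented for illustration.

from html import escape

def dublin_core_meta_tags(record):
    """Render a dict of Dublin Core elements as DC.* meta tags."""
    tags = []
    for element, value in record.items():
        tags.append('<meta name="DC.%s" content="%s">' % (element, escape(value, quote=True)))
    return "\n".join(tags)

record = {
    "Title": "Interactive Audio Mixer",   # hypothetical learning object
    "Creator": "Thomas Cochrane",
    "Subject": "Audio Engineering; mixing desk operation",
    "Type": "InteractiveResource",
    "Format": "video/quicktime",
    "Language": "en",
}

print(dublin_core_meta_tags(record))
```

Tags of this form are what allow search engines and repositories to categorise a learning object by title, creator, subject and format.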

Although the study is qualitative, some quantitative data and basic analysis was helpful, especially in providing a way to summarise trends and comparisons in evaluation feedback. The MERLOT evaluation process includes a numerical assignment from 1 to 5 for each evaluation question. This allows an average overall rating from 1 to 5 to be assigned to an evaluated learning object. This gives potential users of the learning objects immediate feedback as to the overall quality of the learning object. This practice was adopted by this study, to provide a quick comparative rating for each learning object. Because the MERLOT evaluation criteria questions were modified to the context of the research project, the resulting rating is called an 'equivalent MERLOT rating'. An overall rating is derived from the average of all the evaluator responses, and separate ratings are derived from each separate category of evaluators to give a comparison in ratings given for each learning object when evaluated in different learning contexts. A comparison of the equivalent MERLOT rating for each learning object from the three main evaluation groups (MAINZ Tutors, Church Sound Engineers, and MAINZ students) is then made possible.
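The derivation of the equivalent MERLOT rating can be sketched as a simple averaging exercise over the Likert responses, overall and per evaluator group. The group names follow the paper, but the scores below are invented for illustration and are not the study's data:

```python
# Sketch of deriving an 'equivalent MERLOT rating' by averaging Likert
# responses (1-5), overall and per evaluator group. The scores are
# invented for illustration only.

from statistics import mean

# One inner list of question scores per evaluator.
responses = {
    "MAINZ tutors": [[5, 4, 4, 5], [4, 5, 5, 4]],
    "Church sound engineers": [[4, 3, 4, 4], [3, 4, 4, 4]],
    "MAINZ students": [[4, 5, 4, 4]],
}

def group_rating(evaluators):
    """Average every question response across a group's evaluators."""
    return round(mean(score for ev in evaluators for score in ev), 2)

# Overall rating: the average of all responses from all evaluators.
overall = round(mean(
    score for evaluators in responses.values()
    for ev in evaluators for score in ev), 2)

for group, evaluators in responses.items():
    print(f"{group}: {group_rating(evaluators)}")
print(f"Overall equivalent MERLOT rating: {overall}")
```

Computing the per-group averages alongside the overall average is what makes the between-context comparison (tutors versus church sound engineers versus students) possible.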

Learning object design

The learning objects developed and delivered in this research project are digital and software based. Learning objects can be viewed as small interactive multimedia elements. Commercial multimedia authoring software provides many tools that can be used to develop learning objects and deliver them across a variety of platforms, for example Macromedia Flash (Heins & Himes, 2002). While a range of authoring applications was utilised in the development of the learning objects - including LiveStage Pro, Flash, QuickTime Pro, Dreamweaver, HTML and JavaScript, QuickTime VR Studio, and Final Cut Pro - the key multimedia format used was QuickTime.

QuickTime (Apple Computer, 2004; Beverly, 2004) was chosen as the architecture for authoring and delivery of the learning objects for several reasons: its cross platform capability; its multi platform delivery and scalability (Internet, CDROM, hard drive); its ease of authoring; the high level of interactivity possible; and its excellent audio capabilities.

QuickTime is a track based multimedia architecture. A QuickTime movie can contain over two hundred different media formats and an unlimited number of tracks, and supports a wide range of state of the art compression codecs. It also supports JavaScript, Java and its own scripting language (QuickTime script), and is based on XML.

Currently only one authoring application taps the full potential of interactive QuickTime development - LiveStage Pro from Totally Hip Software (2003). This required learning a new authoring application and scripting language; however, it is very similar to Flash and Director. Those authoring applications were unsuitable (by themselves) for developing the audio related learning objects, as they support only eight tracks of audio and provide limited synchronisation of tracks. Some of the learning objects developed have over twenty four audio tracks synchronised together within QuickTime.

Using QuickTime allowed certain levels of 'real time' control and signal processing of the audio examples included in this project. Certain aspects of the learning objects were therefore simulated rather than implemented as actual processing of the audio files. More powerful real time processing could have been achieved by writing cross platform digital signal processing software, but this was beyond the expertise and time frame of this study.

Content areas were chosen from the researcher's experience gained from teaching Audio Engineering - key concepts that students generally find difficult to grasp - and from feedback from a survey of potential end users. Learning objects were developed to illustrate some of the following concepts in Audio Engineering: operation of an audio mixing desk, achieving an appropriate mix, microphone choice and placement, dynamics processing, and equalisation. The learning object interface is presented as a floating window on the user's desktop (or as an element within an HTML page if web delivered). This presentation format was chosen rather than a full screen mode, to emphasise the learning 'object' concept as a piece of the learning rather than an end in itself (see the following screenshot examples). The structure of the learning objects is designed for learner investigation, rather than forcing a linear progression through the sections of each.

Figure 1: Screenshot of the main window of learning object 1 - an interactive audio mixer

The graphical interface of the learning objects is designed to resemble the actual physical equipment that users would normally be working with. This meant a departure from the graphical designs of commercial audio software and plug-ins, which tend to 'jazz up' their interface elements, towards a more 'conservative' graphical representation. In response to user feedback, the first page of each learning object opens with an overview and instructions for its use and navigation. Initially the author made the instructions available via a clickable '?' icon, but some users did not associate this with instructions and initially felt lost. Context is established by images of typical professional quality audio equipment - equipment that students actually use within their courses and in live sound environments - and the use of actual audio files recorded using this equipment. Virtual environments are created using the panoramic and cubic virtual scene capabilities of QuickTime. Text is kept to a minimum, as the emphasis is upon recreating an immersive simulated environment rather than the typical textbook approach. Pedagogy is embedded within each learning object by providing opportunities for interactive feedback and formative assessment. In the example below (Figure 2), the user selects an example audio mix (from different genres) and then must identify the microphones used in recording the mix by listening to the characteristics of the mix. The 'home' section of this learning object provides comparison audio files for the user to practise identifying the characteristic sound of each different microphone. All four learning objects can be found on the author's web site.

Figure 2: Screenshot of the main window of learning object 2: An interactive microphone chooser

Learning object evaluation

Below are the average equivalent MERLOT ratings for each learning object from all evaluators.

  Learning object 1: equivalent MERLOT rating = 3.95
  Learning object 2: equivalent MERLOT rating = 4.20
  Learning object 3: equivalent MERLOT rating = 4.09
  Learning object 4: equivalent MERLOT rating = 4.22

These results indicate that the learning objects were generally highly rated by all users.

Below are example summaries of user feedback gathered from the modified MERLOT evaluation form for the learning objects. The charts provide a quick comparison between the two main contexts within which the learning objects were evaluated.

Feedback from tutors included: "It looks great! I like the interactivity and the ability to store mixes", "I could definitely see us being able to use that!", "Really good - I learnt a lot of useful information in a short space of time", and "Focused, clear intention of learning object. Useful for a wide range of students, not just Audio Engineers". Feedback indicated that more experienced users and people with teaching experience can see the potential in the learning objects better than first time users or inexperienced teachers. MAINZ tutors consistently rated the learning objects higher in every category than the Church sound engineers. Feedback from the group of church sound engineers was positive, but they tended to struggle with the computer interface more than the MAINZ users. A common theme was the need to make instructions or 'help' much more obvious, as it appears several people did not discover the built in 'help' page in the first learning object.

Figure 3: Summary of responses to evaluation questions for learning object 1

Figure 4: Summary of responses to evaluation questions for learning object 2

Feedback on the functionality and interface focused upon extending the functionality, adding a wider choice of audio tracks, and 'modernising' the 'look'. Most users were satisfied with the level of interactivity available within the QuickTime architecture. Feedback from international evaluators is still being returned, but indications so far are of a very positive response to the learning objects. Overall there has been a very positive response from users to the learning objects developed so far. Some tutors began to see the possibility of teaching audio principles online. The research project continues until the end of this year, so more feedback from users will be gathered, evaluated and utilised to improve the next learning objects.

Figure 5: Summary of responses to evaluation questions for learning object 3

Figure 6: Summary of responses to evaluation questions for learning object 4


Conclusion

Development of the learning objects took much longer than initially anticipated. However, the choice of QuickTime as the architecture, and the implementation of instructional design principles, provided the basis for developing interactive learning objects that successfully enhanced the learning of a wide range of users at different levels and in different contexts. Implementing a participant oriented evaluation process within the design cycle of the learning objects provided useful feedback for their modification. The interactivity and learner control of the learning objects were highly valued by evaluators. Trained educators saw the pedagogical possibilities of the learning objects more readily than the Church sound operators. This is to be expected, as modifying an educational context does require expertise (otherwise learning objects could replace tutors). The use of action research produced a research project with real world, tangible results that will benefit the researcher's educational practice and the wider field of Audio Engineering within New Zealand. Future recommendations for the study include obtaining evaluations of the learning objects from a wider range of users, e.g. from international audio courses.


References

Apple Computer (2004). Why QuickTime?

Bannan-Ritland, B., Dabbagh, N. & Murphy, K. (2000). Learning object systems as constructivist learning environments: Related assumptions, theories, and applications. In D. Wiley (Ed.), The instructional use of learning objects. Bloomington: Association for Educational Communications and Technology.

Bennett, K. & Metros, S. (2001). Learning object/Module Checklist. [viewed 23 Feb 2003, verified 13 Oct 2004]

Beverly, B. (2004). QuickTime: The on-line, cross-platform, multimedia architecture of the present and future. Wright State University. [verified 13 Oct 2004]

Boyle, T. (2002). Towards a theoretical base for educational multimedia design. Journal of Interactive Media in Education, 2, 1-16.

Currier, S. & Campbell, L. M. (2002). Evaluating Learning Resources for reusability: The "DNER & learning objects" Study. In Winds of Change in the Sea of Learning: Proceedings 19th ASCILITE Conference. Auckland, New Zealand: UNITEC Institute of Technology.

Douglas, I. (2001). Instructional design based on reusable learning objects: Applying lessons of object-oriented software engineering to learning systems design. Paper presented at the 31st ASEE/IEEE Frontiers in Education Conference, Reno, NV, 10-13 October.

Everest, F. A. (1997). Critical Listening and Auditory Perception: The complete audio-visual training course. On Mix Pro Audio Series [CDROM & Textbook/guide]. Emeryville: MixBooks.

Gibson, B. (2000). AudioPRO Recording Courses. [viewed 29 Jan 2003, not found 13 Oct 2004]

Hambly, C. (2002). Audio Courses Distance Learning Online Sound Engineering School. [viewed 3 Dec 2002, verified 13 Oct 2004]

Hanley, G. (2003). Online Resource: MERLOT: Peer-to-Peer Pedagogy. Syllabus (Campus Technology). [verified 13 Oct 2004]

Heins, T. & Himes, F. (2002). Creating learning objects with Macromedia Flash MX. [viewed 1 Feb 2003, verified 13 Oct 2004]

Koch, T. & Borell, M. (1997-1998). Dublin Core Metadata Template. [viewed July 2003, verified 13 Oct 2004]

LTSC (2002). IEEE Learning Technology Standards Committee. [viewed 2 Mar 2003, verified 13 Oct 2004]

McNaught, C., Burd, A., Whithear, K., Prescott, J. & Browning, G. (2002). It takes more than metadata and stories of success: Understanding barriers to reuse of computer-facilitated learning resources. In Winds of Change in the Sea of Learning: Proceedings 19th ASCILITE Conference. Auckland, New Zealand: UNITEC Institute of Technology.

Mellor, D. (2001). Distance Learning for Sound Engineering and Music Recording. [viewed 12 Mar 2003, verified 13 Oct 2004]

Mellor, D. (2003). [viewed 12 Mar 2003, verified 13 Oct 2004]

MERLOT (1997). MERLOT: Multimedia Educational Resource for Learning and Online Teaching. [viewed 13 May 2002, verified 13 Oct 2004]

MERLOT (2000). Evaluation Standards for Learning Materials in MERLOT. [viewed 23 Feb 2003, verified 13 Oct 2004]

Neumann, G. (1996). Sound Engineering Contest 1998 [CDROM]. Berlin: Georg Neumann.

SAE Institute (2001). Reference Material Center. [viewed 13 May 2002, verified 13 Oct 2004]

Sides, A. (1995). Allen Sides' microphone cabinet [CDROM]. Emeryville: Cardinal Business Media Inc.

South, J. B. & Monson, D. W. (2000). A university-wide system for creating, capturing, and delivering learning objects. In D. Wiley (Ed), The instructional use of learning objects. Bloomington: Association for Educational Communications and Technology.

Totally Hip Software (2003). LiveStage Pro. [verified 13 Oct 2004]

Wadsworth, Y. (1998). What is participatory action research? Action Research Resources. [viewed 3 May 2002, verified 13 Oct 2004]

Wiley, D. A. (1999). The Post-LEGO learning object. [viewed 8 Oct 2003, verified 13 Oct 2004]

Wiley, D. (2000). Connecting learning objects to instructional design theory: A definition, a metaphor, and a taxonomy. In D. Wiley (Ed), The instructional use of learning objects. Bloomington: Association for Educational Communications and Technology.

Wiley, D. (2002). Learning objects - a definition. In A. Kovalchick & K. Dawson (Eds), Educational Technology: An Encyclopedia. Santa Barbara: ABC-CLIO. [verified 13 Oct 2004]

Williams, D. D. (2000). Evaluation of learning objects and instruction using learning objects. In D. Wiley (Ed.), The instructional use of learning objects. Bloomington: Association for Educational Communications and Technology.

Author: Thomas Cochrane can be contacted on

Please cite as: Cochrane, T. (2004). Interactive QuickTime: Developing and evaluating multimedia learning objects to enhance both face to face and distance e-learning environments. In R. Atkinson, C. McBeath, D. Jonas-Dwyer & R. Phillips (Eds), Beyond the comfort zone: Proceedings of the 21st ASCILITE Conference (pp. 201-211). Perth, 5-8 December.

© 2004 Thomas Cochrane
The author assigns to ASCILITE and educational non-profit institutions a non-exclusive licence to use this document for personal use and in courses of instruction provided that the article is used in full and this copyright statement is reproduced. The author also grants a non-exclusive licence to ASCILITE to publish this document on the ASCILITE web site (including any mirror or archival sites that may be developed) and in printed form within the ASCILITE 2004 Conference Proceedings. Any other usage is prohibited without the express permission of the author.
