This paper discusses the application of learning objects in supporting the teaching of principles of Audio Engineering - a subject traditionally taught face to face utilising specialised audio equipment. It will demonstrate how creating interactive multimedia learning objects can enhance traditional teaching methods - moving beyond the 'comfort zone' - and potentially provide virtual learning environments for online delivery. The paper will also focus upon the application of the multimedia architecture 'QuickTime', which was chosen for its cross-platform capability, multi-platform delivery and scalability (Internet, CDROM, hard drive...), ease of authoring, high level of interactivity possible, and excellent audio capabilities. The insights gained from developing and evaluating several interactive learning objects will be highlighted.
... a learning object is defined as any entity, digital or non-digital, that may be used for learning, education or training. (LTSC, 2002)

As a relatively new concept within education, learning objects are the subject of debate regarding their definition. Wiley criticises the LTSC (Learning Technology Standards Committee) definition of learning objects for being so broad that it "encompasses the whole universe" (Wiley, 2000). Most definitions focus upon digital learning objects, due to their ease of distribution and reusability. Douglas (2001) describes learning objects as small, independent learning components that have metadata, and can be used individually or combined and modified for multiple purposes - essentially 'building blocks' for instruction. Such broad definitions of learning objects have led to the Lego block analogy: small components that can be combined to create a larger unit. However, many people argue that learning objects must have a context and an implicit or explicit pedagogy to be classified as learning objects rather than simply 'information objects'. Wiley prefers an 'atom' analogy (Wiley, 1999) that emphasises the context element of learning objects. Just as only certain atoms can combine with other atoms to form specific molecules, so there must also be design in the combining of learning objects. A learning object's context will determine its level of compatibility with other learning objects.
Learning object: Any digital resource that can be reused to support learning. The term learning objects generally applies to educational materials designed and created in small chunks for the purpose of maximizing the number of learning situations in which the resource can be utilized. (Wiley, 2002)
Aspects of learning object design that have been the subject of recent research include: reusability, metadata standards, learning object repositories, reworking instructional design principles for application to learning object design, and evaluation criteria for learning objects.
Reusability of learning objects has become a major focus of their development. Several studies emphasise the need to plan for reusability in the earliest stages of learning object design (Boyle, 2002; Currier & Campbell, 2002; McNaught, Burd, Whithear, Prescott & Browning, 2002). Research has focused upon the need to create learning objects that are small enough to enable re-purposing, while retaining enough context information to remain educationally useful. This size/scope relationship has been given the term 'granularity'. South and Monson (2000) provide a useful definition of granularity of learning objects, defining a 'learning threshold' and a 'context threshold' within which learning objects sit. They describe the trade-off for achieving useful granularity as the increased need for metadata and storage requirements. South and Monson also provide an economic argument for the reusability of learning objects. Although the initial development cost of learning objects is high, the ability to reuse these objects across a wide range of courses, and to repackage them for distance delivery courses, will ultimately save money.
Metadata is descriptive information about the learning object, such as its content, technical requirements, author and copyright. Metadata standards have been developed by several international organisations, and there is ongoing collaboration to make these standards interoperable. For learning objects to be re-purposed, they need to be categorised according to content and context within a standard 'library' format. This information is needed so teachers and students can search for and find appropriate learning objects. Metadata allows learning objects to be correctly categorised within learning object repositories, which are libraries of either actual learning objects or databases of hyperlinks to catalogued learning objects. Metadata standards are not a major focus of this research; however, a standard template was utilised to provide appropriate metadata for all learning objects created as part of the project.
As the research project is focused upon the educational design of learning objects, an investigation into instructional design principles was included. While there is a great body of research applied to traditional instructional design, there is relatively little research specific to design principles for learning objects. "There are relatively few studies and tools relating to the systematic analysis, design and documentation that should precede construction and delivery..." (Douglas, 2001). The pedagogical usefulness of learning objects is a major concern of researchers. Williams (2000) argues that evaluation of learning objects should parallel development. He argues for a participant oriented approach to evaluation. He then embeds these evaluation processes within the ADDIE development model (Assess needs, Design, Develop, Implement, and Evaluate instruction). Williams also argues that evaluation of learning objects should be both 'external' and 'internal'. A similar approach to this was used in the research project.
One of the most mature approaches to evaluating learning objects is utilised by the MERLOT (1997) repository. The MERLOT peer review process is modelled on the scholarly peer review process of peer reviewed journals (Hanley, 2003). The goal of adopting this approach to the evaluation of learning objects is to encourage the adoption of learning objects within the higher education community. The MERLOT repository is divided into several subject categories/communities, with an editorial board for each category. Experts in each field are allocated to review submitted learning objects within each category. The result of the peer review process is a rating from one to five, plus comments for every learning object reviewed. The rating scale represents the following:
Current educational thought postulates that learning objects can provide the basis for reusable, pedagogically rich learning materials. The project tested these assumptions and demonstrates that although pedagogically rich learning objects are currently scarce (within New Zealand) for teaching principles of Audio Engineering, they can be designed and delivered to meet the goals of providing reusable and pedagogically rich learning environments within this traditionally face to face taught discipline. Few multimedia learning resources are available in New Zealand for audio engineering, although some do exist (SAE Institute, 2001). Most resources are text and audio based only, whether online (Hambly, 2002) or on CDROM, and provide little more than replacements for textbooks, although they do usually contain audio examples (Everest, 1997). There are a couple of notable exceptions (Neumann, 1996; Sides, 1995); however, these are large resources that were not designed as learning objects. Some resources are outdated, both technically and in content, while most focus upon text or audio examples only, with little user interactivity. The general mode for learning the principles of audio engineering is still an on campus/site hands on approach. Some online examples (Gibson, 2000; Mellor, 2001, 2003) do however provide facilities for email and discussion group support for students. In general these resources are designed as complete unique packages or courses, and have not been designed for re-purposing.
The objective of this research was to utilise established instructional design principles to produce interactive learning resources based upon appropriate learning objects. These resources are intended to support both distance and face to face learning. By using learning objects, the context and outcomes can be modified by educators to suit the needs of different users. Two distinct groups of users were targeted to test the reusability of these learning objects: Audio Engineering tertiary students (at MAINZ - the Music & Audio Institute of New Zealand, where the researcher taught during the implementation of this research), and Church Sound Engineers. The first group of users (MAINZ tutors and their students) encompasses a range of Audio Engineering related courses including: Certificate of Foundation Studies in Music - Level 3, Certificate of Live Sound - Level 4, Certificate of Audio Engineering - Level 5, Diploma of Contemporary Music Performance - Level 5, and Diploma of Audio Engineering - Level 6. Different courses with different curricula and levels thus provided an indication of the reusability of the learning objects. The second group of users (Church Sound Engineers and their teams) consisted of representatives from five medium to large contemporary church congregations throughout Auckland. The churches included:
| Data gathering process | Description |
| ---------------------- | ----------- |
| Initial needs analysis | A short survey deployed to MAINZ tutors to establish the perceived need for interactive learning objects within the context of Audio Engineering, and to gather suggestions for concepts to be covered. |
| Diploma of Audio Engineering student evaluations | A short evaluation sheet to gain initial feedback on the pre-release versions of the learning objects from the researcher's own group of students. |
| MAINZ tutor and student evaluations | Each learning object was delivered to MAINZ tutors on CDROM with a paper copy of the modified MERLOT evaluation form. Tutors were asked to nominate a couple of students within their course to evaluate the learning objects as well. |
| Church sound operator evaluations | Each learning object was delivered to selected church sound operators on CDROM with a paper copy of the modified MERLOT evaluation form. |
| Focus groups | Two focus groups were convened, one consisting of representatives from the MAINZ tutors, and the other of representatives from the church sound operators. |
| Web deployment of learning objects | The learning objects were progressively uploaded to an Internet server for access via the Internet. |
| Feedback from international experts | Several international audio experts were contacted to evaluate the learning objects supplied on the web site, using the modified MERLOT evaluation form. |
| MERLOT evaluations | After completion, the four learning objects were contributed to the MERLOT repository for MERLOT peer evaluation. |
| Reflective journal | A reflective journal was kept detailing key events of the project. |
Evaluators were given a copy of each learning object on CD (cross platform for both Macintosh and Windows operating systems), and an evaluation form. As discussed earlier, the MERLOT learning object evaluation criteria are well developed and have been in use for several years, so the MERLOT (2000) criteria were used as the basis for evaluating the learning objects developed for this research. To make the evaluation process simpler for the evaluators, the checklist version of the MERLOT evaluation criteria was used. This checklist was developed as part of a conference workshop (Bennett & Metros, 2001). The evaluation criteria are divided into questions covering three main criteria: reusability, interactivity and pedagogy. The questions were modified for the context of this research project, and some were changed to better reflect its research questions. The questions are structured as Likert scale rating responses to statements about the learning object, plus a section for long answer comments in each category. There are eight questions regarding reusability, and six questions in each of the interactivity and pedagogy categories. The evaluation form questions were moderated by a couple of MAINZ tutors before being finalised.
The goal of the project was to develop learning objects that were scalable for delivery over a variety of formats. The learning objects were supplied to evaluators on CDROM, but were also made available from the Internet. As each learning object was developed, it was uploaded to a web server to test its accessibility and download times over the Internet. Once all four learning objects were developed, a web site was created to provide access and information that could be updated as feedback was received; any 'bugs' found in the learning objects could be corrected and the updated versions made available from the website. The website delivery necessitated the inclusion of metadata about the learning objects. A web based form (Koch & Borell, 1997) was used to create metadata in the Dublin Core format for each learning object. This metadata was included as a text document associated with each learning object, and the meta tags generated by the online form were also pasted into the <head> of each web page holding the learning objects. This allows search engines to correctly categorise the learning objects. The website also provided access to the learning objects for the international evaluators.
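As an illustration of the kind of metadata involved, the following sketch (not the project's code - the online template generated the tags directly) renders Dublin Core elements as HTML meta tags of the sort pasted into the pages holding the learning objects. The element names follow the Dublin Core standard; the record values are invented for illustration.

```python
# Hypothetical sketch: rendering Dublin Core metadata as HTML <meta> tags.
# Element names follow the Dublin Core element set; values are illustrative.

DC_FIELDS = ["title", "creator", "subject", "description", "date", "format", "rights"]

def dublin_core_meta(record):
    """Render a dict of Dublin Core elements as HTML <meta> tags."""
    tags = []
    for field in DC_FIELDS:
        value = record.get(field)
        if value:
            tags.append('<meta name="DC.%s" content="%s">' % (field, value))
    return "\n".join(tags)

# Example record for one learning object (values are invented)
record = {
    "title": "Interactive Audio Mixer",
    "creator": "Thomas Cochrane",
    "format": "video/quicktime",
}
print(dublin_core_meta(record))
```

Because the tags sit in the page head as well as in an associated text document, the same record serves both search engines and human readers browsing the repository.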
Although the study is qualitative, some quantitative data and basic analysis was helpful, especially in providing a way to summarise trends and comparisons in evaluation feedback. The MERLOT evaluation process includes a numerical assignment from 1 to 5 for each evaluation question. This allows an average overall rating from 1 to 5 to be assigned to an evaluated learning object. This gives potential users of the learning objects immediate feedback as to the overall quality of the learning object. This practice was adopted by this study, to provide a quick comparative rating for each learning object. Because the MERLOT evaluation criteria questions were modified to the context of the research project, the resulting rating is called an 'equivalent MERLOT rating'. An overall rating is derived from the average of all the evaluator responses, and separate ratings are derived from each separate category of evaluators to give a comparison in ratings given for each learning object when evaluated in different learning contexts. A comparison of the equivalent MERLOT rating for each learning object from the three main evaluation groups (MAINZ Tutors, Church Sound Engineers, and MAINZ students) is then made possible.
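The derivation of an equivalent MERLOT rating can be sketched as follows. The data layout and numbers here are invented for illustration; the study averaged all Likert responses (1 to 5) overall and within each evaluator group.

```python
# Sketch of the 'equivalent MERLOT rating' calculation with invented data:
# each evaluator contributes a group label and their Likert responses (1-5).

from statistics import mean

evaluations = [
    ("MAINZ tutors", [5, 4, 4, 5]),
    ("MAINZ tutors", [4, 4, 5, 5]),
    ("Church sound engineers", [3, 4, 4, 3]),
]

def equivalent_rating(evals):
    """Overall rating: mean of every Likert response across all evaluators."""
    return round(mean(r for _, responses in evals for r in responses), 2)

def rating_by_group(evals):
    """Separate rating per evaluator group, for cross-context comparison."""
    groups = {}
    for group, responses in evals:
        groups.setdefault(group, []).extend(responses)
    return {g: round(mean(rs), 2) for g, rs in groups.items()}

print(equivalent_rating(evaluations))   # overall rating
print(rating_by_group(evaluations))     # per-group ratings
```

The per-group averages are what make the comparison between learning contexts (tutors versus church sound engineers) possible.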
QuickTime (Apple Computer, 2004; Beverly, 2004) was chosen as the architecture for authoring and delivery of the learning objects, for several reasons:
Currently there is only one authoring application that taps the full potential of interactive QuickTime development - LiveStage Pro from Totally Hip Software (2003). This required learning a new authoring application and scripting language; however, these are very similar to Flash and Director. Flash and Director were themselves unsuitable for developing the audio related learning objects, as they support only eight tracks of audio and provide limited synchronisation of tracks. Some of the learning objects developed have over twenty-four audio tracks synchronised together within QuickTime.
Using QuickTime allowed certain levels of 'real time' control and signal processing of the audio examples included in this project. Some effects within the learning objects were therefore simulated rather than applied as actual processing of the audio files. More powerful real time processing could be achieved by writing cross platform digital signal processing software, but this was beyond the expertise and time frame of this study.
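To indicate what such signal processing software would involve, here is a minimal offline sketch of one basic operation - applying a gain specified in decibels to raw audio samples. This is illustrative only; the learning objects simulated such effects within QuickTime rather than processing the audio directly.

```python
# Minimal offline DSP sketch (illustrative, not the project's approach):
# applying a decibel gain to floating-point audio samples.

def db_to_linear(db):
    """Convert a decibel value to a linear amplitude factor."""
    return 10 ** (db / 20.0)

def apply_gain(samples, gain_db):
    """Scale each sample by the linear equivalent of gain_db."""
    g = db_to_linear(gain_db)
    return [s * g for s in samples]

# A -6 dB gain roughly halves the amplitude of each sample
quieter = apply_gain([0.5, -0.25, 1.0], -6.0)
```

Real time delivery of even this simple operation, cross platform, would require buffering, sample rate handling and platform audio APIs - hence the decision to simulate within QuickTime instead.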
Content areas were chosen from the researcher's experience gained from teaching Audio Engineering - key concepts that students generally find difficult to grasp - and from feedback from a survey of potential end users. Learning objects were developed that illustrate some of the following concepts in Audio Engineering: operation of an audio mixing desk, achieving an appropriate mix, microphone choice and placement, dynamics processing, and equalisation. The learning object interface is presented as a floating window on the user's desktop (or as an element within an html page if web delivered). This presentation format was chosen rather than a full screen mode to emphasise the learning 'object' concept as a piece of the learning rather than an end in itself (see the following screenshot examples). The structure of the learning objects is designed for learner investigation, rather than forcing a linear progression through the sections of each.
Figure 1: Screenshot of the main window of learning object 1 - an interactive audio mixer
The graphical interface of the learning objects is designed to resemble the actual physical equipment that users would normally be working with. This meant a departure from the graphical designs of commercial audio software and plug-ins, which tend to 'jazz up' their interface elements, to a more 'conservative' graphical representation. In response to user feedback, the first page of each learning object opens with the overview and instructions for its use/navigation. Initially the author made the instructions available via a clickable '?' icon, but some users did not associate this with instructions and initially felt lost. Context is established by images of typical professional quality audio equipment - equipment that students actually use within their courses and in live sound environments - and the use of actual audio files recorded using this equipment. Virtual environments are created using the panoramic and cubic virtual scene capabilities of QuickTime. Text is kept to a minimum as the emphasis is upon recreating an immersive simulated environment, rather than the typical textbook approach. Pedagogy is embedded within each learning object by providing opportunities for interactive feedback and formative assessment. In the example below (Fig 2), the user selects an example audio mix (from different genres) and then must identify the microphones used in recording the mix by listening to the characteristics of the mix. The 'home' section of this learning object provides comparison audio files for the user to practice identifying the characteristic sound of each different microphone. All four learning objects can be found on the author's web site (http://ltxserver.unitec.ac.nz/~thom/).
Figure 2: Screenshot of the main window of learning object 2: An interactive microphone chooser
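The formative assessment logic described above can be sketched as follows. The answer key, mix names and feedback strings are all hypothetical; the actual learning object implements this behaviour within QuickTime rather than in a scripting language like this.

```python
# Hypothetical sketch of the formative-assessment check in learning object 2:
# the user picks which microphone they think was used on a mix, and receives
# immediate feedback. Mix names and the answer key are invented.

ANSWER_KEY = {"jazz_mix": "ribbon", "rock_mix": "dynamic"}

def check_choice(mix, chosen_mic):
    """Compare the user's microphone choice against the answer key."""
    correct = ANSWER_KEY[mix]
    if chosen_mic == correct:
        return "Correct - the %s microphone was used on this mix." % correct
    return "Not quite - listen again to the comparison files in 'home'."

print(check_choice("jazz_mix", "ribbon"))
```

Immediate feedback of this kind, paired with the comparison audio files in the 'home' section, is what turns the object from a demonstration into a practice environment.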
Learning object 1 evaluation: MERLOT equivalent rating = 3.95
Learning object 2 evaluation: MERLOT equivalent rating = 4.20
Learning object 3 evaluation: MERLOT equivalent rating = 4.09
Learning object 4 evaluation: MERLOT equivalent rating = 4.22

These results indicate that the learning objects were generally highly rated by all users.
Below are example summaries of user feedback gathered from the modified MERLOT evaluation form for the learning objects. The charts give a quick indication of a comparison between the two main contexts within which the learning objects were evaluated.
Feedback from tutors included: "It looks great! I like the interactivity and the ability to store mixes", "I could definitely see us being able to use that!", "Really good, I learnt a lot of useful information in a short space of time", and "Focused, clear intention of learning object. Useful for a wide range of students, not just Audio Engineers". Feedback indicated that more experienced users and people with teaching experience saw more potential in the learning objects than first time users or inexperienced teachers. MAINZ tutors consistently rated the learning objects higher in every category than the church sound engineers. Feedback from the group of church sound engineers was positive, but they tended to struggle with the computer interface more than the MAINZ users. A common theme that arose was to make instructions or 'help' much more obvious, as it appears several people did not discover the built in 'help' page in the first learning object.
Figure 3: Summary of responses to evaluation questions for learning object 1
Figure 4: Summary of responses to evaluation questions for learning object 2
Feedback on the functionality and interface focused upon extending the functionality, adding a wider choice of audio tracks, and 'modernising' the 'look'. Most users were satisfied with the level of interactivity available within the QuickTime architecture. Feedback from international evaluators is still being returned, but indications so far point to a very positive response to the learning objects. Overall there has been a very positive response from users to the learning objects developed so far. Some tutors began to see the possibility of teaching audio principles online. The research project continues until the end of this year, so more feedback from users will be gathered, evaluated and used to improve the next learning objects.
Figure 5: Summary of responses to evaluation questions for learning object 3
Figure 6: Summary of responses to evaluation questions for learning object 4
Bannan-Ritland, B., Dabbagh, N. & Murphy, K. (2000). Learning object systems as constructivist learning environments: Related assumptions, theories, and applications. In D. Wiley (Ed.), The instructional use of learning objects. Bloomington: Association for Educational Communications and Technology.
Bennett, K. & Metros, S. (2001). Learning object/Module Checklist. [viewed 23 Feb 2003, verified 13 Oct 2004] http://itc.utk.edu/educause2001/checklist.htm
Beverly, B. (2004). QuickTime: The on-line, cross-platform, multimedia architecture of the present and future. Wright State University. [verified 13 Oct 2004] http://www.wright.edu/ctl/media/multimedia/quicktime.html
Boyle, T. (2002). Towards a theoretical base for educational multimedia design. Journal of Interactive Media in Education, 2, 1-16.
Currier, S. & Campbell, L. M. (2002). Evaluating Learning Resources for reusability: The "DNER & learning objects" Study. In Winds of Change in the Sea of Learning: Proceedings 19th ASCILITE Conference. Auckland, New Zealand: UNITEC Institute of Technology. http://www.ascilite.org.au/conferences/auckland02/proceedings/papers/059.pdf
Douglas, I. (2001). Instructional design based on reusable learning objects: Applying lessons of object-oriented software engineering to learning systems design. Paper presented at the 31st ASEE/IEEE Frontiers in Education Conference, Reno, NV, 10-13 October.
Everest, F. A. (1997). Critical Listening and Auditory Perception: The complete audio-visual training course. On Mix Pro Audio Series [CDROM & Textbook/guide]. Emeryville: MixBooks.
Gibson, B. (2000). AudioPRO Recording Courses. [viewed 29 Jan 2003, not found 13 Oct 2004] http://www.artistpro.com/audioPROcourses.cfm
Hambly, C. (2002). Audio Courses Distance Learning Online Sound Engineering School. [viewed 3 Dec 2002, verified 13 Oct 2004] http://www.audiocourses.com/
Hanley, G. (2003). Online Resource: MERLOT: Peer-to-Peer Pedagogy. Syllabus (Campus Technology). [verified 13 Oct 2004] http://www.campus-technology.com/article.asp?id=7780
Heins, T. & Himes, F. (2002). Creating learning objects with Macromedia Flash MX. [viewed 1 Feb 2003, verified 13 Oct 2004] http://download.macromedia.com/pub/solutions/downloads/elearning/flash_mxlo.pdf
Koch, T. & Borell, M. (1997-1998). Dublin Core Metadata Template. [viewed July 2003, verified 13 Oct 2004] http://www.lub.lu.se/metadata/DC_creator.html
LTSC (2002). IEEE Learning Technology Standards Committee. [viewed 2 Mar 2003, verified 13 Oct 2004] http://ltsc.ieee.org/
McNaught, C., Burd, A., Whithear, K., Prescott, J. & Browning, G. (2002). It takes more than metadata and stories of success: Understanding barriers to reuse of computer-facilitated learning resources. In Winds of Change in the Sea of Learning: Proceedings 19th ASCILITE Conference. Auckland, New Zealand: UNITEC Institute of Technology. http://www.ascilite.org.au/conferences/auckland02/proceedings/papers/021.pdf
Mellor, D. (2001). Distance Learning for Sound Engineering and Music Recording. [viewed 12 Mar 2003, verified 13 Oct 2004] http://www.audiomasterclass.com/index.html
Mellor, D. (2003). Record-Producer.com. [viewed 12 Mar 2003, verified 13 Oct 2004] http://www.record-producer.com/
MERLOT. (1997). MERLOT: Multimedia Educational Resource for Learning and Online Teaching. [viewed 13 May 2002, verified 13 Oct 2004] http://www.merlot.org/
MERLOT (2000). Evaluation Standards for Learning Materials in MERLOT. [viewed 23 Feb 2003 at http://taste.merlot.org/eval.html, see http://taste.merlot.org/collection/peer_review/eval_criteria.htm verified 13 Oct 2004]
Neumann, G. (1996). Sound Engineering Contest 1998 [CDROM]. Berlin: Georg Neumann.
SAE Institute (2001). Reference Material Center. [viewed 13 May 2002, verified 13 Oct 2004] http://www.saecollege.de/reference_material/index.html
Sides, A. (1995). Allen Sides' microphone cabinet [CDROM]. Emeryville: Cardinal Business Media Inc.
South, J. B. & Monson, D. W. (2000). A university-wide system for creating, capturing, and delivering learning objects. In D. Wiley (Ed), The instructional use of learning objects. Bloomington: Association for Educational Communications and Technology.
Totally Hip Software (2003). LiveStage Pro. [verified 13 Oct 2004] http://www.totallyhip.com/livestage.asp
Wadsworth, Y. (1998). What is participatory action research? Action Research Resources. [viewed 3 May 2002, verified 13 Oct 2004] http://www.scu.edu.au/schools/gcm/ar/ari/p-ywadsworth98.html
Wiley, D. A. (1999). The Post-LEGO learning object. [viewed 8 Oct 2003, verified 13 Oct 2004] http://wiley.ed.usu.edu/docs/post-lego.pdf
Wiley, D. (2000). Connecting learning objects to instructional design theory: A definition, a metaphor, and a taxonomy. In D. Wiley (Ed), The instructional use of learning objects. Bloomington: Association for Educational Communications and Technology. http://www.reusability.org/read/chapters/wiley.doc
Wiley, D. (2002). Learning objects - a definition. In A. Kovalchick & K. Dawson (Eds), Educational Technology: An Encyclopedia. Santa Barbara: ABC-CLIO. [verified 13 Oct 2004] http://wiley.ed.usu.edu/docs/encyc.pdf
Williams, D. D. (2000). Evaluation of learning objects and instruction using learning objects. In D. Wiley (Ed.), The instructional use of learning objects. Bloomington: Association for Educational Communications and Technology. http://www.reusability.org/read/chapters/williams.doc
Author: Thomas Cochrane can be contacted at email@example.com
Please cite as: Cochrane, T. (2004). Interactive QuickTime: Developing and evaluating multimedia learning objects to enhance both face to face and distance e-learning environments. In R. Atkinson, C. McBeath, D. Jonas-Dwyer & R. Phillips (Eds), Beyond the comfort zone: Proceedings of the 21st ASCILITE Conference (pp. 201-211). Perth, 5-8 December. http://www.ascilite.org.au/conferences/perth04/procs/cochrane.html
© 2004 Thomas Cochrane
The author assigns to ASCILITE and educational non-profit institutions a non-exclusive licence to use this document for personal use and in courses of instruction provided that the article is used in full and this copyright statement is reproduced. The author also grants a non-exclusive licence to ASCILITE to publish this document on the ASCILITE web site (including any mirror or archival sites that may be developed) and in printed form within the ASCILITE 2004 Conference Proceedings. Any other usage is prohibited without the express permission of the author.