[ ASCILITE ] [ 2004 Proceedings Contents ]

Implementing and evaluating e-learning environments

Sue Trinidad
Department of Education
Curtin University of Technology

John Pearson
Faculty of Education
The University of Hong Kong
The research reported in this paper used the Online Learning Environment Survey (OLES) as a tool to evaluate e-learning environments. Data gathered from university classes in Hong Kong using blended e-learning were used to illustrate the value of OLES in helping educators reflect on the online learning environment provided for students. Comments from interviews were used to verify the data gathered online. OLES was found to be a valuable instrument for gathering data to help educators reflect on what had worked or what might be improved in their classes. These educators were also able to see the extent to which the actual experiences of students in the module matched their preferred online learning environment.


Many factors influence the learning experience. These include the infrastructure, the quality of content and assessment, the quality of learner support systems, the assumptions made by learners and educators about the learning experience itself, and peer support networks for learners and educators (Macnish, Trinidad, Fisher & Aldridge, 2003). It has also been suggested that, given the emerging nature of e-learning [1] and the increased use of blended learning [2], there is a need for research in this field to inform teaching and learning development. As teachers begin to use technology more in their classrooms to support their pedagogical practice, they are able to be involved in community building environments designed to support blended learning. Within these communities of practice, teachers and students can work as teams and engage in reflective, collegial patterns of work. These learning environments can facilitate both cognitive and social scaffolding, enabling educators and students to become progressively more involved in the community and to sustain their commitment and interests. Such environments can support academic programmes that rely heavily on pedagogies emphasising the emergence and growth of autonomous, collaborative learning, rather than teacher directed delivery of learning materials.

Research studies describing psychosocial learning environments have examined numerous factors that influence learning in classrooms. Such research on classroom learning environments commenced over three decades ago with the work of Walberg (1979) and Moos (1974). Learning environment research has provided a useful focus in evaluations of educational innovations (Fisher, Aldridge, Fraser & Wood, 2001; Fraser & Maor, 2000; Maor & Fraser, 1996; Newby & Fisher, 1997; Teh & Fraser, 1995; Zandvliet, 2003) and more recently web based learning (Jegede, Fraser & Fisher, 1995; McLoughlin & Luca, 2003; Taylor & Maor, 2000; Trinidad, Aldridge & Fraser, 2004; Walker, 2002). Past research has found links between classroom environments and student outcomes (Fraser, 1999a, 1999b; Goh, Young & Fraser, 1995) and the effectiveness of outcomes focused and technology rich learning environments in promoting student retention, achievement, attitudes and equity (Aldridge, Fraser, Fisher, Trinidad & Wood, 2003; Trinidad, Macnish, Aldridge, Fraser & Wood, 2001). Such research has shown that students' outcomes are likely to be better when the actual learning environment more closely matches their preferred learning environment (Aldridge, Fraser, Fisher, Trinidad & Wood, 2003; Fraser, 1998b, 1999a; Fraser & Fisher, 1983).

Development of the online learning environment survey

The online learning environment survey (OLES) is a dual format instrument in which students rate the 'actual' learning environment experienced in a module (subject) against their 'preferred' learning environment, using a five point rating (Almost Never, Seldom, Sometimes, Often, Almost Always) for both actual and preferred items. The purpose of OLES is to provide educators using e-learning with a mechanism for reflecting on the learning environment provided, based on the results gained.

OLES is adapted from the What is happening in this classroom (WIHIC) learning environment instrument (Fraser, Fisher & McRobbie, 1996), which has been shown to have high reliability and validity in educational settings and has been validated in a number of different languages and contexts. Two scales are also drawn from the Distance education learning environments survey (DELES), which likewise has high reliability and validity (Jegede, Fraser & Fisher, 1995; Walker, 2002). OLES has been designed to suit the nature and characteristics of e-learning environments, with seven primary scales from the WIHIC and two primary scales from DELES comprising 54 items: Computer Usage (CU); Teacher Support (TS); Student Interaction & Collaboration (SIC); Personal Relevance (PR); Authentic Learning (AL); Student Autonomy (SA); Equity (EQU); Enjoyment (EN); and Asynchronicity (AS). Evidence of internal consistency reliability and factor structure was provided by the administration of OLES to 324 students (Trinidad, Aldridge & Fraser, 2004). To examine whether the items in a scale assess the same construct, internal consistency (Cronbach alpha) reliability was calculated: estimates ranged from 0.86 to 0.96 for the actual version and from 0.89 to 0.96 for the preferred version (see Trinidad, Aldridge & Fraser, 2004). A second form of the OLES contains parallel items designed to obtain measures from the educator teaching the class and their expressed preferences for aspects of the e-learning environment.
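The Cronbach alpha statistic referred to above is straightforward to compute directly. The sketch below is purely illustrative (the ratings shown are invented, not OLES data), assuming the standard formula alpha = k/(k-1) x (1 - sum of item variances / variance of total scores):

```python
from statistics import pvariance

def cronbach_alpha(responses):
    """Cronbach alpha for a respondents-by-items matrix of scale ratings."""
    k = len(responses[0])                      # number of items in the scale
    items = list(zip(*responses))              # transpose: one tuple per item
    item_var_sum = sum(pvariance(item) for item in items)
    total_var = pvariance([sum(row) for row in responses])
    return k / (k - 1) * (1 - item_var_sum / total_var)

# Invented five-point ratings: 4 respondents x 3 items (illustration only).
ratings = [[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2]]
alpha = cronbach_alpha(ratings)                # ~0.94 for this toy data
```

A scale whose items move together across respondents, as here, yields an alpha near 1; the 0.86 to 0.96 range reported for OLES indicates similarly consistent scales.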

The study methodology

An interpretative framework involving both quantitative and qualitative data collection was used in this study. Data were gathered using OLES (administered online) at, or soon after, the final class in each module. Qualitative data were also gathered from email interviews with students in each of the three classes (n=14), online reflective journals and online forum discussions, and were used to elaborate on the statistical data in order to provide a more empathetic understanding of the effectiveness of the learning environments. The approach is similar to other evaluations (Aldridge, Fraser, Fisher, Trinidad & Wood, 2003; Fisher, Aldridge, Fraser & Wood, 2001; Trinidad, Macnish, Aldridge, Fraser & Wood, 2001) that have investigated e-learning environments.

The case study groups

The OLES survey was administered at the University of Hong Kong to part time students in the Master of Science in Information Technology (MSc[ITE]) modules Teaching and learning with IT (n=33) and Information technology and educational leadership (n=29), and the Postgraduate Certificate in Education module Use of computers in education (n=12), all of which had made extensive use of e-learning in the form of blended learning. These classes were conducted in face to face mode in rooms with tables of laptops arranged in groups, supported by an online course room provided by the Interactive Learner Network (ILN) software.

Teaching and learning with IT (MITE6004) was structured around 'rich assessment tasks' (Albon & Trinidad, 2002) in which students completed group and individual tasks to construct their own knowledge. The module was built on the philosophy that learning does not take place in a solitary manner but in a social, active learning environment where learners are given opportunities to construct their own learning. Students were asked to form groups of two to four and participated in these groups in online and face to face activities across the 12 sessions. Two lecturers in charge taught this group of students, teaching the first and last sessions jointly and five sessions each individually: Lecturer 1 conducted three face to face and two online sessions, and Lecturer 2 conducted five face to face sessions.

In Information technology and educational leadership (MITE6003), learning activities consisted of six presentations by the lecturer in charge, with opportunities for students to discuss (in groups) specific questions about content introduced, followed by the posting of ideas/recommendations to the ILN online forum. At each weekly session there were presentations made by small groups of students about set readings, followed by opportunities for other students to comment individually on both the issues/concerns raised and the presentation itself in online forums. Students were encouraged to reflect on other issues/concerns related to leadership in ICT, which arose from time to time over the duration of the module.

In the Use of computers in education (PCED6901), students used the ILN online learning environment to access resource materials (documents, presentations) prepared by the lecturer in charge about topics and issues introduced in the module; students' presentations on set readings; and discussions of issues raised in students' presentations. During the eight sessions there was also discussion about issues associated with the use of ICT in schools.


Effect sizes (reported in Tables 1, 2 and 3) were calculated to estimate the magnitude of the differences between students' scores on the actual and preferred forms of the OLES, as recommended by Thompson (1998; 2001). A MANOVA was used to investigate whether the differences between actual and preferred scores on the nine OLES scales were statistically significant.
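The paper does not print its effect size formula, so the following is an assumption: one common choice, consistent in magnitude with the tabled values, is the standardised mean difference (preferred mean minus actual mean, divided by a pooled standard deviation). A minimal sketch using the Computer Usage figures from Table 1:

```python
def effect_size(actual_mean, preferred_mean, actual_sd, preferred_sd):
    """Standardised actual-preferred difference: (pref - actual) / pooled SD.

    Pooling by simple average of the two SDs is an assumption; the study
    cites Thompson (1998; 2001) but does not spell out its formula.
    """
    pooled_sd = (actual_sd + preferred_sd) / 2
    return (preferred_mean - actual_mean) / pooled_sd

# Computer Usage, Table 1: actual mean 3.79 (SD 0.73), preferred 3.94 (SD 0.58).
d = effect_size(3.79, 3.94, 0.73, 0.58)   # ~0.23, close to the tabled 0.228
```

A positive value indicates the preferred environment was rated above the actual one; a negative value (as for Teacher Support in Table 1) indicates the actual experience exceeded the preference.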

Table 1: Average item mean, average item standard deviation and difference (effect size and MANOVA results) between students' actual and preferred scores on the OLES using the individual as the unit of analysis for teaching and learning with IT (MITE6004)

OLES Scale                              Average Item Mean   Average Item S.D.      Difference
                                        Actual  Preferred   Actual  Preferred   Effect Size      F
Computer Usage                           3.79     3.94       0.73     0.58         0.228       0.20
Teacher Support                          3.86     3.40       0.65     1.45        -0.409       0.86
Student Interaction and Collaboration    3.70     3.72       0.99     1.42         0.016       0.00
Personal Relevance                       3.61     3.75       1.22     1.37         0.108       0.05
Authentic Learning                       3.62     3.80       1.05     0.94         0.181       0.12
Student Autonomy                         3.60     4.07       0.96     0.58         0.593       1.17
Equity                                   3.81     4.24       0.74     0.49         0.690       1.64
Enjoyment                                3.64     4.00       0.72     0.56         0.558       1.14
Asynchronicity                           3.76     4.03       0.49     0.25         0.694       1.60
*p<0.01, N=15 students

Table 1 and Figure 1 provide the OLES statistics and chart for this group of students undertaking their first MSc[ITE] degree module Teaching and learning with IT (MITE6004). None of the scales were significantly different. Both the quantitative and the qualitative data indicate that students were very satisfied with the module and the support they received, and felt that having two lecturers was very beneficial. They rated Teacher Support higher for 'actual' than 'preferred' (actual 3.86; preferred 3.40). The two lecturers were well organised and supportive and there were many positive comments such as:

Teachers provide a lot of information on what we should prepare for each class and reply to my questions promptly and clearly. They are good facilitators during my learning process as well. It is really great that they provide a lot of opportunity for us to contribute and share our knowledge in the class.
Figure 1

Figure 1: Teaching and Learning with IT (MITE6004)

As expected from previous research (Fraser, 1998; 2002), 'preferred' scores were higher than 'actual' scores for all scales except Teacher Support. Students felt they had more than adequate teacher support and opportunities for group work in a relevant and authentic learning environment. Much effort had gone into the planning and teaching of this module, as it is the first module most Masters students undertake in the program. It was therefore carefully structured to introduce the ILN online learning environment to students and to encourage group work through both online and face to face sessions. This module set the foundation for the following modules in the MSc[ITE] program, and its effectiveness was summed up by a student's comment about the e-learning experienced:

It is a convenient and effective teaching and learning method. It is amazingly good that I can have such [a] learning experience, as it can motivate me to provide such kind of way of teaching in my work place. And the design of the course is great, as it shows me what can successful e-learning be like.
The two lecturers reflected on the chart (Figure 1) and, while happy with the overall outcomes, agreed that two areas for improvement were equity and student autonomy. Within the module there had been some group work in which not all students had contributed equally, and it was decided that in the next iteration of this module the lecturers would ask students to describe the roles of group members more fully during the 12 sessions, so that problems could be dealt with early in the module.

Within this module, two of the 12 sessions were conducted totally online, and Lecturer 1 ran these sessions. The results reported in Figure 1 indicate that Lecturer 1 perceived the e-learning environment more positively than the students did, and Lecturer 1 scored higher than Lecturer 2 on all of the OLES scales, including the enjoyment of online learning. Lecturer 1 felt it was important to give students the opportunity to expand their ideas of how to work in both face to face and online environments. Lecturer 1 was passionate about providing experiences that were personally relevant to the students through authentic learning, strongly believing in modeling principles to students to help them broaden their ideas about e-learning, and that learning in any environment must equate to quality experiences that help students learn better. Lecturer 2 was somewhat skeptical of the benefits of learning online in Hong Kong, as indicated in the comment that "the Hong Kong situation is one where students equate face to face education with quality education, and therefore this mind set informs or affects the use of online environments". Lecturer 1 and Lecturer 2 had different teaching styles and philosophies of learning, and these are highlighted in Figure 1, where Lecturer 1 rated many items 'often' and 'almost always' whereas Lecturer 2 rated many items 'sometimes' and 'often'. Both lecturers valued the opportunity to reflect on the conduct and the outcomes of their module based on the OLES chart.

The OLES table (Table 2) and chart (Figure 2) for the second group of MSc[ITE] students, taking Information technology and educational leadership (MITE6003), indicate that: (a) the 'actual' and 'preferred' scores for most scales are generally high; (b) 'actual' experiences scored lower than 'preferred' experiences (except on the Enjoyment scale); and (c) differences in scores are quite small, except for the Teacher Support, Personal Relevance and Authentic Learning scales, where the differences between 'actual' and 'preferred' scores are wider. The difference for Authentic Learning (AL) was statistically significant (p<0.01).

The lecturer reflected on the chart (Figure 2) and felt that the online forums for this module could be reviewed to ensure they involve more 'genuine' activities, particularly in view of the diverse backgrounds of students enrolled in the module. This may involve planning discussions over longer periods of time to make better use of the permanently stored messages, rather than simply using the forum as a place to record the outcomes of group discussions in face to face classes. It may also involve designing discussion topics for students employed in educational settings other than government primary and secondary schools to ensure that the outcomes of discussions have 'personal relevance' to participants.

Table 2: Average item mean, average item standard deviation and difference (effect size and MANOVA results) between students' actual and preferred scores on the OLES using the individual as the unit of analysis information technology and educational leadership (MITE6003)
OLES Scale                              Average Item Mean   Average Item S.D.      Difference
                                        Actual  Preferred   Actual  Preferred   Effect Size      F
Computer Usage
Teacher Support                          3.82     4.28       0.63     0.64         0.724       4.06
Student Interaction and Collaboration    4.06     4.41       0.57     0.59         0.603       2.80
Personal Relevance                       3.68     4.28       0.85     0.93         0.673       3.43
Authentic Learning                       3.77     4.44       0.60     0.57         1.145       9.80*
Student Autonomy                         4.25     4.61       0.64     0.50         0.627       2.93
Enjoyment                                3.61     3.53       0.85     1.15         0.079       0.04
Asynchronicity                           3.90     4.21       0.88     0.78         0.373       1.06
*p<0.01, N=15 students

The chart (Figure 2) for the lecturer in charge of the module, and the chart of students' 'preferred' scores are indicative of strong preferences for working online. As one student noted:

By posting interesting responses to forum topics we can actively build up knowledge on technology leadership such as the discussions about the Way Forward consultation document.
Figure 2

Figure 2: Information technology and educational leadership (MITE6003)

Another student stressed the collaborative aspect of online work:

We were often given a task and the group worked together to complete the task. We usually divided the labour so that each of us looked at a specific area. Then one member of the group summarised the information and presented it online. I find this interaction stimulates each student to think and eliminates non-participants [evident] in class.
By highlighting differences in 'actual' and 'preferred' scores, an OLES chart helped with the identification of aspects of a module that should be reviewed with a view to enhancing the online learning environment for students.
Table 3: Average item mean, average item standard deviation and difference (effect size and MANOVA results) between students' actual and preferred scores on the OLES using the individual as the unit of analysis for use of computers in education (PCED6901)
OLES Scale                              Average Item Mean   Average Item S.D.      Difference
                                        Actual  Preferred   Actual  Preferred   Effect Size      F
Computer Usage
Teacher Support                          3.36     3.56       1.25     1.37         0.153       0.54
Student Interaction and Collaboration
Personal Relevance
Authentic Learning
Student Autonomy                         3.32     3.40       1.39     1.60         0.053       0.07
Equity                                   3.40     3.39       1.44     1.58        -0.010       0.00
Enjoyment                                2.67     2.64       1.05     1.13        -0.028       0.03
Asynchronicity                           3.22     3.32       1.36     1.54         0.069       0.11
*p<0.01, N=15 students

The OLES table (Table 3) and the chart (Figure 3) for the group of students (all secondary teachers) taking Use of computers in education (PCED6901) reveal lower scores on the 'actual' and 'preferred' scales of the online learning environment than those of the MSc[ITE] students (Figures 1 and 2). The reasons are not known, but may reflect differences in interest in, and experience with, online learning environments, as none of the students enrolled in this module had previously used an online environment to access resources or participate in online discussions. A relatively close match is evident between 'actual' and 'preferred' scales, although Teacher Support could be a focus when the online learning environment is reviewed.

Figure 3

Figure 3: Use of computers in education (PCED6901)

There is a marked difference between the preferences of the lecturer in charge and the students enrolled in this module for working in an online environment. On reflection, the lecturer felt this probably resulted from different levels of experience in working online. This is a reminder that activities should be planned which build students' confidence and skills, particularly in the initial stages of working online (Pearson & Selinger, 1999), and that ongoing guidance and support are often necessary to maintain high levels of participation in online discussions.


This paper has reported on an evaluation of three e-learning modules using a web based instrument (OLES) to gather data on students' and lecturers' 'preferred' and 'actual' online learning experiences and present these graphically in the form of charts. Similarities and differences in students' 'actual' and 'preferred' scores on the OLES in the three modules were identified, and indications given of the aspects of the online environments that could be considered when the modules are revised. Two of the three lecturers perceived the e-learning environment more positively than their students did. This is similar to findings from other research comparing teachers' and students' perceptions of the learning environment (Fraser, 1998; Trinidad, Aldridge & Fraser, 2004).

When comparing the three modules, the lecturers, who were all Western, agreed that English was an issue for some of the Chinese students, for whom English can be considered a "foreign language" rather than a second language because it is not used all the time; often English is spoken only during classes, as the university is an English medium university. Providing quality English e-learning environments allows these students to revisit materials to further their understanding, while working in groups allows them to reinforce their knowledge in their native language (Cantonese), giving adequate cognitive processing time to move back and forth between the two languages.

Educators need not only knowledge of learning theories and models of best practice to design and implement e-learning environments, but also information (feedback) on how specific attempts to do so have matched the preferred learning environment of students. The evaluation of examples of e-learning reported in this paper indicates that the charting of data using OLES provides a practical strategy by which information can be presented. These charts can then be used to inform discussions about changes to the design of actual e-learning environments so that these match the preferred learning environment of students and lead to improved learning outcomes. After using OLES in the evaluation of these three modules, the lecturers involved concluded that OLES provided valuable data about online components of the learning environment. Further research will be undertaken to assess whether OLES is also valuable to a wider range of teachers working in e-learning environments.


  1. Electronic learning or e-learning, as defined by Jackson (2002) Defining eLearning - Different Shades of "Online", can be technology-enhanced learning and/or technology-delivered learning. Both dimensions describe e-learning for the purpose of this paper.
  2. Blended Learning is learning which combines online and face-to-face approaches (NSW Department of Education and Training, 2002).


Albon, R. & Trinidad, S. (2002). Building learning communities through technology. In K. Appleton, C. Macpherson, & D. Orr (Eds.), International Lifelong Learning Conference: Refereed papers from the 2nd International Lifelong Learning Conference (pp. 50-56). Yeppoon, Central Queensland, Australia: University of Central Queensland.

Aldridge, J., Fraser, B., Fisher, D., Trinidad, S. & Wood, D. (2003). Monitoring the success of an outcomes-based, technology-rich learning environment. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.

Ellis, A. & Phelps, R. (2000). Staff development for online delivery: A collaborative team based action learning model. Australian Journal of Educational Technology, 16, 26-44.

Fisher, D., Aldridge, J., Fraser, B. & Wood, D. (2001). Development, validation and use of a questionnaire to assess students' perceptions of outcomes-focused, technology-rich learning environments. Proceedings AARE Conference, Perth.

Fraser, B. J. (1981). Tests of Science-Related Attitudes (TOSRA). Melbourne: Australian Council for Educational Research.

Fraser, B. J. (1994). Research on classroom and school climate. In D. Gabel (Ed.), Handbook of research on science teaching and learning (pp. 493-541). New York: Macmillan.

Fraser, B. J. (1998a). Science learning environments: Assessment, effects and determinants. In B. Fraser & K. Tobin (Eds.), International handbook of science education (pp. 527-564). Dordrecht, The Netherlands: Kluwer.

Fraser, B. J. (1998b). Classroom environment instruments: Development, validity and applications. Learning Environment Research: An International Journal, 1, 7-33.

Fraser, B. J. (1999a). Using learning environment assessments to improve classroom and school climates. In H. J. Freiberg (Ed.), School climate: Measuring, improving and sustaining healthy learning environments (pp. 65-83). London: Falmer Press.

Fraser, B. J. (1999b). "Grain sizes" in learning environment research: Combining qualitative and quantitative methods. In H. C. Waxman & H. J. Walberg (Eds.), New directions for teaching practice and research, (pp. 285-296). Berkeley, CA: McCutchan.

Fraser, B. J. (2002). Learning environment research: Yesterday, today and tomorrow. In S. C. Goh & M. S Khine (Eds.), Studies in educational learning environments: An international perspective (pp. 1-26). Singapore: World Scientific.

Fraser, B. J. & Fisher, D. (1983). Student achievement as a function of person-environment fit: A regression surface analysis. British Journal of Educational Psychology, 53, 89-99.

Fraser, B., Fisher, D. & McRobbie, C. (1996). Development, validation and use of personal and class forms of a new classroom environment instrument. Paper presented at the annual meeting of the American Educational Research Association, New York.

Fraser, B. J. & Maor, D. (2000). A learning environment instrument for evaluating students' and teachers' perceptions of constructivist multimedia learning environments. Paper presented at the annual meeting of the National Association for Research in Science Teaching, New Orleans.

Godfrey, C. (2001). Computer technologies: Scaffolding tools for teaching and learning. Australian Educational Computing, 16(2), 27-29.

Goh, S., Young, D. & Fraser, B. J. (1995). Psychosocial climate and student outcomes in elementary mathematics classrooms: A multilevel analysis. The Journal of Experimental Education, 43, 90-93.

Jegede, O., Fraser, B. & Fisher, D. (1995). The development and validation of a distance and open learning environment scale. Educational Technology Research and Development, 43, 90-93.

Maor, D. & Fraser, B. J. (1996). Use of classroom environment perceptions in evaluating inquiry-based computer assisted learning. International Journal of Science Education, 18, 401-421.

McLoughlin, C. & Luca, J. (2003). Overcoming "process-blindness" in the design of an online environment: Balancing cognitive and psycho-social outcomes. In G. Crisp, D. Thiele, I. Scholten, S. Barker and J. Baron (Eds), Interact, Integrate, Impact: Proceedings 20th ASCILITE Conference, (pp. 332-342). Adelaide, 7-10 Dec.

Moos, R. H. (1974). The Social Climate Scales: An overview. Palo Alto, CA: Consulting Psychologists Press.

Macnish, J., Trinidad, S., Fisher, D. & Aldridge, J. (2003). The online learning environment of a technology-rich secondary college. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.

Newby, M., & Fisher, D. (1997). An instrument for assessing the learning environment of a computer laboratory. Journal of Educational Computing Research, 16, 179-190.

NSW Department of Education and Training (2002). Learning Technologies: Blended Learning. [viewed 30 Apr 2004]

Pearson, J. & Selinger, M. (1999). Linking different types of knowledge in professional education and training: The potential of electronic communication. In M. Selinger & J. Pearson (Eds.), Telematics in Education: Trends and Issues (pp. 15-31). Oxford: Elsevier.

Thompson, B. (1998). Review of 'what if there were no significance tests?' Educational and Psychological Measurement, 58, 334-346.

Thompson, B. (2001). Significance, effect sizes, stepwise methods and other issues: Strong arguments move the field. Journal of Experimental Education, 7, 80-93.

Taylor, P. & Maor, D. (2000). Assessing the efficacy of online teaching with the Constructivist On-Line Learning Environment Survey. In A. Herrmann and M.M. Kulski (Eds.), Flexible Futures in Tertiary Teaching. Proceedings of the 9th Annual Teaching Learning Forum, 2-4 February 2000. Perth: Curtin University of Technology.

Teh, G. P. L. & Fraser, B. J. (1995). Development and validation of an instrument for assessing the psychosocial environment of computer-assisted learning classrooms. Journal of Educational Computing Research, 12, 177-193.

Trinidad, S., Macnish, J., Aldridge, J., Fraser, B., & Wood, D. (2001). Integrating ICT into the learning environment at Sevenoaks Senior College: How teachers and students use educational technology in teaching and learning. Proceedings AARE Conference, Perth.

Trinidad, S., Aldridge, J. & Fraser, B. (2004). Development and use of an online learning environment survey. Paper presented to the Annual Meeting of the American Educational Research Association AERA 2004, San Diego, CA.

Walberg, H. J. (1979). Educational environments and effects: Evaluation, policy and productivity. Berkeley, CA: McCutchan.

Walker, S. (2002). Insight: Distance education learning environments survey. [viewed 10 Jan 2004]

Zandvliet, D. (2003). Learning environments in Malaysian "Smart School" classrooms. Paper presented at the annual meeting of the American Educational Research Association, Chicago.

Authors: Sue Trinidad can be contacted on
John Pearson can be contacted on

Please cite as: Trinidad, S. & Pearson, J. (2004). Implementing and evaluating e-learning environments. In R. Atkinson, C. McBeath, D. Jonas-Dwyer & R. Phillips (Eds), Beyond the comfort zone: Proceedings of the 21st ASCILITE Conference (pp. 895-903). Perth, 5-8 December.

© 2004 Sue Trinidad and John Pearson
The authors assign to ASCILITE and educational non-profit institutions a non-exclusive licence to use this document for personal use and in courses of instruction provided that the article is used in full and this copyright statement is reproduced. The authors also grant a non-exclusive licence to ASCILITE to publish this document on the ASCILITE web site (including any mirror or archival sites that may be developed) and in printed form within the ASCILITE 2004 Conference Proceedings. Any other usage is prohibited without the express permission of the authors.
