Closing the loop: Quality in process
University of Wollongong
In our context of producing educational multimedia resources as well as developing the skills of staff, we use end of project surveys to gain information about our performance. This has resulted in limited returns, usually positive but lacking in detail. This paper describes an additional step we have taken to improve our products and processes by closing the loop on our procedure, using client experiences to inform and modify our structure and approach. After our clients have used their creations with students, we mine their experience through targeted questions that provide prescriptions for action. The information is used internally by our management group to judge and to redesign our production processes and procedures. Issues identified include language problems between academics and creative staff, and the need for greater acknowledgment of academic staff within their peer group.
CEDIR is an educational development and support unit at the University of Wollongong. We are centrally funded, so the Flexible Learning Services (FLS) unit of CEDIR can offer free (though rationed) educational multimedia services to teaching staff. FLS consists of the manager, 3 workflow coordinators, 3 learning designers, 3 web/db programmers, 3 graphic artists, 2 video producers and 2 desktop publishers. We create learning resources with our academic clients in a six month cycle and also work with them on the design of their teaching strategies. Teaching staff are not required to use our services, so our continued existence relies on our being able to demonstrate that we are serving staff needs; we rely on client satisfaction to generate a continuing flow of new and repeat clients.
A previous ASCILITE paper (Lambert 2003) described our system and the end of project surveys that we use to gain information about our performance. Clients are asked to complete an online form indicating their satisfaction with our performance. This has resulted in limited returns, usually positive but lacking in detail. This paper describes an additional step we have taken to improve our products and processes by closing the loop on our procedure, using client experiences in more detail to inform and modify our structure and approach.
Feedback loops occur throughout our multimedia production processes, as we create prototypes and test their match to client requirements. We also modify our processes based on client satisfaction surveys. An additional procedure developed in 2003 was the collection from former clients of more detailed information about the efficiency and effectiveness of our production processes, and the match of their expectations to their experiences working with us.
The information was collected through one-to-one interviews with clients, using a limited number of questions. The interviews were audio-recorded in the client's office. The client was provided with the printed questions, which were also spoken by the interviewer to reduce ambiguity. Additional prompts were provided by the interviewer if answers were too brief to satisfy the purpose of the question. Verbatim transcripts were later emailed to clients with a request for any corrections or altered emphasis they chose to provide.
Because we were interested also in the impact of our work on teaching and learning, we chose to interview clients who had worked with us 6 months earlier, clients who had since used their new teaching resource or design with their students but who could still remember the difficulties and successes of the production process.
Questions were designed so that their answers would provide information we could act on to improve our processes (Berger 2000), rather than collecting information for statistical purposes or seeking comments on our performance. Initial questions were reworked by FLS Learning Designers, with additional questions added to allow clients free comment. The Appendix to this paper contains the final questions and their rationale.
We obtained rich responses from many clients, who were clearly seeking an avenue to express their opinions about the process and were glad of the opportunity. The results of the survey were reported only internally in FLS, informing management decision making. The results were presented to the management group (see Fig. 1) in a way that allowed rapid comparison of:
- answers to the same question across the range of clients
- answers to all questions from one client
Verbatim transcripts were available to the management group as well as summaries. Early action was taken when interviews revealed issues that required immediate responses.
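The two comparison views described above amount to pivoting the interview data in both directions. As a minimal sketch (the data model and names here are illustrative assumptions, not the actual FLS reporting tool), responses stored as (client, question, answer) records can be indexed both ways:

```python
# Hypothetical sketch of the reporting view: interview responses kept as
# (client, question, answer) records, pivoted so the management group can
# scan one question across all clients, or all questions for one client.
# Client names, question labels and answers are invented for illustration.

from collections import defaultdict

responses = [
    ("Client A", "Q1", "Word of mouth"),
    ("Client A", "Q2", "Very positive"),
    ("Client B", "Q1", "Faculty presentation"),
    ("Client B", "Q2", "Mostly positive"),
]

by_question = defaultdict(dict)  # question -> {client: answer}
by_client = defaultdict(dict)    # client -> {question: answer}

for client, question, answer in responses:
    by_question[question][client] = answer
    by_client[client][question] = answer

# One question across the range of clients:
print(by_question["Q1"])
# All questions from one client:
print(by_client["Client A"])
```

The same double indexing underlies any question-by-client grid, whether rendered as a screen, a spreadsheet or a printed summary.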
Figure 1: Reporting screen
Survey outcomes: End of 2003
Changes are continually being made to most organisations to respond to new situations. The survey process at the end of 2003 revealed that a specific change we had made to our organisation, assigning a Learning Designer to every project rather than just large projects, was a timely and effective means of improving quality and reducing waste. This change was possible because we appointed an additional staff member to be responsible for client liaison, reducing the need for Learning Designers to do this task.
Aside from the improvements in design that arose from the involvement of skilled professionals, there was an additional gain in the better understanding and reduced friction between the academic and the creative producer (graphic artist, programmer, etc). The Learning Designer here acted as an interpreter, speaking the jargon of both parties and rephrasing as necessary.
Any academic who becomes usefully involved in a development project with us needs to contribute a lot of time and mental effort to the production of a new learning resource. The interviews showed that clients felt unappreciated by their peers, especially as they had to do a lot of work and be quite organised during their projects.
In the past we have used former clients as presenters in workshops, and have also encouraged them to publish their work at conferences. We have also made presentations to Faculty groups as a means of generating new clients, presentations in which we have shown the work of former clients. While both of these strategies had benefits for the clients and for us, the interviews showed that this was giving them insufficient exposure among their peers. To an extent, it was distancing them from their peers and was counterproductive in gaining benefits for our clients in their workgroup.
We responded to this through activities that publicised the work of clients within their own Faculties, both mid-project and at project completion. This is an issue that needs ongoing attention.
In the current (mid-2004) survey we are covering a wider range of project sizes; the previous survey was limited to 9 large projects. We have retained the questions to allow comparison with 2003, now that the organisational changes of 2003 have affected all projects.
Preliminary outcomes mid-2004
CEDIR, and specifically FLS, needs to publicise itself better; staff commented that they were unaware of the depth of skill and support available before they were well involved with their project.
As we hoped, staff did develop their skills in scheduling, workload anticipation, material collection and interaction design, though they were often unaware of this until prompted to reflect by our questions.
Anticipated changes for end of 2004
- We may alter the questions to seek information about new issues that we become concerned about in our processes.
- We will recruit a new questioner. The person conducting the interviews must be part of the production system; they need to understand it well enough to ask ad-hoc follow up questions, to develop complexity or to seek further details in the information provided by the client. But the interviewer should not have such a vested interest in the system that the client feels pressured to make positive comment.
Appendix: Final questions and rationale
- How did you find out about the FSA process?
[If clients are finding out via other channels than our advertising, we might improve our advertising or use alternative channels]
- What was your overall impression of the FSA process?
[Scoping question, looking for general level of satisfaction and opening path for specific complaints]
- The project development cycle has a series of stages; (see diagram). Did you understand what was required of you at each stage as you went through the project? If not, please explain.
[If not, we need to be more informative at the start of the process and to refer to the stages/diagram during the project life. More visual measures of achievement may be required.]
- What did you set out to do?
[If the answers are all in mechanistic terms rather than in achieving learning improvements, we might need to do more work at the project entry to cast the work for them in learning terms rather than in multimedia terms.]
- Was this a new way of teaching or a substitute for something you were doing that was in some way equivalent?
[This allows us to see whether we are improving the efficiency of previously existing learning designs, or doing things that couldn't be done in f2f terms. It also prompts the client to reflect on ways that their teaching has changed, as a preamble to Q6 and Q7.]
- How did the actual process/product differ from your original intentions?
[We might expect some change as a measure of whether the client has developed their understanding during the project. This would be normal with inexperienced clients. If the client was inexperienced and still says that the product didn't differ from their intentions, then perhaps we've missed an opportunity. For experienced clients, this question measures how well we've met the brief.]
- What were the outcomes for students, and did these differ from your intentions?
[If the outcomes for students were negative, we need to investigate what went wrong with the process. If the outcomes were positive, they may be different from expectations, and so educate us about possible future applications of our work.]
- Did your own skills and/or understanding develop as a result of the FSA project? (project participation, scheduling, workload anticipation, material collection, interaction design, etc)
[Yes: we've probably involved them well in the design and development process. No: we've missed an opportunity, and have to do better if they come back.]
- Have there been any increased linkages/collaboration within your Faculty as a result of the FSA project?
[We hope there will be effects of our work beyond the individual academic, as we know that much of the effort focused on individuals is eventually wasted. If there has been no wider impact, then we need to publicise the project within the relevant Faculty.]
- Did anything go wrong with the process from your point of view? If so, how could that have been avoided?
[We want to know what went wrong, and how to avoid this in future.]
- Do you have any other suggestions for improving the process?
[We're open to suggestions from anyone to improve our processes and outcomes.]
Lambert, S. (2003). Learning design at the University of Wollongong. In G. Crisp, D. Thiele, I. Scholten, S. Barker & J. Baron (Eds), Interact, Integrate, Impact: Proceedings of the 20th Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education. Adelaide, 7-10 December.
Berger, C. (2000). Everything You Want to Know About Surveys... and Weren't Afraid to Ask.
http://carat.umich.edu/carat/files/surveyfordecision.ppt [verified 6 Oct 2004]
Please cite as: Pennell, R. (2004). Closing the loop: Quality in process. In R. Atkinson, C. McBeath, D. Jonas-Dwyer & R. Phillips (Eds), Beyond the comfort zone: Proceedings of the 21st ASCILITE Conference (pp. 777-780). Perth, 5-8 December. http://www.ascilite.org.au/conferences/perth04/procs/pennell.html
© 2004 Russ Pennell
The author assigns to ASCILITE and educational non-profit institutions a non-exclusive licence to use this document for personal use and in courses of instruction provided that the article is used in full and this copyright statement is reproduced. The author also grants a non-exclusive licence to ASCILITE to publish this document on the ASCILITE web site (including any mirror or archival sites that may be developed) and in printed form within the ASCILITE 2004 Conference Proceedings. Any other usage is prohibited without the express permission of the author.