
Facilitating Clinical Reasoning Through Computer Based Learning and Feedback

Deborah A. Bryce*, Nicholas J.C. King*, Richmond W. Jeremy†, and J. Hurley Myers¶
debryc@anatomy.usyd.edu.au, nickk@med.usyd.edu.au

*The Department of Pathology and †The Department of Medicine, The University of Sydney, N.S.W. 2006, Australia
¶Department of Physiology, School of Medicine, Southern Illinois University at Carbondale, Carbondale, IL 62901, USA

 

Abstract

Literature on the evaluation of educational technology, its design and application, stresses the importance of targeting the needs of learners and avoiding technology-driven approaches. Appropriate use of technology can provide teaching and learning opportunities not previously possible. The following paper describes the adaptation of a diagnostic reasoning program (DxR) to an educational setting, creating new learning opportunities for students and providing teachers with a powerful tool for assessing student progress.

The DxR program can be used to facilitate the clinical reasoning processes of students and give teachers insight into the effectiveness of the clinical reasoning skills of individual students. When used with feedback from tutors in the later years of clinical training, it can help students refine their clinical reasoning skills, allowing them to identify strengths and weaknesses in their reasoning approach and knowledge base, and giving them greater confidence through the process of practice and feedback. The DxR program also creates opportunities for research into how the clinical reasoning process of students can be fostered.

Introduction

Clinical diagnostic problems are examples of ill-structured problems, as characterised by Barrows and Feltovich (1987).

Facilitation of the clinical reasoning process for medical students therefore presents an educational challenge. How can students be assisted in developing their knowledge base in ill-structured domains, and how can the application of this knowledge base be facilitated?

In traditional medical curricula, students are given a vast number of facts but little instruction, and limited opportunity, for developing strategies to link that information. Research into the clinical reasoning process (for a review, see Elstein, 1995) indicates that it is not the amount of knowledge that differentiates an expert from a novice, but the richness of the connections between items of information. Rich interconnections make information more useable (e.g., recognition of situations in which it can be applied) and more readily accessible (e.g., increased speed of access). The educational challenge is therefore to support students in making effective connections between items of clinical information. To enhance this process, opportunities need to be created for students to practise these skills and receive feedback on them.

One such opportunity is problem-based learning. In this method, students work together as a group while a 'facilitator' guides the process. The advantages are that students support each other to explore a diagnostic problem and generate solutions. One potential disadvantage is that students are dependent on the group for the unfolding of the problem and the generation of solutions. Given the critical importance of developing effective and independent diagnostic reasoning strategies, it may be argued that students in the later years of their clinical training need opportunities for more independent reasoning, separate from a group environment.

The DxR program can provide the opportunity for students to reason independently through a diagnostic problem. The program presents a patient problem, provides tools for gathering information, and encourages the user to generate and refine hypotheses and complete the investigation in an efficient manner. It also provides feedback on performance at the end of each case (King, Bryce, Graebner, Evans, & Myers, 1996). It therefore provides practice and feedback for individual students on their diagnostic reasoning skills.

Initial studies with the program (Bryce, King, Graebner, & Myers, 1996), however, identified some dissatisfaction with the computer feedback; students wanted more feedback than the computer provided (e.g., lists of differential diagnoses an expert would generate). This could have been incorporated into the computer feedback; however, because of the critical part that feedback plays in the learning process, it was decided to trial feedback sessions led by a clinician.

This approach was adopted in order to provide the richer feedback that students were seeking.

This paper describes the trial of clinician-led feedback sessions on computerised patient cases with students in Year Four of a traditional medical curriculum at the University of Sydney.

Method

Nine DxR cases were selected by a clinician to cover the range of diagnostic problems encountered in Fourth Year medicine. Where necessary, the case content and evaluation parameters were modified. Cases were sequenced so that students had developed some background knowledge of each area before working through the corresponding computer case.

Training was given in how to use the program, and students worked in groups through an initial practice case. For the next four cases students were asked to work in a group of their own choosing, and for the last four cases on their own. Students at all teaching hospitals were given the opportunity to work through the nine DxR cases. At one of the teaching hospitals, each case was followed by a feedback session with a clinician. Student participation was voluntary.

Data were gathered by a number of methods:

  1. Focus groups at three of the four main teaching hospitals evaluated students' needs for development of their reasoning skills, as well as their confidence levels in clinical reasoning. These were carried out shortly after students' first encounter with DxR.
  2. An observer sat in on, and at times audiotaped, the feedback sessions.
  3. On completion of the feedback sessions, a focus group was held with the students who had participated, to learn what worked and what didn't.
  4. The clinician who conducted the feedback sessions was also interviewed at the end of the process, to learn what worked, what didn't, and what insights the sessions revealed.

At each feedback session, students received a 'Student Report' which summarised their plan of investigation and included: their statement of the problem; a list of their diagnostic hypotheses (with when each was generated and when deleted); laboratory investigations; and items missed in the investigation. The clinician received a summary of the patient data (findings from the history, physical examination and laboratory tests) and a print-out of the students' 'trail of logic'.
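
For readers interested in how such activity records might be represented, the sketch below (in Python) shows one hypothetical encoding of a 'Student Report' and its 'trail of logic'. It is a minimal illustration only: the field names, types and example data are assumptions for this paper and do not reflect DxR's actual internal format.

    from dataclasses import dataclass, field
    from typing import List, Optional

    # Hypothetical record types for a student 'trail of logic'.
    # Names and fields are illustrative assumptions, not DxR's format.

    @dataclass
    class Hypothesis:
        name: str
        generated_at: int          # enquiry step at which it was raised
        deleted_at: Optional[int]  # step at which it was eliminated, if ever

    @dataclass
    class StudentReport:
        problem_statement: str
        hypotheses: List[Hypothesis] = field(default_factory=list)
        investigations: List[str] = field(default_factory=list)  # tests ordered, in sequence
        items_missed: List[str] = field(default_factory=list)    # expected items not covered

        def active_hypotheses(self) -> List[str]:
            """Hypotheses still under consideration at the end of the case."""
            return [h.name for h in self.hypotheses if h.deleted_at is None]

    # Example: the kind of report a tutor might review in a feedback session.
    report = StudentReport(
        problem_statement="Acute shortness of breath with pleuritic chest pain",
        hypotheses=[
            Hypothesis("pneumonia", generated_at=1, deleted_at=4),
            Hypothesis("pulmonary embolus", generated_at=2, deleted_at=None),
        ],
        investigations=["chest X-ray", "arterial blood gases", "V/Q scan"],
        items_missed=["calf examination"],
    )
    print(report.active_hypotheses())  # ['pulmonary embolus']

A record of this kind is sufficient to reproduce the items the clinician reviewed in the sessions: which hypotheses were raised and when they were discarded, and the sequence of investigations ordered.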

During feedback sessions the clinician sequentially reviewed the students' diagnostic hypotheses and their enquiry process (history, physical examination and laboratory investigations), discussing the prioritisation and elimination of hypotheses, the significance of findings, and the sequence and relative value of laboratory investigations.

Results

Students' perceptions of their clinical reasoning skills

Students in focus groups at three teaching hospitals identified a number of problems with their clinical reasoning skills.

Students' perceptions of feedback sessions

What worked

The students felt the range of cases was comprehensive and made them consider a number of important differential diagnoses. Some students would have liked more cases.

Although participation rates were low, the students attending feedback sessions found them valuable.

Students appreciated the stepwise feedback on how an experienced clinician approached a diagnostic problem. It provided opportunities to clarify misunderstandings and differentiate clinical findings, and was particularly useful for understanding what tests to order, what not to order, and why. Students' confidence was boosted by learning what they had done right; in cases where they had not performed well, they expressed appreciation at learning this from the feedback session. One student, who only joined in the later sessions, expressed regret at not attending earlier.

"I would just get enlightened by the feedback sessions", "I think lots of people didn't come along to feedback sessions because they didn't realise how useful they were."

One significant outcome was a student's altered approach to history taking in a real patient encounter in the hospital, following the DxR pulmonary embolus case: "and as I was taking the history I just, it suddenly clicked, that this sounds very similar to the case and then I started to gear my questions toward pulmonary embolism". This is one example of a change from a typical stereotyped approach to history taking to a more hypothesis-driven one (the desired outcome).

What didn't work

One student felt a little 'exposed' in the feedback sessions because of the small numbers present, but nevertheless attended every session and asked a large number of questions. Another student did not attend the feedback sessions that followed cases done on his own, for fear of being singled out and asked to explain his line of thought. Generally, however, those taking part were disappointed that more students didn't take up the opportunity, and felt the cases and feedback sessions should have been made compulsory. This opinion reflected several factors: students wished to discuss the case with more of their peers, to learn how others approached the problem (during the 'trail of logic' review), and they felt a larger group would make the feedback sessions more interactive.

Some students became frustrated when the computer did not recognise their misspelled diagnoses. This mainly troubled students who did not attend feedback sessions; those who attended were less dependent on the computer evaluation. Even so, the computer evaluation created other problems for students attending feedback sessions: they felt the computer print-out of their performance (handed out in the feedback session) did not reflect their own perceptions of how well they had worked through the case, and expressed little confidence in it as an indicator of their performance. Review of the computer evaluation by a clinician was felt to be the most desirable form of evaluation and feedback.

Clinician's perceptions of feedback sessions

What worked

The feedback sessions were "a real insight into the way the group were thinking". Reviewing students' 'trails of logic' confirmed suspicions, based on previous experience with students in tutorials, about the ways in which students were approaching diagnostic problems.

The feedback sessions confirmed the need for students to have more practice and feedback at linking their knowledge by applying it to clinical problems.

What didn't work

The attendance rate of students was the most disappointing part of the pilot study. "I think they're missing out enormously".

Discussion and Future Applications

One of the significant features of the pilot study was the low participation rate. Potential reasons include lack of interest (cases were voluntary and not assessable) and fear of having one's thought processes exposed to scrutiny. A pilot study of DxR in 1996 allocated students to specific groups based on subgroups of tutorial groups; however, this raised problems of students working with group members they were not happy with. To counteract the problem, students this year were asked to work in groups of their own choice. Perhaps this less structured approach to allocating students to groups also affected initial participation. The presence of an observer at the feedback sessions may also have been a contributing factor.

The feedback sessions were found to be of great value to both the students who attended and the clinician. Both the students (including the one who had avoided the later sessions for fear of being singled out) and the clinician agreed on the sessions' value.

The clinician felt that cases should be introduced "early on when it serves as an illustration of how things should be done, half way through ... the year when they can see how they're going and what progress they've made, and a couple of occasions toward the end of the year, when they ought to be performing satisfactorily." Students at feedback sessions also felt that starting earlier in the year was preferable (the study began at the end of semester 1).

Based on the findings of the 1996 and 1997 studies, it is the evaluators' view that the cases should be worked through at first in groups (students were of the opinion that 2-3 cases were enough to become familiar with the program) and later individually. Working in groups (after an initial training session) allows students to support less confident colleagues, while working alone on subsequent cases lets students begin to identify gaps in their own knowledge base that need remediation, rather than relying on others in the group to think for them: "it's easier to agree than to come up with your own differential diagnosis". To encourage greater discussion prior to feedback sessions, and to avoid the problem of the computer not recognising misspelled diagnoses, it is the evaluators' view that the computer evaluation be inactivated (switched off) so that students do not receive feedback until the feedback session.

Several suggestions for the application of the program follow from these findings.

The aim of the feedback sessions would need to be clarified for tutors and students. Ideally, the sessions would strike a balance between clarifying uncertainties, stimulating students' reasoning, and modelling the reasoning processes of an experienced clinician.

DxR cases followed by small-group feedback sessions offer individual practice at working through a diagnostic problem, insight for teachers into students' reasoning, and modelling of an expert's approach. These advantages were not previously available to students and teachers in the Fourth Year learning environment. In addition, the proposed method of implementation arising from the findings of the 1997 study fills a number of current needs: the desire for more feedback; increased opportunity for applying knowledge to solving a diagnostic problem (as opposed to repeating rote-learned lists); and opportunities for students to develop autonomy and confidence in their clinical reasoning skills through practice and feedback.

In addition, the DxR program opens up possibilities for research on clinical reasoning processes. For example, student activity records ('trails of logic') can be used to compare students' performance in an unstructured, voluntary setting (1997) with that in a more structured, guided setting (1998). Do the 'trails of logic' of students in the second group show improved performance in diagnostic reasoning? What correlation exists between diagnostic skills as measured by DxR and diagnostic skills as measured by other methods? Can DxR be used as a tool to detect students with difficulty linking clinical information and thereby provide an early avenue for remedial intervention?
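
As a purely illustrative sketch of the kind of comparison envisaged, the snippet below computes a simple summary measure over hypothetical trail data for two cohorts. The measure (the proportion of ordered investigations judged relevant by an expert) and the placeholder figures are assumptions for illustration only, not DxR output or study data.

    from statistics import mean

    # Each tuple summarises one student's trail as
    # (investigations ordered, investigations judged relevant).
    # Figures are placeholders, not study data.
    trails_voluntary = [(12, 6), (15, 7), (10, 6)]
    trails_guided = [(9, 7), (11, 8), (8, 6)]

    def efficiency(trails):
        """Mean proportion of ordered investigations judged relevant."""
        return mean(relevant / ordered for ordered, relevant in trails)

    print(f"voluntary setting: {efficiency(trails_voluntary):.2f}")
    print(f"guided setting:    {efficiency(trails_guided):.2f}")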

Conclusions

Despite accumulating a vast number of facts in their prior education, Fourth Year students lack confidence in applying these facts to a clinical problem. This is primarily due to lack of practice at linking fragmented facts, and to insufficient feedback both on their own reasoning skills and on the reasoning processes of experienced clinicians. The 1997 study of the DxR program, combined with teacher-led feedback sessions, confirmed the difficulty students had in applying their knowledge, and the value of DxR as a tool to help facilitate their clinical reasoning skills.

The proposed application of DxR to the Fourth Year learning environment provides opportunities, not previously possible, for students to develop higher-order skills through practice, feedback and reflection. The program gives students an opportunity to apply their knowledge and create links between their fragmented facts and rote-learned lists. By having access to the 'trail of logic' of individual students in the later years of clinical training, teachers can gain direct insight into the reasoning processes of these students and can facilitate students' clinical reasoning by providing individualised feedback and modelling how an expert would reason through a case. This process allows students to identify their own strengths and weaknesses, and provides teachers with a tool to identify students with consistent difficulties in reasoning.

References

Barrows, H.S., & Feltovich, P.J. (1987). The clinical reasoning process. Medical Education, 21, 86-91.

Bryce, D.A., King, N.J.C., Graebner, C.F., & Myers, J.H. (1996). Evaluation of a medical diagnostic reasoning program. In A. Christie, P. James, & B. Vaughan (Eds.), Making New Connections: Proceedings of the Thirteenth Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education (pp. 566-568).

Elstein, A.S. (1995). Clinical reasoning in medicine. In J. Higgs & M. Jones (Eds.), Clinical Reasoning in the Health Professions (pp. 49-59). Oxford: Butterworth-Heinemann.

King, N.J.C., Bryce, D.A., Graebner, C.F., Evans, M., & Myers, J.H. (1996). Wanda Holmes: Developing diagnostic reasoning skills through a computer-based case series. CD-ROM, AUC Conference, From Virtual to Reality, The University of Queensland.

 

(c) Deborah A. Bryce, Nicholas J.C. King, Richmond W. Jeremy, and J. Hurley Myers

 

The author(s) assign to ASCILITE and educational and non-profit institutions a non-exclusive licence to use this document for personal use and in courses of instruction provided that the article is used in full and this copyright statement is reproduced. The author(s) also grant a non-exclusive licence to ASCILITE to publish this document in full on the World Wide Web and on CD-ROM and in printed form with the ASCILITE 97 conference papers, and for the documents to be published on mirrors on the World Wide Web. Any other usage is prohibited without the express permission of the authors.

 

