2010 – 2013 for Institut für Medizinische Lehre together with Sebastian Hunkeler, Stephan Schallenberger, Felix Schmitz, Philippe Zimmermann, Markus Stolze

The challenge was to build an eAssessment system for medical examinations that delivers more accurate results than the current paper checklists do.

Universities in Switzerland and around the world use an examination format called objective structured clinical examinations (OSCE) to assess the clinical skills of their medical students. Trained doctors act as examiners and rate the students' performances. This process is slow, expensive and imprecise because of the paper checklists used: nearly every checklist contains missing or unreadable markings.

 
 
Old: paper checklists with lots of corrections and missing markings.


New: examiner using an iPad to assess the performance of a student.


 
 

Main Challenge #1:

How do you design for people who have never used a touchscreen before and are expected to use it after just five minutes of training? Especially when these high-stakes examinations take place only every six months and doctors are expensive to recruit for user testing.

Main Challenge #2:

Introducing technology to a place where there was none before poses new questions for those who have to prepare and administer the exams and make sure everything is set up correctly. How do we make sure the right examiner gets the right tablet? What happens if they drop it?

 
 
 
 

The solution is an eAssessment system consisting of an iPad app for examiners, a Mac app for exam administrators and an iPhone app for exam supervisors.

OSCE-Eval is an iPad application that makes assessing the practical skills of medical students easier, fairer and more precise. It has eliminated paper checklists from OSCE examinations at the University of Berne and the University of Basel. Further studies published at USAB 2011 have shown that examiners prefer using the digital checklists and report a lower cognitive load during the examination.

 

How to begin assessing a candidate and how to change previous assessments.

How to change from one candidate's checklist to another's.

How to keep track of the steps a candidate has performed.

How to finalise an examination after all checklists have been filled in.

 
One way to deal with the issue of configuring iPads and iPhones was a solution that relied on scanning a QR code to automatically set up a device for an examiner. This also made it easy to swap in a replacement iPad should an examiner drop one.

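The provisioning idea can be sketched as follows. Note that the payload format, field names and the use of Python here are illustrative assumptions for this sketch, not the actual eOSCE implementation (which is a native iOS app):

```python
import json

# Hypothetical payload format -- the real eOSCE QR scheme is not documented here.
def make_provisioning_payload(examiner_id, station, server_url):
    """Serialise the device configuration that a QR code would carry."""
    return json.dumps({
        "examiner": examiner_id,
        "station": station,
        "server": server_url,
    })

def configure_device(payload):
    """Parse a scanned payload and return the settings to apply on the device."""
    config = json.loads(payload)
    # A replacement device only needs to scan the same code to take over.
    return config["examiner"], config["station"], config["server"]

payload = make_provisioning_payload("E-042", "Station 3", "https://exam.example.org")
print(configure_device(payload))
```

Because the code carries everything needed to configure a device, handing an examiner a fresh tablet is as fast as scanning once more.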

 
 
 
 

The impact of the project is that universities on two continents now use eOSCE for their clinical skills examinations. And more are interested.

The development of eOSCE continues at the Institut für Medizinische Lehre in Berne. The project has attracted the interest of several high-profile universities around the world (e.g. in Australia). eOSCE is now regularly used in Swiss medical clinical skills examinations. Examiners love the new examination system and report being less stressed than before. The iPad app OSCE-Eval (App Store) and the iPhone app OSCE-Track (App Store) are both available for free on the Apple App Store for you to try out.

 
 
 
 

The process was iterative, with a test of the prototype at the end of each cycle. The findings of these tests were crucial to the project's final success.

Research
Our user research revealed that we had to focus on examiners over 40 with little to no experience with touchscreens. Furthermore, the examiners reported being under significant stress whilst assessing students. Contextual inquiries and field studies showed that OSCE examinations are fraught with obstacles: nervous students wandering into the wrong stations, examiners who do not bother showing up in time or are entirely unprepared. All findings implied that our solution needed to be highly flexible and error-tolerant to have the potential of improving OSCE exams. It also became apparent that to improve the situation, the checklists would have to provide more immediate feedback. Herein lies the potential of digital checklists. But could we design a digital checklist that is at least as convenient and hassle-free as paper?

Hardware Evaluation
Digital checklists require electronic devices that are mobile, quiet, light, durable and have the stamina to last a whole day. After a lengthy hardware study and after creating dozens of paper prototypes in varying degrees of fidelity, I moved to building both an interactive Windows 7 and an iPad prototype. Both prototypes scored really well in usability tests, but the Windows 7 tablets were not up to the task: they were too heavy and expensive, and examiners reported discomfort because the devices ran too hot.

Displaying Digital Checklists
Above all, medical examiners need to assess their students easily, efficiently and precisely. This meant that a lot of thought had to be put into how the examiners would interact with the digital checklists. I learned that the visual appearance of a checklist can impact how the examiners rate the students. For instance, if some answer items were displayed larger than others, examiners would be inclined to choose them more often. To decide between two variants we set up a scientific experiment which resulted in a publication at the prestigious IFIP TC13 conference on human computer interaction in Lisbon (Interact 2011).

The first design displays all answer items as buttons within the checklist itself. The second uses progressive disclosure: the examiner needs to tap a button before being presented with the answer items. The results showed that subjects using the second design performed significantly better, and they also strongly preferred it.


Testing
Throughout the project we used cognitive walkthroughs and usability tests after each major iteration of a prototype. I was closely involved with these tests, but the main work was done by specialists with a dedicated usability laboratory. Based on their reports I strove to improve the designs, constantly trying to remove unnecessary features and interactions before handing the prototypes back to the usability lab. With every iteration I got more insights into what worked and what didn’t.

Building OSCE-Eval
OSCE-Eval is a native iPad app written in Objective-C. The development of the app was a team effort – I wrote approximately half of the final code base, with software engineering best practices in mind (continuous integration, refactoring and testing). Two of the main difficulties were electronic signatures and data encryption, as well as the mechanisms for backing up the data to the cloud. I also took on responsibility for designing the iconography and visual identity.
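To illustrate the kind of integrity protection involved, here is a minimal sketch of signing a checklist record so that tampering with stored results can be detected. The HMAC scheme, the key handling and the Python rendering are assumptions for illustration only; the real app was written in Objective-C against Apple's security APIs:

```python
import hashlib
import hmac
import json

def sign_checklist(checklist, key):
    """Attach a MAC to a checklist so later tampering can be detected."""
    body = json.dumps(checklist, sort_keys=True).encode()
    return {"data": checklist,
            "mac": hmac.new(key, body, hashlib.sha256).hexdigest()}

def verify_checklist(record, key):
    """Recompute the MAC over the stored data and compare in constant time."""
    body = json.dumps(record["data"], sort_keys=True).encode()
    expected = hmac.new(key, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(record["mac"], expected)

key = b"per-exam secret key"  # hypothetical; real key management is more involved
record = sign_checklist({"candidate": "C-17", "item_3": "done"}, key)
print(verify_checklist(record, key))  # True
```

Any change to the stored answers after signing causes verification to fail, which is the property an eAssessment system needs before results can be trusted over paper.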

 
 
 
 

The main lesson learned was to look at the problem holistically rather than focussing on designing the tablet app.

 

Things I have learned

  • designing and developing for touchscreens and different input modalities
  • the importance of testing early and often
  • the value of usability testing
  • how to collaborate in multi-disciplinary teams
  • the importance of not letting the high stakes get to you

 

Things I would do differently

  • push for more access to the initial research; it was difficult to discern which findings were assumptions and which were observed facts
  • find more ways to shorten the cycles between prototype and test
  • look at the problem holistically rather than focussing on designing the tablet app
  • learn service design methodology beforehand; it would really have come in handy
 

 
 
Employer Institut für Software, HSR Hochschule für Technik Rapperswil
Collaborators Institut für Medizinische Lehre, Universität Bern
Year 2010 – 2013 (3 years)
Skills Interaction Design, iPhone and iPad Software Development, Mac Software Development, Windows 7 Tablet Development, Service Design, Project Management, HCI Research, Usability Tests, Visual Design, Web Design, Documentation, Customer Relations, Academic Writing
Team Members Sebastian Hunkeler, Stephan Schallenberger, Felix Schmitz, Philippe Zimmermann, Markus Stolze