The challenge was to design a multimodal interface that supports drivers of highly automated truck convoys in 2030.

One of the main challenges was how to combine both the truckers' needs and values of 2013 with the future scenarios of 2030. We were trying to come up with a highly usable design concept and simultaneously faced the issue of grounding the scenarios in reality. 

Given these challenges we wanted to humanize the automation technology within the truck, making it a trustworthy companion on the truck operator's journey. We also wanted to give the technology a clear point of reference within the truck, and finally we aimed to make sure truck drivers remain aware of the automation's actions and decisions at all times.


The solution is the heart of automation: a skin-conductive steering wheel, unobtrusive voice control and a context-aware heads-up display.

The steering wheel has long been used to interact with other drivers through a honk of the horn. The horn's original purpose was to warn of impending danger, but its function has inevitably broadened beyond that single message. How might the truck operator communicate with others on the highway in 2030? Introducing the heart of Scania.

The primary output will appear on the truck's windshield through an advanced heads-up display. These displays have significant safety benefits: they place the relevant information directly in the driver's view and minimise eye movement. When trucks are platooning, most of the driver's field of vision is severely limited by the back of the truck driving in front, so we can make use of this dead space for heads-up display information (see screens in storyboard). Additionally, we will use both ambient displays and sound feedback. We envision different elements in the driver's cabin that can subtly change colour to indicate changes in the automation status, which lets us reduce the clutter on the heads-up display. Audio feedback can be used to help operators understand when they need to direct their attention to the system interface.
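The mode-dependent feedback described above could be sketched as a simple lookup from automation mode to ambient cabin colour and visible HUD elements. The mode names, colours and element lists below are illustrative assumptions, not the actual values of the concept:

```python
# Hypothetical sketch: automation mode -> ambient cabin colour + HUD content.
# All names and values are illustrative assumptions for this concept.
AUTOMATION_FEEDBACK = {
    "manual":    {"cabin_colour": "neutral white",
                  "hud_elements": ["speed", "navigation"]},
    "platoon":   {"cabin_colour": "calm blue",
                  "hud_elements": ["speed", "navigation", "platoon status", "messages"]},
    "full_auto": {"cabin_colour": "soft green",
                  "hud_elements": ["entertainment", "platoon status"]},
}

def feedback_for(mode: str) -> dict:
    """Return the ambient colour and HUD elements for an automation mode."""
    if mode not in AUTOMATION_FEEDBACK:
        raise ValueError(f"unknown automation mode: {mode}")
    return AUTOMATION_FEEDBACK[mode]
```

Keeping this mapping in one place would make it easy to guarantee that the ambient colour and the HUD content always change together when the automation status changes.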

 
Heads-up display interface when the truck driver is in full manual control of his truck. The display shows only the most essential elements.

The heads-up display interface when the driver has joined a platoon. The interface elements now take up more of the windshield, as his full attention is no longer required.

In full-automation mode (while part of a truck convoy) the driver's interface strongly resembles an entertainment system. This is possible because the lead platooner is responsible for the safety of all drivers.

 
Press Down – confirm actions or suggestions made by the automation system. Possibly problematic, as the action is associated with the car horn.

Hold – hold the steering wheel centre to start voice command input. Skin-conductive material could make this possible.

Retract Steering Wheel – unlock and retract the steering wheel to give up manual control of the truck. Pull it back to regain control.

Point to Select – a 3D trackpad enables you to select items from the choices displayed on the heads-up display.

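The steering-wheel gesture vocabulary above amounts to a small gesture-to-action mapping. A minimal sketch, with gesture and action names that are illustrative assumptions rather than the concept's actual identifiers:

```python
# Hypothetical gesture-to-action mapping for the steering wheel concept.
# Gesture and action names are illustrative assumptions.
GESTURE_ACTIONS = {
    "press_down":  "confirm_suggestion",   # confirm automation suggestions
    "hold_centre": "start_voice_input",    # skin-conductive hold starts voice control
    "retract":     "hand_over_control",    # give up manual control of the truck
    "pull_back":   "regain_control",       # take back manual control
    "point":       "select_hud_item",      # 3D trackpad selection on the HUD
}

def dispatch(gesture: str) -> str:
    """Map a recognised steering-wheel gesture to a system action."""
    # Unrecognised gestures are deliberately ignored rather than raising,
    # so accidental contact with the wheel never triggers an action.
    return GESTURE_ACTIONS.get(gesture, "ignore")
```

Treating unknown input as a no-op rather than an error reflects the safety framing of the concept: the wheel is touched constantly, so only deliberate, recognised gestures should reach the automation system.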

The process involved finding a way to design for truck drivers operating with technology that does not exist yet.

It became clear to us that for automated platooning to be adopted by the truck driver it needs to respect the driver's fundamental values. To determine these values and what characteristics truck drivers share, we created a set of behavioural variables based on the findings of our contextual inquiry. The other teams then added their input, allowing us to compare the characteristics of six different truck drivers. This contextual inquiry made us realise that there is no archetypical trucker. We also found that most of our interviewees described a sense of "freedom" as something they liked about their job. The data we collected then let us build a realistic truck driver persona. To understand the situations our persona faces throughout a workday in 2030, we created a user journey, visualising the context, touch-points and both the user's and the system's actions during each important event. This gave us seven relevant scenarios to work with.

To decide on interactions that match truck operators' expectations, we put ourselves in the situation and acted out each interaction (bodystorming), using simple props made from elements we could find around school. This gave us two main insights. Firstly, it became obvious how important the human communication between truck drivers is, a finding backed up by our user research. Secondly, there was an explicit need for drivers to see each other's actions. We therefore decided to define the lead platooner's role as aspirational and full of responsibility.

We unconsciously went into the user research with a trucker stereotype, which we quickly found not to be true at all.

We then moved forward to storyboarding our design solutions through storytelling. We picked the two most important scenarios to start with: joining a platoon and exiting a platoon. We separated and came up with four different storyboards that each conveyed different design interventions. Using these results we discussed the advantages and shortcomings of each intervention and summarized what worked for each. Collectively, we then scripted our story and produced a low-fi version of the storyboards for both main scenarios. This forced us to think about all interfaces involved and to come up with preliminary sketches for the interfaces. We then focussed on detailing the storyboards and thereby the in-truck interfaces.


The main lesson learned was finding a way to make augmented reality feel true to me: using it to visualise things that are actually invisible to our eyes.

Things I have learned

  • a way of designing for future concepts
  • importance of a positive team spirit throughout the project
  • visualising is often better than talking
  • user interviews can be very exhausting and need a clear goal
  • working with people with different strengths is a lot of fun
  • patent considerations can take a long time

Things I would do differently

  • engage in more co-design activities rather than talking for extended periods of time
  • talk about each other's expectations of the project and agree on a group contract from the very beginning
  • keep the production values simpler; After Effects can be a huge time sink
  • find better ways of recapping user research