What do you know about the driver, what do you demand of him or her, and how do you apply this in the design of self-driving vehicles? To answer these and other questions about the human factor in self-driving vehicles, TNO conducts research on the road and in driving simulators.
“Human behaviour and acceptance are not a given, but are rather dynamic,” says Marieke Martens, researcher at TNO and professor of Intelligent Transport Systems and Human Factors at the University of Twente. “In order for the self-driving car and its driver to better understand each other, we need to make clear to the driver what the vehicle does or does not do. At the same time, we need to provide the vehicle with driver information so that the vehicle knows better what it should and should not expect from the person inside.”
Confidence in the self-driving car
Today, most people still drive cars without self-driving functions, in which the driver is responsible for all the driving tasks. At the other end of the spectrum is a car that is 100% self-driving, in which people no longer have to do anything. “Trust calibration, having the right level of confidence tuned to what the vehicle can and cannot do, is about everything in between,” Martens explains. “If a vehicle system already functions well, but the driver sits in the car anxiously and constantly intervenes, then we speak of ‘undertrust’, and the system cannot be used optimally for its intended purpose.”
“On public roads, we have measured and recorded whether the driver was still paying attention to the road and whether or not the systems were being used”
As if you don't have to do anything anymore
At the same time, drivers of commercially available Level 2 vehicles sometimes over-rely on the self-driving functions, known as ‘overtrust’. “These cars support the driver so well that at certain times it seems as if you don't have to do anything anymore as a driver,” says Martens. “But we simply don't know exactly what they do in every situation and how safe they are in the end, because everything depends on how they are used. Just think of the reports of a number of accidents involving Tesla, and even of accidents involving test vehicles on public roads that already have more advanced technology. That is why we at TNO do research in real traffic as well as in driving simulators. How safe something is ultimately depends on what the vehicle is able to do and see, what behaviour the driver shows in the vehicle and the conditions under which the vehicle drives.”
Equipped with additional sensors
On behalf of the Ministry of Infrastructure and Water Management, Dutch road authorities (Rijkswaterstaat) and the Netherlands Vehicle Authority (RDW), TNO performed a study with nine commercially available vehicles equipped with self-driving functions. “We equipped them with various cameras and additional sensors and had different participants use them for several months during their everyday driving. On public roads, we continuously measured and recorded what happened, how surrounding traffic responded, whether the driver was still paying attention to the road, whether or not the systems were being used and on what sort of roads, and so on. Since none of the participants had any prior experience with these systems, we can even determine the learning curve of the participants and how long it takes to get used to these functions.”
“Our knowledge can be used directly as input to set future requirements for self-driving vehicles”
From results to guidelines
It was the naturalistic context in particular that made the project so complicated. Martens: “In a driving simulator you expose the driver on several occasions to different conditions and then you analyse and compare the responses. In this project we logged data from different drivers, different systems and different road conditions. Try making head or tail of that. We are now in the process of interpreting all the bits and pieces and answering the first series of questions. We will present our first results at the end of 2018. This knowledge can be used directly as input to set future requirements for such vehicles. It can also be used to better instruct first-time users of these systems and to improve the interfaces and technology.”
Pulling a trick
Wherever possible and safe, TNO conducts research on public roads. “But situations such as a driver reading a book on a high-speed public road are not (yet) allowed on the road,” says Martens. “Then we pull a trick by pretending that someone is driving in a self-driving vehicle while someone else is actually controlling this vehicle. Or we use the driving simulator. Last year, for example, we built a driver readiness estimation model for truck platooning. With truck platooning, the truck automatically drives very close to the vehicle in front, so the driver is temporarily no longer driving. In these circumstances, a truck driver could temporarily do other things, since he or she no longer has to pay attention.”
“With our model, we estimate how long it takes for a specific driver to be ready to take control of the steering wheel again”
Giving the driver back control
“But it's possible that you'll end up in a situation where you're not sure whether the vehicle is able to receive all the information it needs to drive automatically, and you might want to return control to the driver,” Martens continues. “You want to be able to estimate how long it will take for this specific driver to take control of the steering wheel again. With our model, we estimate this by identifying where the driver has his or her hands and feet, what he or she is doing, whether he or she regularly looks outside, whether he or she is wearing reading glasses, whether the seat has been adjusted, and so on.”
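The article does not describe how TNO's model actually combines these observations. As a purely illustrative sketch, the cues mentioned above could feed a simple rule-based estimate of takeover time; the field names, baseline, and per-cue penalties below are invented for illustration and are not TNO's model:

```python
from dataclasses import dataclass

@dataclass
class DriverState:
    """Observable cues named in the article; field names are illustrative."""
    hands_on_wheel: bool
    feet_near_pedals: bool
    looks_outside_regularly: bool
    wearing_reading_glasses: bool   # suggests a non-driving task such as reading
    seat_in_driving_position: bool

def estimated_takeover_seconds(state: DriverState) -> float:
    """Toy estimate of how long a takeover request would take.

    Starts from a short baseline and adds a penalty (in seconds) for each
    cue indicating the driver is disengaged. The numbers are invented for
    illustration, not measured values.
    """
    seconds = 2.0  # baseline: attentive driver, hands and feet in place
    if not state.hands_on_wheel:
        seconds += 3.0
    if not state.feet_near_pedals:
        seconds += 2.0
    if not state.looks_outside_regularly:
        seconds += 4.0
    if state.wearing_reading_glasses:
        seconds += 3.0
    if not state.seat_in_driving_position:
        seconds += 6.0
    return seconds

# An engaged driver vs. one reading with the seat adjusted away from driving:
ready = DriverState(True, True, True, False, True)
reading = DriverState(False, False, False, True, False)
print(estimated_takeover_seconds(ready))    # 2.0
print(estimated_takeover_seconds(reading))  # 20.0
```

A real readiness model would be calibrated against measured takeover times rather than hand-set penalties, but the structure, mapping observable driver state to an expected time-to-takeover, is the same idea the article describes.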
Let the vehicle take full control
There is a lot of talk about self-driving cars, although they do not actually exist yet and currently the driver still needs to pay close attention. “In this complex variety of different forms of automation, we are constantly increasing the level of comfort,” concludes Martens. “With higher levels of automation, you could leave full control to the vehicle in more favourable conditions. For example, on a motorway, if it is not snowing and the vehicle does not have to make complex route choices. Or think of a public transport-like concept, in which you board a self-driving minibus that has been specifically trained in a restricted environment and in which the speed is somewhat lower. Depending on the exact level of automation and driving conditions, the human role is constantly changing. This again shows that these three factors together (the vehicle, the human being and the environment) determine how well the system performs.”
“How safe something is ultimately depends on what the vehicle is able to do and see, what the driver does in the vehicle and the conditions under which the vehicle drives”
The self-driving car definitely has potential, that much is true. But there is still a way to go. Getting these types of vehicles on the road requires intensive collaboration between TNO, industry and government. We are keen to engage in this collaboration. If you have any ideas or would like to know more, please get in touch with us.
More information can be found on the page 'TNO and the self-driving car'. TNO outlines the potential of the self-driving car. Different facets are discussed: technology development and validation, effects on traffic flow, ICT and the human factor. You will also find relevant links to pages on TNO.nl.
Do you have any ideas for a collaboration?
Or would you like to know more about one of the four prerequisites discussed for the transition to self-driving vehicles? Then we would like to get in touch with you.