How Ready Are Humans for Autonomous Cars? Here Is the Research
JAKARTA - Autonomous driving technology is one of the most fiercely contested areas of development among car manufacturers today. Not only Tesla, but also Ford, GM, BMW, BYD, VW, and even newcomers like Xiaomi are serious about developing it.
Autonomous cars promise to handle most driving tasks, which would allow drivers to relax or do other things while traveling.
However, the question is whether drivers can refocus and take back control quickly enough when a critical situation occurs.
As reported by Autocar on June 7, researchers at the University of Glasgow investigated whether augmented reality (AR) could help drivers in such situations.
The findings were published in a study entitled "Can you hazard a guess? Evaluating the effects of augmented reality cues on hazard prediction". Research like this is essential if driverless cars are to become commonplace in the future.
The study points out that when the car takes control and the driver becomes engrossed in other activities, the driver is left in a "halfway" role, and that is the problem.
According to the researchers, humans are not good at carrying out sustained monitoring tasks.
"We get bored easily, don't be careful with road conditions, and are too slow to react to sudden changes around. In addition, there is also the phenomenon of "see but don't see" where we don't process something that's actually in sight," the content of the journal.
The researchers argue that AR can help draw the attention of drivers who are focused on other tasks, so they can quickly refocus on driving during an emergency. To test this theory, they built a driving simulator in a laboratory, using a steering wheel and a screen displaying replicated road views.
The screen showed 40 video clips while participants, wearing an AR headset, performed one of two tasks. Sometimes they looked ahead to carry out a task on the screen, and sometimes they looked down at a tablet. The first task was a simple game of collecting moving virtual gems; the second required participants to type a phone number displayed on the screen.
In both scenarios, the video was stopped shortly before a potential hazard appeared, for example a pedestrian about to cross the road. Participants then had to predict what would happen next based on their understanding of the situation before the video stopped. The results were compared with a similar experiment in which participants made predictions without performing another task.
Unsurprisingly, participants' prediction skills declined when they were doing other tasks, whether looking ahead or looking down. However, when given visual cues through AR a few seconds before the video stopped, their awareness and ability to predict the situation improved more when looking ahead than when looking down at the tablet.
The conclusion is that it may well be possible for humans to do other activities while remaining alert to the situation on the road. So why not?