New cars that can steer and brake themselves risk lulling people in the driver’s seat into a false sense of security — and even to sleep. One way to keep people alert may be to provide distractions that are now illegal.
That was one surprising finding when researchers put Stanford University students in a simulated self-driving car to study how they reacted when their robo-chauffeur needed help.
The experiment was one of a growing number assessing how cars can safely hand control back to a person when the self-driving software and sensors are overwhelmed or overmatched. With some models already able to stay in their lane or keep a safe distance from other traffic, and automakers pushing for more automation, the car-to-driver handoff is a big open question.
The elimination of distracted driving is a major selling point for the technology. But in the Stanford experiment, reading or watching a movie helped keep participants awake.
Among the 48 students put in the driver’s seat, 13 who were instructed to monitor the car and road began to nod off. Only three did so when told to watch a video or read from a tablet.
Alertness mattered when students needed to grab the wheel because a simulated car or pedestrian got in the way.
There’s no consensus on the right car-to-driver handoff approach: the Stanford research suggests engaging people with media could help, while some automakers are marketing vehicles with limited self-driving features that slow the car when sensors detect the driver has stopped paying attention to the road.
Self-driving car experts at Google, which is pursuing the technology more aggressively than any automaker, concluded that involving humans would make its cars less safe. Google’s solution is a prototype with no steering wheel or pedals — human control would be limited to go and stop buttons.
Meanwhile, traditional automakers are phasing in the technology. Mercedes and Toyota sell cars that can hit the brakes and stay in their lane. By adding new features each year, they might produce a truly self-driving car in about a decade.
One potential hazard of this gradualist approach became clear this fall, when Tesla Motors had to explain that its Autopilot feature did not mean drivers could stop paying attention. Several videos posted online showed people recording the novelty — then seizing the wheel when the car made a startling move.
Starting late next year, the Cadillac CTS will get a Super Cruise system, which will allow semi-autonomous highway driving. If the driver’s eyes are off the road and the driver doesn’t respond to repeated prodding, the car slows.
One riddle automakers must solve: How to get owners to trust the technology so they’ll use it — but not trust it so much that they’ll be lulled into a false sense of security and therefore slower to react when the car needs them.
Trust was on the minds of researchers who in August published an extensive report on self-driving cars, funded by the National Highway Traffic Safety Administration. “Although this trust is essential for widespread adoption, participants were also observed prioritizing non-driving activities over the operation of the vehicle,” the authors wrote.
Another wide-open question: How to alert the person in the driver’s seat to the need to take over.
Read more of the original article in the San Jose Mercury News.