How do human driving skills — based on flesh-and-blood sensory perception, cognition and eye-hand coordination — measure up against autonomous vehicle technologies?
According to a new study, human drivers — when attentive, sober and well rested — still hold a general advantage in reasoning, perception and sensing. But machines are fully capable of handling the driving task and have the edge in reaction time, power output and control, consistency, and multichannel information processing.
The study, released by Sustainable Worldwide Transportation at the University of Michigan, offers a scientific take on the man-versus-machine driving debate. Human eyes, for example, are compared to autonomous vehicle radar, LIDAR and camera systems.
One advantage people retain is packaging: human sensing comes in a far more compact package than the suite of sensors an autonomous vehicle needs to match it.
“Matching (or exceeding) human sensing capabilities requires AVs [autonomous vehicles] to employ a variety of sensors, which in turn requires complete sensor fusion across the system, combining all sensor inputs to form a unified view of the surrounding roadway and environment,” the report noted. “While no single sensor completely equals human sensing capabilities, some offer capabilities not possible for a human driver (e.g., accurate distance measurement with lidar, seeing through inclement weather with radar).”
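To make the fusion idea concrete, here is a minimal, illustrative sketch of merging radar, lidar and camera detections into one unified object list. It is not drawn from the report or any production AV stack; the field names, the greedy nearest-neighbor matching and the two-meter gate are assumptions chosen for brevity.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    sensor: str                  # "radar", "lidar" or "camera"
    x: float                     # longitudinal distance, meters
    y: float                     # lateral offset, meters
    label: Optional[str] = None  # only the camera classifies ("car", "pedestrian", ...)

@dataclass
class FusedObject:
    x: float
    y: float
    label: Optional[str]
    sources: list

def fuse(detections, gate=2.0):
    """Greedy nearest-neighbor association: a detection within `gate` meters of an
    existing fused object is merged into it, otherwise it starts a new object."""
    fused = []
    for d in detections:
        match = next((o for o in fused
                      if abs(o.x - d.x) <= gate and abs(o.y - d.y) <= gate), None)
        if match is None:
            fused.append(FusedObject(d.x, d.y, d.label, [d.sensor]))
            continue
        match.sources.append(d.sensor)
        if d.label and not match.label:
            match.label = d.label            # classification comes from the camera
        if d.sensor in ("lidar", "radar"):
            match.x, match.y = d.x, d.y      # trust lidar/radar for distance

    return fused

if __name__ == "__main__":
    frame = [Detection("radar", 42.3, 0.1),
             Detection("lidar", 42.1, 0.0),
             Detection("camera", 41.0, 0.3, label="car")]
    for obj in fuse(frame):
        print(obj)  # one object: x=42.1, label='car', sources=['radar', 'lidar', 'camera']
```

Real systems track objects over time and weight each sensor by its error model; the point here is only that the camera contributes the class label while lidar and radar contribute the range, echoing the report's observation that no single sensor covers everything.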
A fully connected autonomous vehicle — one that employs dedicated short-range communications, for example, to exchange data with other vehicles and infrastructure — offers the best potential to effectively and safely replace the human driver when operating vehicles at automation levels 4 (high automation) and 5 (full automation), according to the study.
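As a rough illustration of what that vehicle-to-vehicle exchange buys, the toy sketch below broadcasts a simplified status message and flags nearby senders even when they are hidden from onboard sensors. The message fields, the 300-meter radius and the distance approximation are assumptions; the real DSRC payload (the SAE J2735 Basic Safety Message) is binary-encoded and far richer.

```python
import math
from dataclasses import dataclass

@dataclass
class SafetyMessage:
    vehicle_id: str
    latitude: float     # degrees
    longitude: float    # degrees
    speed_mps: float    # meters per second
    heading_deg: float  # 0 = north, clockwise

EARTH_RADIUS_M = 6_371_000

def rough_distance_m(a, b):
    """Equirectangular approximation, adequate at DSRC's few-hundred-meter range."""
    mean_lat = math.radians((a.latitude + b.latitude) / 2)
    dx = math.radians(b.longitude - a.longitude) * math.cos(mean_lat) * EARTH_RADIUS_M
    dy = math.radians(b.latitude - a.latitude) * EARTH_RADIUS_M
    return math.hypot(dx, dy)

def nearby_senders(me, received, radius_m=300.0):
    """Messages from other vehicles within radius_m, including ones that buildings
    or weather hide from the onboard camera, radar and lidar."""
    return [m for m in received
            if m.vehicle_id != me.vehicle_id and rough_distance_m(me, m) <= radius_m]

if __name__ == "__main__":
    me = SafetyMessage("AV-1", 42.2780, -83.7382, 13.4, 90.0)
    heard = [SafetyMessage("CAR-7", 42.2791, -83.7380, 22.0, 180.0)]
    print(nearby_senders(me, heard))  # CAR-7 is roughly 120 m away, so it is reported
```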
Brandon Schoettle, a project manager at the University of Michigan Transportation Research Institute’s Human Factors Group, authored the report.
Below is a table summarizing some of his findings:
| Performance aspect | Human | AV Radar | AV LIDAR | AV Camera | Connected vehicle with DSRC | Connected autonomous vehicle |
| --- | --- | --- | --- | --- | --- | --- |
| Object detection | Good | Good | Good | Fair | n/a | Good |
| Object classification | Good | Poor | Fair | Good | n/a | Good |
| Distance estimation | Fair | Good | Good | Fair | Good | Good |
| Edge detection | Good | Poor | Good | Good | n/a | Good |
| Lane tracking | Good | Poor | Poor | Good | n/a | Good |
| Visibility range | Good | Good | Fair | Fair | Good | Good |
| Poor weather performance | Fair | Good | Fair | Poor | Good | Good |
| Dark or low illumination performance | Poor | Good | Good | Fair | n/a | Good |
| Ability to communicate with other traffic and infrastructure | Poor | n/a | n/a | n/a | Good | Good |
Source: “Sensor Fusion: A Comparison of Sensing Capabilities of Human Drivers and Highly Automated Vehicles,” released by Sustainable Worldwide Transportation at the University of Michigan. DSRC refers to dedicated short-range communications.
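Because the ratings are ordinal, the table is easy to query programmatically. The short sketch below encodes the ratings exactly as published; treating Good/Fair/Poor as a 2/1/0 scale is an assumption added here, not something the report does.

```python
# Ratings from the table above, one row per performance aspect, columns per source.
SOURCES = ["Human", "AV Radar", "AV LIDAR", "AV Camera", "DSRC", "Connected AV"]

TABLE = {
    "Object detection":                     ["Good", "Good", "Good", "Fair", None,   "Good"],
    "Object classification":                ["Good", "Poor", "Fair", "Good", None,   "Good"],
    "Distance estimation":                  ["Fair", "Good", "Good", "Fair", "Good", "Good"],
    "Edge detection":                       ["Good", "Poor", "Good", "Good", None,   "Good"],
    "Lane tracking":                        ["Good", "Poor", "Poor", "Good", None,   "Good"],
    "Visibility range":                     ["Good", "Good", "Fair", "Fair", "Good", "Good"],
    "Poor weather performance":             ["Fair", "Good", "Fair", "Poor", "Good", "Good"],
    "Dark or low illumination performance": ["Poor", "Good", "Good", "Fair", None,   "Good"],
    "Communication with traffic and infrastructure":
                                            ["Poor", None,   None,   None,   "Good", "Good"],
}

SCORE = {"Good": 2, "Fair": 1, "Poor": 0}  # ordinal scale assumed here for comparison

def weaknesses(source):
    """Aspects where a source rates below Good (n/a entries skipped)."""
    col = SOURCES.index(source)
    return [aspect for aspect, row in TABLE.items()
            if row[col] is not None and SCORE[row[col]] < 2]

print(weaknesses("Human"))      # distance estimation, poor weather, darkness, communication
print(weaknesses("AV Camera"))  # detection, distance, visibility range, weather, darkness
```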