Self-driving vehicles: A greater challenge than we thought

You’re driving down a busy road and notice that an out-of-place traffic cone near a construction site ahead will send you into the opposing lane for a moment. Oh, and that pedestrian looks as if he might jaywalk in front of you, and that guy in the SUV sure is riding your rear bumper…

Odds are you’ll do just fine. You’ll make the right adjustments in braking and accelerating, picking just the right moment to go in and out of your lane, all while carrying on a conversation with your passenger.

Generally, human drivers do relatively well with this kind of complexity. We’re able to respond to the out-of-the-ordinary intuitively.

On the other hand, we get bored, tired, and easily distracted. Even with all the improvements in automotive safety over the years, and with vehicle miles traveled depressed by the COVID-19 pandemic, an estimated 38,000 Americans died on the roads in 2020.

Autonomous vehicles have long been promised as the solution, with their sensors that see far ahead in all kinds of weather and their ever-vigilant computers that follow the rules of the road with hyper focus.  

But the reality is more complex. As often happens with new technologies, some challenges are harder than they seemed at first. In fact, the industry is going through a bit of a reset, with companies like Uber and Lyft selling their autonomous vehicle units.

Why is autonomy hard? The roadways are cluttered with obstacles of many sizes and shapes; objects move in and out of the scene quickly and unpredictably; and human behavior is difficult for a machine to predict. Whenever people handle an odd scenario by simply “acting naturally,” that is exactly where an autonomous vehicle is likely to struggle.

For a long time, the answer appeared to be to let an autonomous vehicle do the bulk of the driving, but only with a human behind the wheel, ready to take over in case of failure. The best of both worlds, right?

Unfortunately, no.

Research has shown that humans make poor safety drivers. By the time they regain situational awareness in an emergency, it’s often too late to react. In fact, relying upon a “human fallback” might actually be more dangerous than full automation.

Further, the approach of using virtual simulations to prove system safety is facing a harsh reality: The endlessly complex environment, coupled with constantly evolving hardware and software, means that relying upon systems to drive millions of simulated miles, while necessary, is insufficient.
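
To get a feel for the scale of the problem, consider a rough, illustrative back-of-the-envelope calculation. It assumes an approximate human fatality rate of about one fatal crash per 100 million miles and a simple zero-failure statistical bound; the numbers are illustrative, not drawn from any specific study:

```python
# Rough, illustrative sketch: how many failure-free miles would be needed to
# show, with statistical confidence, that a system's failure rate is below a
# target. Uses a simple zero-failure bound; the rate below is approximate.
import math

def miles_to_demonstrate(rate_per_mile: float, confidence: float = 0.95) -> float:
    """Failure-free miles needed to show the true failure rate is below
    rate_per_mile at the given confidence level."""
    return math.log(1 - confidence) / math.log(1 - rate_per_mile)

# Human drivers experience roughly one fatal crash per 100 million miles.
human_fatal_rate = 1 / 100_000_000

print(f"{miles_to_demonstrate(human_fatal_rate):,.0f} miles")  # ~300 million miles
```

In other words, statistically demonstrating human-level safety would take on the order of hundreds of millions of failure-free miles, and even that says little about the rare, strange situations that actually cause failures.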

So, the promise of widely deployed self-driving vehicles remains elusive.

We still think it’s likely that highly automated vehicles, in some form and performing certain tasks, will become a reality. Where we will first see these systems on the roads is uncertain: Maybe electric vehicles, cheaper to produce than complex gas vehicles, will be the first to include expensive autonomous systems; or maybe autonomous freight trucks on highways will have the right economic incentives to adopt autonomy.

Regardless of which operational model succeeds, we believe there are four major steps we can take in the short term:

  • Let autonomous vehicles communicate with one another through connected vehicle technology — The world is a complex place for autonomous vehicles. Weather makes it hard for sensors to “see”; emergency responders alter the roadways to make accident scenes safe; and people around the vehicle are hidden by buildings and foliage. Any technology that simplifies this world is crucial to the fast and safe adoption of these capabilities. Connected vehicle technology lets these systems “talk” with each other and with the world around them, sharing information about traffic, objects, and intended vehicle paths (a simplified sketch of such a message appears after this list). To make this vision a reality, industry and government must come together to implement and standardize these communications and to invest in the necessary infrastructure.
  • Share hard problems, challenging scenarios, and safety data — As safety issues, technical limitations, or hard scenarios are discovered, industry should share non-proprietary information so that system-wide issues can be spotted. Other transportation sectors, such as aviation and traditional ground transportation, already have programs to share data and foster a safety culture across the industry. Highly automated vehicle developers must be encouraged, or required, to do the same.
  • Establish a corporate culture of safety — If autonomous vehicles are to succeed, the industry must have a clear focus on safety — from the boardroom to the test engineer. A “positive safety culture” throughout an organization is critical to preventing accidents and avoiding hidden risks. In other words, what’s needed is a culture that encourages employees to report safety issues rather than punishing them for doing so.
  • Certify these non-human drivers — Human beings are required to pass a driving test before getting behind the wheel. But currently, highly automated vehicle systems are not. Right now, the regulatory landscape and certification process are uncertain and piecemeal. Yet the answer seems clear: There should be an “autonomous vehicle driving test.” To do this, government agencies should divvy up responsibilities among federal, state, and local partners, and create a clear path to certification.
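
To make the connected vehicle idea above concrete, here is a hypothetical, simplified sketch of the kind of status message a vehicle might broadcast to its neighbors. The field names are illustrative only, loosely inspired by the basic safety messages used in connected vehicle systems, and are not drawn from any specific standard:

```python
# Hypothetical, simplified vehicle-to-vehicle status message.
# Field names are illustrative; they are not taken from any specific standard.
from dataclasses import dataclass, field

@dataclass
class VehicleStatusMessage:
    vehicle_id: str        # temporary, rotating identifier (privacy-preserving)
    timestamp_ms: int      # when the message was generated, in milliseconds
    latitude: float        # GNSS position
    longitude: float
    speed_mps: float       # current speed, in meters per second
    heading_deg: float     # direction of travel, in degrees clockwise from north
    brake_applied: bool    # whether the brakes are currently engaged
    intended_path: list[tuple[float, float]] = field(default_factory=list)  # upcoming waypoints

# Vehicles would broadcast messages like this several times per second so that
# nearby vehicles and roadside infrastructure can anticipate each other's movements.
example = VehicleStatusMessage(
    vehicle_id="tmp-4821",
    timestamp_ms=1_700_000_000_000,
    latitude=38.8895,
    longitude=-77.0353,
    speed_mps=12.5,
    heading_deg=270.0,
    brake_applied=False,
    intended_path=[(38.8895, -77.0360), (38.8895, -77.0367)],
)
```

Standardizing the contents and timing of messages like this, so that every manufacturer’s vehicles can understand every other’s, is exactly the kind of work industry and government need to do together.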

After decades of development and delayed rollouts, it’s clear that building safe and functional autonomous vehicles is not as easy as it may appear. But we truly believe that self-driving vehicles are in our future. This future will come down the road more quickly and safely if government and industry work together to share knowledge and attack the hard problems facing autonomous vehicles.

Zach LaCelle is a robotics and autonomy expert and leader of MITRE’s Mobile Autonomous Systems Experimentation (MASE) lab. His work helps advance the prototyping, testing, experimentation, deployment, and adoption of mobile autonomy in transportation systems. He is also the author of the recently released Safety Building Blocks of Highly Automated Vehicles.

Dr. Christopher Hill is the chief engineer of MITRE’s Transportation Safety Division. He specializes in public-private partnerships and represents MITRE in the Partnership for Analytics Research in Traffic Safety (PARTS), a group dedicated to the advancement of traffic safety.
