Kansas City Accident Injury Attorneys

How safe will autonomous vehicles be once they hit the market?

Recent crashes involving semi-autonomous cars have raised concerns about just how safe these vehicles are now and in the future, when some models are fully autonomous. While the new technology is exciting, it's also raised questions about how the government will approach regulating the industry. Some of these questions were answered in the fall of 2016 when the Department of Transportation (DOT) released a detailed set of guidelines for manufacturers of autonomous vehicles to apply as they develop the technology.

What Is an Autonomous Car?

Currently, several automotive manufacturers offer cars with semi-autonomous technology. These are vehicles that automatically assist the driver with things such as braking, accelerating, steering, and maintaining a safe following distance—while the driver remains fully in control of the car.

A fully autonomous car, however, is capable of navigating without human input of any kind. While these cars exist and are being tested on public roads, they're not yet commercially available. Most experts agree these vehicles will be on our roads in the future, and the DOT is concerned with doing what it can to make sure they are safe.

DOT Guidelines Address Several Areas

When the average driver thinks about what could go wrong with a self-driving car, he or she may imagine a crash caused by a technological glitch. However, this technology raises many other concerns.

The DOT’s guidelines include 15 areas of concern that manufacturers of self-driving cars will be asked to report on. These areas include everything from cybersecurity to crashworthiness. As part of a safety self-assessment, the DOT will ask manufacturers to report on the following areas:

  1. Data recording and sharing. How will crash data be collected and reported?
  2. Privacy. How will consumers’ privacy be protected?
  3. System safety. What safety measures are in place if the system malfunctions?
  4. Vehicle cybersecurity. What safety measures are in place to protect cyber systems from attack?
  5. Human-machine interface. When and how will human input be needed while driving?
  6. Crashworthiness. Does the vehicle meet existing NHTSA crashworthiness standards?
  7. Consumer education and training. What guidelines ensure a driver knows how to operate the vehicle?
  8. Registration and certification. How will software updates and recalls be communicated to owners?
  9. Post-crash behavior. What will be done to ensure safety of a vehicle for use after a crash?
  10. Federal, state, and local laws. How will vehicles adapt to conform to various local laws throughout the country?
  11. Ethical considerations. How will the vehicle be programmed to handle ethical dilemmas, such as breaking a traffic law in the name of safety?
  12. Operational design domain. In what environments and at what speeds is the vehicle safe to operate, e.g. types of roadway, weather conditions, and so on?
  13. Object and event detection and response. How is the vehicle programmed to handle normal and emergency obstacles?
  14. Fallback. How will the vehicle warn the driver when systems have failed and human control is needed?
  15. Validation methods. Are testing methods valid and reliable?

With these guidelines, the DOT is attempting to steer manufacturers of self-driving cars towards taking all potential safety issues into consideration before releasing a car.

Are Self-Driving Cars the Answer?

While many people are frightened by the idea of self-driving cars, manufacturers point out that human error is the leading cause of highway deaths. Whether a driver becomes distracted by a cellphone, chooses to drive while intoxicated, or simply cannot react to an emergency quickly enough, autonomous cars are designed to eliminate those mistakes.

However, when something goes wrong with a car that's doing all the driving, who's to blame? Will the driver—or his insurance company—bear any responsibility for a fatal crash caused by a robot car? If the driver overrides the system and causes a crash, will the manufacturer bear any responsibility? These are questions Kansas City Accident Injury Attorneys are sure to be asking in the future.

 

James Roswold
James Roswold is a Kansas and Missouri personal injury, workers' compensation, and medical malpractice attorney.
