In New York, where congested roads present dangers at every corner, the increased use of autonomous vehicles is being cautiously welcomed. Technological advancements are undoubtedly a positive thing. Automation is designed to remove the human error that leads to mistakes, misjudgments, accidents and injuries. This is happening in many industries, and companies are striving to be at the forefront.
Still, the technology has not been perfected, and there are frequent stories about autonomous vehicles failing in their stated goal of taking the responsibility for driving out of a human operator’s hands. Auto accidents can cause extensive damage regardless of whether the vehicle was being operated by a person or autonomously. An increasingly common question is who is responsible for the crash. People who were hurt or lost a loved one in this type of collision need to know the facts before deciding on their next steps.
Autonomous vehicles and responsibility
Auto manufacturers, lawmakers and legal professionals alike are analyzing responsibility in the relatively new world of autonomous vehicles. Understanding the classifications of self-driving vehicles can help with making this determination. A recent story reported that, as part of its entry into the U.S. market for conditionally automated operation, Mercedes-Benz is releasing a Level 3 vehicle. This puts the onus on the carmaker for accidents and incidents that occur while the vehicle is in autonomous mode.
The levels are as follows: at Levels 0-1, the driver is operating the vehicle; Level 2 is partial automation, with the driver monitoring the vehicle’s operation; Level 3 is conditional automation, with the driver not constantly monitoring it; Level 4 means the vehicle drives itself but does not make all decisions on its own; and Level 5 removes any responsibility for operating the vehicle from the would-be driver.
This does not eliminate a driver’s responsibility entirely. The Mercedes-Benz technology cannot be used at all times: it is permitted only on roads with a speed limit under 40 mph and only during the day. Liability rules vary by country, and the U.S. is still weighing how to address liability for autonomous vehicles.
As the debate continues, it is important to know that the circumstances will largely determine whether the driver or the vehicle was responsible for the accident. If the driver could have taken action that would have prevented the crash, then they might be responsible. That includes taking control away from the automated system, or declining to engage it at all in situations where it should not be used, such as a traffic jam in an urban area.
According to the National Highway Traffic Safety Administration, the most recent statistics on fatal accidents underscore the need to find ways to reduce them. More than 42,500 people were killed in auto accidents across the nation, and a significant portion of those deaths were due to human error. Autonomous technology would largely remove that variable.
These advancements include forward collision warning, which alerts the driver to an impending crash and allows some vehicles to stop on their own. Lane departure warning alerts the driver when the vehicle is drifting into another lane. A backup warning, paired with a backup camera, tells the driver when a crash could happen while reversing. Blind spot collisions are common, and a blind spot warning tells the driver when another vehicle is in that blind spot. Other systems can stop the vehicle when the driver fails to do so in time to avoid a crash.
Since liability is one of the key aspects of a legal claim seeking compensation for all that was lost in an auto accident, it must be analyzed based on the circumstances. New York City roads are dangerous as it is, and with this technology still relatively new, these questions remain unanswered. After a crash, it is important to have trustworthy guidance and an advocate to assist in making a full recovery in every way. That is especially true in situations as complicated as a crash involving an autonomous vehicle.