With driverless cars set to be on our roads before the end of 2019, discussion continues over who would be held responsible in an accident and whether driverless vehicles can be programmed to act ethically. We look at whether drivers, manufacturers or both will be held liable for injuries caused by automated vehicle accidents.
Are Driverless Vehicles Safe?
With 90% of road accidents caused by human error (according to the European Commission), driverless vehicles have the potential to significantly improve road safety. However, as driverless technology advances and the transition from human-driven to self-driven vehicles is made, accidents will inevitably happen; indeed, they already are.
There have been a number of high-profile road accidents in the last couple of years involving self-driving vehicles. These accidents have been caused by a combination of error on the part of the human 'supervising' the self-driving car and faults with the driverless technology itself.
In one of these accidents, an autonomous car in self-drive mode hit and tragically killed a pedestrian. In another, the driver of an automated vehicle sadly died when the car (in autopilot) collided with a crash barrier.
So while safety is obviously of paramount concern to the manufacturers of driverless cars, there is still a way to go before they become completely failsafe, if this is even possible.
Where We Are with Driverless Technology
Currently, self-driving cars are not fully automated: they still require a human behind the wheel, ready to intervene if needed. Fully automated vehicles that need no human supervision at all are not expected on our roads for a few years to come.
There are five levels of automation when it comes to driverless technology, as follows:
- Level 1 – The driver controls most of the vehicle's functions, but the vehicle can perform a specific function, such as steering or accelerating, automatically.
- Level 2 – The vehicle provides at least one assistance system that automates both steering and acceleration or deceleration, using information about the driving environment. Vehicles that combine adaptive cruise control with lane-centering fall into this category.
- Level 3 – The vehicle can take full control of all 'safety-critical' functions in certain conditions, but the driver still needs to be present and ready to intervene if necessary.
- Level 4 – The vehicle can perform all safety-critical functions and monitor conditions for the entire trip. This is sometimes called 'fully autonomous', but it does not cover all driving conditions.
- Level 5 – The vehicle is fully autonomous and can perform at the same level as a human driver in all conditions, including extreme environments such as dirt roads.
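For readers who think in code, the five levels above can be sketched as a simple lookup. This is purely illustrative: the descriptions are paraphrased from the list above, and the function name is our own, not part of any official taxonomy.

```python
# Illustrative sketch of the automation levels described above.
# Descriptions are paraphrased from the article, not an official standard.
AUTOMATION_LEVELS = {
    1: "Driver controls most functions; one specific function is automated",
    2: "An assistance system automates both steering and speed control",
    3: "Vehicle handles safety-critical functions in certain conditions; driver must stand by",
    4: "Vehicle handles all safety-critical functions for the trip, but not all conditions",
    5: "Fully autonomous in all conditions, matching a human driver",
}

def requires_human_supervision(level: int) -> bool:
    """Levels 1-3 still need a human ready to intervene."""
    return level <= 3

print(requires_human_supervision(3))  # True
print(requires_human_supervision(4))  # False
```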
Currently, driverless technology has only reached level 3, and some experts argue that we haven't truly got there yet. As such, level 5 vehicles are unlikely to be developed in the near future.
So Who Will Be Responsible for Driverless Vehicle Accidents?
In July 2018, the UK Government passed legislation on driverless vehicles. The Automated and Electric Vehicles (AEV) Act 2018 sets out a legal framework for driverless technology and automated vehicles, in preparation for the future.
The Automated and Electric Vehicles (AEV) Act imposes strict liability on the vehicle's insurer when an accident occurs in self-drive mode. This means that someone injured by a self-driving vehicle can claim compensation from the other side's insurance. If the accident is then found to be the result of a technical fault in the self-drive technology, it is up to the insurer to recover compensation from the vehicle manufacturer. Some experts have highlighted, however, that this legislation will only apply to level 4 and 5 vehicles, leaving the automation levels that have already been reached uncovered.
This means that until driverless technology reaches level 4, people who have been injured in an accident caused by an automated vehicle may struggle to claim compensation. Without liability being placed on the vehicle's insurers, some fear that injury victims may be left trying to claim compensation directly from vehicle manufacturers themselves, using product liability laws that are not fit for this purpose.
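The liability routing described above can be sketched as a simple decision rule. To be clear, this is a deliberately simplified reading, not legal advice: it assumes (as the experts quoted suggest) that the Act only bites at levels 4 and 5, and the function and return labels are our own shorthand, not terms from the AEV Act itself.

```python
# Hypothetical sketch of the liability routing described in the article.
# Names and labels are illustrative shorthand, not drawn from the AEV Act.
def first_port_of_call(automation_level: int, in_self_drive_mode: bool) -> str:
    """Return who an injured party would claim against first, under the
    simplified reading of the AEV Act 2018 given above."""
    if in_self_drive_mode and automation_level >= 4:
        # Strict liability on the vehicle's insurer; the insurer may later
        # recover from the manufacturer if a technical fault is found.
        return "vehicle's insurer"
    if in_self_drive_mode:
        # Below level 4, victims may be left pursuing the manufacturer
        # directly under product liability laws.
        return "manufacturer (product liability)"
    # Human was driving: ordinary road traffic liability rules apply.
    return "driver (ordinary negligence rules)"

print(first_port_of_call(4, True))   # vehicle's insurer
print(first_port_of_call(3, True))   # manufacturer (product liability)
```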
There are also fears that, while effective level 4 and 5 vehicles could potentially make our roads safer, the transition period before we reach this point could pose significant dangers. In the near future we are likely to see automated vehicles on our roads in which humans are required to take an active role while the vehicle self-drives to a point. One fear is that drivers could begin to over-rely on the vehicle's automation, to the detriment of their own driving skills, while others warn that drivers could react to hazards too late, once they realise the vehicle is not automatically reacting as it should.
There are likely to be complex liability disputes arising from semi-automated vehicle accidents, with challenges in determining whether an accident was caused by the driver or by the vehicle. And if the vehicle is to blame, what if the owner has retrospectively installed software that compromised the vehicle's safety, or refused to install software updates that would have made it safer?
While driverless vehicles will revolutionise the way we travel, there will inevitably be some sticking points along the way as the law adjusts to adequately protect people injured in automated vehicle accidents.