Self-driving cars are one of the most anticipated and exciting new car technologies of recent years. However, you can’t buy a self-driving vehicle right now, and likely will not be able to in the immediate future. There are a number of reasons why, but we’ll look at three major ones: development cost and complexity, liability and regulation, and the many edge cases these vehicles can encounter in the real world.
Self-driving cost and complexity
Development cost and complexity: One major reason self-driving cars aren’t here yet is the high development cost of both the hardware needed in the car and the robust, mature software required to run it in real time. In 2016, Google revealed its prototype driverless car, driven completely “without a steering wheel or an accelerator pedal” using custom radars, actuators, and control software.
On the software side, maturity and robustness are absolutely critical. Because these systems are entrusted with keeping us and our families safe and alive, they must be extremely stable. Software crashes may be acceptable in our mobile phones or web browsers, but a crash in a car is completely unacceptable and could put lives at significant risk. Large software endeavors such as this take years to develop to the level of safety we need and expect.
The physical hardware required in these early self-driving vehicles is considered too expensive to put in a standard production vehicle. LIDAR in particular, used by Waymo, is a costly and bulky laser-based system that gives the self-driving car its eyes to see the road and hazards.
Who is liable for a self-driving car?
Liability and regulations: Federal regulation currently does not allow for fully self-driven vehicles, meaning it is not legal to operate a driverless vehicle in the United States. Some companies, such as Waymo and Uber, have exemptions while they test their systems on public roads. Many of these exemptions require a live human driver in the seat, able to take over the vehicle at any time. Even this is not without its issues, as Uber’s fatal accident in 2018 demonstrated.
The legal question also remains: if a self-driving vehicle is found at fault, who is legally responsible for its actions, the owner or the manufacturer/software developer? These are questions we will face as these vehicles take to the road and issues arise.
But there is progress being made in this area. “NHTSA’s insistence of enabling the fast deployment of self-driving vehicles by amending rules written for cars with drivers, instead of recognizing the unique characteristics of autonomous technology, may be the fastest way to authorize the deployment of autonomous vehicles but it is not a consumer-safety driven approach,” said Jason Levine, executive director of the Center for Auto Safety. In other words, special regulations designed for self-driving cars are coming, and they won’t be the same as those for regular passenger vehicles.
The many unknowns of self-driving
Edge cases: A large reason why Tesla’s self-driving efforts are still “forthcoming” is the massive, nearly endless number of edge cases a vehicle can encounter in the real world. Humans are astoundingly good at adapting and making decisions on the fly; computers do best with a large set of known or controlled variables.
Unpredictable situations come up frequently when driving, and the software and hardware must adapt safely and quickly to an ever-changing environment. Edge cases include weather and road conditions (debris or potholes in your lane), construction, and the unpredictability of other vehicles, people, and animals.
All of these issues, and more, need to be researched, tested, and solved before we have widespread self-driving vehicles. The technical hurdles are immense, but the upside is huge, and many companies and teams are working to meet the challenge.