We hear so many different estimates of how long it will be before self-driving cars are on the road, but where are they? I'm using the term self-driving to mean a driverless car capable of navigating, avoiding obstacles, and parking without any human involved. And here it's important to make the distinction between autonomous cars, which have a driver at the wheel (so they're not driverless), and truly self-driving cars that don't need a human operator or even a steering wheel.
Making a Car Autonomous
A car becomes autonomous through AI software trained on virtual cars. In simulation, the car drives billions of miles with every conceivable obstacle and situation thrown at it to see how it responds, and deep learning algorithms teach it which actions lead to crashes. In this way the car gradually learns how it should drive on real roads; only after that AI training can the car go out on a real road to drive.
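To make that idea concrete, here is a deliberately tiny toy sketch (not any automaker's actual system, and the states, actions, and rewards are all invented for illustration): a simulated car repeats a scenario thousands of times, gets strong negative feedback when an action leads to a crash, and gradually learns to brake when an obstacle is close.

```python
import random

random.seed(0)

# Toy world: the car sees an obstacle 0-4 car lengths ahead and must
# choose an action. Crashing produces strong negative feedback, which
# is how the car "teaches itself what actions lead to crashes."
ACTIONS = ["brake", "cruise"]
q = {(d, a): 0.0 for d in range(5) for a in ACTIONS}  # learned action values

def outcome(distance, action):
    """Reward for taking `action` with an obstacle `distance` lengths away."""
    if action == "cruise" and distance <= 1:
        return -10.0   # crash: large penalty
    if action == "brake":
        return -0.1    # safe stop, small cost for slowing traffic
    return 0.1         # cruising with room to spare is fine

alpha = 0.5  # learning rate
for episode in range(2000):  # "billions of miles," scaled way down
    distance = random.randint(0, 4)
    # Epsilon-greedy: mostly exploit what has been learned, sometimes explore.
    if random.random() < 0.1:
        action = random.choice(ACTIONS)
    else:
        action = max(ACTIONS, key=lambda a: q[(distance, a)])
    reward = outcome(distance, action)
    q[(distance, action)] += alpha * (reward - q[(distance, action)])

# The learned policy: brake when the obstacle is near, cruise otherwise.
policy = {d: max(ACTIONS, key=lambda a: q[(d, a)]) for d in range(5)}
print(policy)
```

Real systems use deep neural networks over camera and sensor data rather than a five-entry lookup table, but the training loop follows the same shape: simulate, observe the consequence, adjust.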
The Safety Challenge
An obvious reason for the ongoing delay of autonomous cars on the road is safety. In 2018, an autonomous car being tested by Uber hit and killed a woman walking a bicycle across the street in Arizona, even though the car had a driver at the wheel. Elsewhere in the US, three Tesla drivers have died in crashes when both the drivers and the Autopilot system failed to detect and react to road hazards. While 80% of the technology needed to put self-driving cars into routine use exists, the remaining 20% is much more difficult: the AI software still needs to improve to the point where cars can reliably anticipate what other drivers, pedestrians, and even cyclists will do, and navigate unexpected situations.
The Standardization Challenge
The second challenge is more regulatory. Standard definitions are needed for what constitutes reasonable actions taken by the car, such as how fast to drive or when to change lanes. All autonomous cars have been programmed with algorithms for speed and lane changes, but these algorithms need to be standardized across the industry so that automakers can program their cars to act only within those bounds. This would also give a legal framework for assigning blame in accidents based on whether the car's decision-making system followed the accepted standards.
Governments are moving to create standards and then approve not just autonomous but self-driving cars for use on a national level. There are many concerns about accidents and AI malfunctions. But even more troubling is the concern about malicious AI attacks by hackers, who could, for example, infiltrate the artificial intelligence system of a fleet of self-driving cars and cause them to ignore safety laws. Researchers at a watchdog group called OpenAI, whose members include Elon Musk, Max Tegmark, and others concerned about responsible AI, have called for companies to work with each other and with lawmakers to safeguard against potential vulnerabilities to hacking. But will rivals such as Uber, Waymo, and Tesla be willing to share data for the safety of all in such an intensely competitive market?
Autonomous Vehicles in Use Today
Surprisingly, autonomous vehicles are actually in use today. A company called May Mobility operates autonomous, six-passenger golf carts in three cities, driving short, defined routes at 25 mph, and the Brooklyn Navy Yard will have 25 mph driverless shuttles in use this year (FYI, my daughter's workshop is in the Brooklyn Navy Yard, so I'll have to go check it out). At low speeds on defined routes, autonomous vehicles are safer, so the technology can be used today.
Getting back to the question: when will autonomous cars be on the road? Two automakers, Ford and Volkswagen, have teamed up with an AI company and predict they will have ride-sharing services in a few urban areas as early as 2021. Elon Musk, ever the optimist, has said, "I'd be shocked if it's not next year at the latest."
#14 Self-driving cars