By Zack Pack
It seems like self-driving cars have always been the hot item for disruptive innovators in the auto industry. The image of sitting back, Minority Report style, in a sleek new vehicle with an AI-powered robot that effortlessly guides your car across town has bewitched many starry-eyed investors. In fact, over $100 billion has been invested in self-driving technology since it was first seriously proposed. But what have these hopeful investors received for their money besides some flashy tech showcases? Why have self-driving cars struggled so mightily to go mainstream?
Critics point to the technology as being deficient. It’s nearly impossible for computers to understand all the real-world nuances that human drivers must interpret each time they take the wheel. Poorly paved roads, unpainted lines, a fallen tree… these are things human beings can quickly recognize and adapt to. For even the most sophisticated AI brains, though, there’s no comparison. There’s just too much that can go wrong that they can’t predict. Even Anthony Levandowski, the co-founder of Google’s self-driving program, is dubious. Fully autonomous self-driving cars are “an illusion,” he told Bloomberg. For every successful demo, there might be dozens of failed ones. Until AI can capture all the context absorbed by a human brain while driving, it will remain a technology that’s less adaptable and more prone to accidents than its human-brained counterpart.
So are self-driving cars just a big scam? Well… kinda. While Tom Cruise isn’t going to be jumping from one self-driving car to the next anytime soon, the auto industry has come a long way toward driver-assisted cars. The self-driving technology currently on the market amounts to cars that will “automatically brake for you if they anticipate a collision, cars that help keep you in your lane, and cars that can mostly handle highway driving.” In each of these cases, though, there’s still a person behind the wheel supervising everything and avoiding accidents.
Technological hurdles aside, if we could develop the AI that makes self-driving cars as safe as human-driven cars, they’d still have quite a few other hurdles to overcome before going mainstream.
The biggest hurdle, perhaps, is the problem of liability.
Last week, a man in North Carolina was driving at night, following his GPS. The GPS led him to a bridge that he couldn’t see was unfinished. He drove off the bridge, crashed upside down into the river below, and died. His GPS didn’t show that a portion of the bridge had been washed away – instead, it went on mindlessly recommending it as the fastest route. After the man’s death, questions came up about who should be held responsible. Was it all the man’s fault? What about the fault of the city for not repairing the bridge? The state? The bridge manufacturer? What about the maker of the GPS technology that got it wrong? Should it pay out? It wasn’t clear where the fault lay, and for that reason, all parties involved were vulnerable to lawsuits.
Another news story from last week: an attorney in Orange County filed a class-action suit against Kia and Hyundai for not having immobilizers in their older-model vehicles. These immobilizers would prevent the engine from starting without an associated key fob present. The lack of these devices, the attorney argued, has made Kias and Hyundais easier to steal and popular targets for car thieves, who would then go on to use the stolen vehicles in other crimes. While the suit seems like an obvious case of blaming the victim instead of the criminal, it also highlights how the question of “who is to blame for automotive deficiencies” is far from settled.
The list of liabilities continues to expand as well. The National Highway Traffic Safety Administration (NHTSA) has only demanded more and more accountability from car manufacturers regarding auto safety regulations over the years. According to NHTSA (an arm of the Department of Transportation), all vehicles MUST include specific types of seatbelts, manufacturers MUST disclose the locations where all their parts are assembled (via the Labeling Act), they MUST follow all cybersecurity restrictions, and if a new safety recall should arise, the manufacturer MUST fix it at its own expense. Today, about one in four vehicles on the road has an unresolved safety recall, a share that has increased every year since the recall program’s inception. While some may say it’s a good thing to have that much oversight around safety, it also does a lot to discourage manufacturers from sticking their necks out for potentially unsafe innovations.
The EPA is also squeezing vehicle manufacturers with new regulations – tightening its emission standards and adding restrictions that car manufacturers find increasingly difficult to abide by. As David Shepardson from Reuters said,
New rules [that] take effect in the 2023 model year… require a 28.3% reduction in vehicle emissions through 2026. The rules will be challenging for automakers to meet, especially for Detroit’s Big Three automakers: General Motors (GM.N), Ford Motor Co (F.N) and Chrysler-parent Stellantis NV (STLA.MI).
With all this red tape, automotive manufacturers are already feeling the weight of big brother pressing on their shoulders and would be reluctant to go all in on self-driving vehicles without all the safety concerns rigorously tested and approved to the point they can be sufficiently indemnified from lawsuits.
Further, what’s going to happen when one of those fully autonomous self-driving cars accidentally runs someone over? Even if the manufacturers can’t be held accountable (unlikely), our lawsuit-heavy country will find someone to blame. And when it does, that someone is going to have to shell out big time. This opens the door for all sorts of unscrupulous activity. Folks will game the system, jumping in front of self-driving cars and pretending they were hit, hoping for a big payday. Eventually, the lawsuits will stack up until legislators attempt to settle the scope of responsibility for each participant, which will then be challenged, then appealed, until the question of who is responsible dissolves into the land of legalese, where few innovations ever come out alive.
Perhaps in another country with a more authoritarian government, the liability issues can be overcome. Or maybe self-driving cars can be controlled remotely by gig workers using VR or something (so someone is still driving the car, they’re just not doing it from the inside), and they’d be accountable. However, the dream of self-driving vehicles still faces a world of hurdles before it can become a reality in the USA, and that’s something eager investors should consider before opening up their checkbooks.
There’s a difference between the technology that’s possible and the technology that’s feasible. Perhaps self-driving cars will someday become possible from a purely technological standpoint. However, it’s far more likely we’ll continue to see them through the lens of a Hollywood movie rather than the lens of our own windshields.