Self-Driving Cars are Self-Driving Bullets
Why even Elon Musk has been forced to admit that navigating streets is hard
A car is just a slow-moving bullet with a stereo system.
When we drive a car, we typically think our main task is navigating from point A to point B. But mostly what we’re doing is trying to keep from killing someone. That is Job One. Everything else is secondary. If you were to get into a car and fail to get from point A to point B, that would suck. But if you were to kill someone, that would be orders of magnitude worse.
So 99% of what you’re doing when you’re behind the wheel of a car is attempting to not commit homicide.
This is a useful point to keep in mind whenever you read about the imminent arrival of “self-driving cars”. Because when tech folks tell you they’re building a self-driving car, what they’re really promising is to make a self-driving bullet that can weave through city streets without hitting anyone.
Kind of clarifies the stakes, doesn’t it?
Indeed, this is why tech executives have been so chastened by the challenge. They don’t like to admit defeat. But cars and roads are an environment where they cannot bluster and PowerPoint their way out of mistakes, because this time their errors quite directly injure people, with the unforgiving physics of hurtling two-ton chunks of steel.
I thought of this when I read Elon Musk’s tweet last weekend about Tesla’s FSD, their “full self-driving” software …
“Didn’t expect it to be so hard”: There’s an epitaph you could chisel on the tombstone of self-driving car hype.
The buzz started in 2005 when a Stanford team won the DARPA Grand Challenge, creating a vehicle that drove itself over 132 miles of desert. Over the next decade, companies from Google/Waymo to Uber to Tesla and old-school automakers poured boatloads of R&D money into the cause.