Autonomous Driving Is A Pipe Dream

We have come a long way since the days of the first DARPA Grand Challenge. Now every car manufacturer, technology giant and countless start-ups are building driverless cars and putting them on the road in Nevada, California, Switzerland, Singapore and, more recently, in Pittsburgh for public use.

Many companies and cities are following their lead. Tesla will reach level 4 in 2018, they say. Ford has announced it will have a level-4 driverless car ready in high volumes by 2021.

At this point, it seems to be an unstoppable, relentless and unreflective hype.

And it is also a pipe dream.

This is a very dangerous situation. We have a clear and probably deliberate misunderstanding between engineers and computer scientists on the one side, and policy makers, investors, managers and the public on the other.

The engineering problem at this stage of development is becoming almost uninteresting. We have sufficient computing power within the limits of space, power consumption and processing time; we have every conceivable type of multi-dimensional optical and electro-magnetic sensor; we have highly precise maps and ubiquitous wireless network bandwidth to connect everything to everything and to share and use every piece of information. Building one autonomous system is basically a somewhat more expensive game of LEGO® MINDSTORMS®.

Engineers don’t care about politics. Politicians and managers don’t care about engineering, as long as it is “Done.”

Nobody gives a … … cares about a system of different autonomous vehicles. Different in algorithms, protocols, evolution, capability, age, compute capacity, failure rate, reliability or any other property that defines their behavior.

What’s more, engineers don’t care about failure in a human sense. To them, failure is “learning” and, in the best case, “Do it better next time.”

Engineers care about excellent solutions for general cases. Failures under special circumstances rarely receive much attention.

This is not comforting for people whose lives depend on how these failures turn out.

With autonomous vehicles on public roads today, the burden of proof is shifted onto the analog controller. First, it is the driver’s mistake. Next time, it will be the fault of the bicycle rider, the pedestrian, the raccoon.

Autonomous driving is inextricably intertwined with Artificial Intelligence. Which is the great pipe dream of our generation.

Nobody cares either.

Criticism of autonomous driving is not welcome at this time. People cover their ears and go “la la la la”. And they cover their eyes, too.

But why do we start with the hard path? Driverless cars on public roads? Where are, at scale, the autonomous airplanes (no, drones are human-controlled) and the autonomous trains (other than short-track airport shuttles and some urban subways)? What does a train driver do? Observe the track. Accelerate and brake at signals. Even in an emergency, the stopping distance is measured in miles. An algorithm can do that. These environments, with limited options and predictable conflicts, are much better suited to our automation’s actual capabilities than public roads. And they are just as important for public safety. Anything in this area?

Not there yet.
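
To make that concrete, here is a minimal, hypothetical sketch of the “observe the track, accelerate and brake at signals” loop described above. The signal aspects, speed limits and braking rates are made-up illustrative numbers, not values from any real train-control system.

```python
# A deliberately naive sketch of signal-following speed control for a train.
# All numbers below are invented for illustration.

from dataclasses import dataclass

# Permitted speed (m/s) for each signal aspect the controller may observe.
SPEED_LIMIT = {"green": 44.0, "yellow": 22.0, "red": 0.0}

@dataclass
class TrainState:
    speed: float              # current speed in m/s
    max_accel: float = 0.5    # service acceleration, m/s^2 (assumed)
    max_brake: float = 0.7    # service braking, m/s^2 (assumed)

def control_step(state: TrainState, signal_aspect: str, dt: float = 1.0) -> TrainState:
    """One control cycle: compare the current speed to the signalled limit
    and accelerate or brake toward it within the train's limits."""
    target = SPEED_LIMIT[signal_aspect]
    if state.speed < target:
        new_speed = min(target, state.speed + state.max_accel * dt)
    else:
        new_speed = max(target, state.speed - state.max_brake * dt)
    return TrainState(new_speed, state.max_accel, state.max_brake)

# Example: a train at line speed sees a yellow signal, then a red one.
train = TrainState(speed=44.0)
for aspect in ["yellow"] * 40 + ["red"] * 80:
    train = control_step(train, aspect)
print(f"final speed: {train.speed:.1f} m/s")  # eventually 0.0
```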

Algorithms are developed or, if you want, trained, to behave deterministically from input to output. Pre-condition, invariant, post-condition. In a sense, algorithms do not have a choice.
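
As a concrete illustration of “pre-condition, invariant, post-condition”, here is a minimal sketch using plain Python assertions. The function is deliberately trivial and invented for this post; it only shows that, once the contract is fixed, the algorithm has no choice: the same input must produce the same output.

```python
from __future__ import annotations

def sorted_insert(values: list[int], x: int) -> list[int]:
    # Pre-condition: the input list is already sorted.
    assert all(values[i] <= values[i + 1] for i in range(len(values) - 1))

    result = []
    inserted = False
    for v in values:
        if not inserted and x <= v:
            result.append(x)
            inserted = True
        result.append(v)
        # Invariant: the partial result built so far stays sorted.
        assert all(result[i] <= result[i + 1] for i in range(len(result) - 1))
    if not inserted:
        result.append(x)

    # Post-condition: the output is sorted and has exactly one more element.
    assert all(result[i] <= result[i + 1] for i in range(len(result) - 1))
    assert len(result) == len(values) + 1
    return result

print(sorted_insert([1, 3, 7], 5))  # [1, 3, 5, 7] -- same input, same output, every time
```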

Of course, we can relax that. But who will be the judge of that? Today’s artificial intelligence systems are rather complex, and it usually takes more than a dozen engineers and a couple of weeks to troubleshoot and understand a single failure. That may not scale when millions of autonomous Tesla and Ford vehicles hit the roads in 2018 and 2021.

Every driver has to go through driving school, which ends with a certification. Why, again, can the engineers put systems on the road that are not certified?

Do those systems pass the test? If even engineers have trouble understanding their system, how would public officials perform a certification? Do the systems have to follow the same rules as we do? If not, why not? And does each software update or system reconfiguration require re-certification, as regulation demands today for the devices and systems that guard our health in the hospital or produce our meds and food?

The answer is: currently, not at all. So, like many a prestigious website, autonomous driving will forever be in “beta”. But this time, we may lose more than just some information or some bucks.

So, in the end, we may not get the promised safety. Systems fail, and it would be A New World if technology suddenly stopped failing just for autonomous vehicles. Gee, why didn’t they come up with that earlier!

If, however, autonomous cars satisfied all of today’s safety regulations, and even if those regulations were precise, accurate and generally sound for all the situations where humans today use “judgment”, driving them would be an exercise in patience and humility. Will riders, and more importantly buyers, really want to do that? Give way to THOSE PEOPLE, with their smaller and maybe un-automated clunkers? Wait until grandma has crossed the road? Do we allow the rider to override the car’s safety measures for personal gain? What about the owner? What about the data the car collects? Who owns it, and who has the right to delete it?

The solution seems to be that autonomous cars are not sold but rented and shared, a single ride booked through Uber or perhaps as the result of a Google search for the next restaurant. Not everybody needs a car, then. This means that, to break even on decades of research, on the ongoing hardware and software development and, finally, on production, cars have to become an order of magnitude more expensive than today, and so will taxi fares, rental costs and transportation overall. This does not sound like an incentive to switch. Or even to invest.

So the story is: automated vehicles will not make us safer in general, but with billions of dollars sunk and, finally, everything else banned from most roads, we either stay put, walk, pay up for every inch to use one, or just pay the fine for being hit by one.
