Problems and Pitfalls in Self-Driving Cars

I love cars, but not as much as lawyers do -- and lawyers especially must dream of loving the emerging class of self-driving cars.

Think about it. If you have an accident in a self-driving car, the fault lies with the car company, not with any driver. This simple fact must be causing huge joy in Lawyerville. The prospect of filing suit against big car companies, dealerships, and insurance companies has to be an emerging profit center for lawyers and their clients.

Why do we need self-driving cars? People have been driving cars for 120 years or more, and yes, drivers make mistakes and have accidents. But accidents also happen because of mechanical issues -- for example, a blown tire, bad brakes, locked steering, engine failure, transmission breakdown, electrical system failure, or a slippery roadway. Are self-driving cars going to be able to cope with mechanical breakdowns? No one knows yet. Add to that software bugs and errors, and the risk of your car being hacked. Cars won't be any more secure than any other computer system, and all self-driving models will be Internet-connected for a host of reasons. So sending them erroneous instructions should be a piece of cake for any hacker over eight years old -- to say nothing of the professional hackers, criminals, antagonistic governments, and terrorists who want to cause chaos in America.

But still the question lingers -- why do we need self-driving cars?

One argument for them is that a self-driving car means less wear and tear on drivers, especially emotional wear and tear. Will there be less road rage, fewer anxiety attacks, and less exhaustion in a car that drives itself? The answer is a qualified "yes" -- if you, the driver, trust Emil, the self-driving software that is running your car trip. (I decided to call my future self-driving software Emil. You can substitute your favorite name.) Is Emil up to snuff? Does Emil understand the threats popping up "out there" in the real world? Is Emil reliable? Either way, opinion polls show that people are afraid of self-driving cars.

Another argument is that self-driving cars will be safer because the software is programmed to be safe. People run red lights, jump stop signs, make illegal U-turns, don't always stop for pedestrians, and drive in lanes where they are prohibited. A smart self-driving car will not commit any of these sins and offenses, making driving far safer.

I believe the argument for safety has some merit, but these features could be incorporated into cars that are otherwise still driven by people, not machines. For example, software that "sees" a red light through sensors can prevent a car from advancing through the red signal zone. And there is already software that will apply your brakes if there is an obstacle, including people, in the roadway. These safety systems, and more that will come on the scene in the future, are a great technology application for two reasons: they prevent accidents, and they teach drivers safe driving habits.
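To make the automatic-braking idea concrete, here is a minimal sketch, in Python, of the kind of rule such a system might apply: brake when the obstacle ahead is closer than the distance needed to stop. The function name, the numbers, and the constant-deceleration model are illustrative assumptions, not any manufacturer's actual logic.

    # Illustrative sketch only -- a simple "brake if we cannot stop in time" rule.
    # All names and values are hypothetical, not taken from any real system.
    def should_brake(speed_mps, obstacle_distance_m,
                     reaction_time_s=0.5, max_decel_mps2=7.0):
        """True if the car cannot stop before the obstacle unless it brakes now."""
        # Distance covered while the system reacts, plus braking distance v^2 / (2a).
        stopping_distance = (speed_mps * reaction_time_s
                             + speed_mps ** 2 / (2 * max_decel_mps2))
        return obstacle_distance_m <= stopping_distance

    # At 25 m/s (about 56 mph), an obstacle 50 m ahead is already inside the
    # roughly 57 m stopping distance, so the system would brake.
    print(should_brake(25.0, 50.0))   # True

The same check works whether a human or a machine is steering, which is the point: the safety feature does not require the whole car to drive itself.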

Improvements can always be made to safety systems. Some new cars are equipped with blind spot monitors, usually installed in the side mirrors. These are sophisticated sensors that see where the driver cannot. Right now, most of them blink a warning in the mirror which, if the driver looks and reacts, can help prevent an accident. But suppose the blind spot detector is improved so you can't steer into a car in your blind spot. That would add a form of autonomous security -- a step forward that is entirely within reach, since all the technology components already exist and are inexpensive. (Incidentally, the special sensors in blind spot monitors were, and may still be, controlled when used in military equipment under the U.S. munitions laws known as ITAR.)
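Here is one way the "can't steer into an occupied blind spot" idea might look in logic form -- a hedged sketch with invented signal names, not a description of any shipping system.

    # Hypothetical sketch of steering intervention based on blind spot sensors.
    # Signal names and the simple pass-through behavior are assumptions.
    def filter_steering(requested_angle_deg, left_occupied, right_occupied):
        """Suppress a steering input toward an occupied blind spot;
        pass every other input through unchanged."""
        if requested_angle_deg < 0 and left_occupied:    # negative = steer left
            return 0.0   # hold the current lane instead of moving left
        if requested_angle_deg > 0 and right_occupied:   # positive = steer right
            return 0.0   # hold the current lane instead of moving right
        return requested_angle_deg

    # A driver drifting right while a car sits in the right blind spot gets no
    # steering change; the same input with a clear lane goes through.
    print(filter_steering(3.0, False, True))    # 0.0
    print(filter_steering(3.0, False, False))   # 3.0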

Another argument for self-driving cars is that they can automatically be rerouted to a destination around impediments such as construction, accidents, or congestion. Many GPS devices already have this feature, and WAZE -- a community-based traffic and navigation application for smartphones -- can give drivers nearly real-time advice. WAZE and WAZE-like applications can be integrated into self-driving systems, where the rerouting choice is made by the self-driving system and not the driver. But is this a good idea? It is hard to say with certainty, but one thing that has been clearly learned is that WAZE and other GPS traffic-monitoring systems can be spoofed. This means false information can redirect traffic or cause major problems on roadways. Spoofing can fool humans, too: irate homeowners in Los Angeles spoofed WAZE to move traffic away from their neighborhoods.

Shir Yadid and Meital Ben-Sinai, "fourth-year students at Technion-Israel Institute of Technology, hacked the incredibly popular WAZE GPS map, an Israeli-made smartphone app that provides directions and alerts drivers to traffic and accidents. The students created a virtual traffic jam to show how malicious hackers might create a real one." Even the police, angry over WAZE users fingering hidden police cars to help drivers avoid speeding tickets, have spoofed WAZE with phony sightings of police cruisers. So while a WAZE-type solution may be handy, it has definite pitfalls. In an automated system, the driver might find himself directed into a river or the wrong way down a one-way road.
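If crowd-sourced traffic reports are going to steer a car automatically, the routing software would presumably need some defense against spoofed reports. Below is a minimal sketch of one plausible guard -- accept a reported jam only when several independent reporters agree -- using invented field names and thresholds; it is not how WAZE or any real system actually works.

    # Hypothetical guard against spoofed congestion reports: require corroboration
    # from several distinct reporters before letting the report trigger a reroute.
    # The field name ("reporter_id") and the threshold are illustrative assumptions.
    def accept_congestion_report(reports, min_independent_sources=3):
        """Return True only if enough distinct reporters describe the same jam."""
        distinct_reporters = {r["reporter_id"] for r in reports}
        return len(distinct_reporters) >= min_independent_sources

    # Two reports from the same account (a likely spoof) are rejected;
    # three reports from different accounts are accepted.
    print(accept_congestion_report([{"reporter_id": "a"}, {"reporter_id": "a"}]))   # False
    print(accept_congestion_report([{"reporter_id": "a"}, {"reporter_id": "b"},
                                    {"reporter_id": "c"}]))                         # True

Of course, a determined spoofer can fake multiple reporters, so a check like this is only a partial defense -- which is exactly why handing the routing decision entirely to the machine is worrisome.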

For some time, there has been an argument that traffic on highly congested roadways can be eased, or at least smoothed out, if car speeds and the distances between vehicles can be managed. Some highways today have sequencing lights (ramp meters) at entrances, designed to space out cars before they merge. But if the roads are already tightly jammed, these systems have little value and just agitate people even more.

The idea of sequencing on the roadways is based on the often-observed phenomenon that congestion happens in "waves" and that once you break out of a wave there is "blue sky" ahead. Most of us have noticed this: forced to slow from 65 mph to a crawl, then spending ten minutes in stop-and-go driving, and then all of a sudden everything opens up. There is no accident to see and no visible explanation for the tie-up. It happens because drivers tend to squeeze their vehicles together in clumps, where each car in turn has to go a little slower than the one ahead, and these clumps start slowdowns. If you could de-clump the vehicles, driving would be easier and faster, with far less braking.
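The clumping effect is easy to reproduce in a toy simulation. The sketch below, with made-up parameters and a deliberately crude follow-the-leader rule, shows one brief slowdown by the lead car rippling backward through the cars behind it even after the lead car has resumed full speed -- the "wave" described above.

    # Toy car-following simulation of a traffic wave. All parameters are invented
    # for illustration; this is not a validated traffic model.
    def simulate(n_cars=8, steps=40, dt=0.5, desired_speed=30.0, safe_gap=25.0):
        positions = [-(i * 40.0) for i in range(n_cars)]   # car 0 leads, 40 m apart
        speeds = [desired_speed] * n_cars
        for t in range(steps):
            for i in range(n_cars):
                if i == 0:
                    # The lead car brakes hard for a few seconds, then resumes speed.
                    speeds[0] = 5.0 if 10 <= t < 16 else desired_speed
                else:
                    gap = positions[i - 1] - positions[i]
                    # Followers slow in proportion to how far the gap ahead
                    # has shrunk below the safe gap.
                    speeds[i] = min(desired_speed,
                                    max(0.0, desired_speed * gap / safe_gap))
            for i in range(n_cars):
                positions[i] += speeds[i] * dt
            print(t, [round(s) for s in speeds])   # watch the slowdown move backward

    simulate()

Running it shows each car in turn dropping to a crawl a few steps after the car ahead of it did, long after the original cause has disappeared -- the same invisible tie-up drivers experience on real highways.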

Sequencing can best be done if all cars have sequencing systems, which will probably need to be linked to high-accuracy GPS monitoring capable of reading individual car speeds and anticipating clumping. The civilian U.S. GPS signal is deliberately coarser than the military one, and in its present form it is not accurate enough for vehicle sequencing. The European Galileo satellite system does have the needed accuracy (down to a few inches), thanks to a special passive hydrogen maser atomic clock, but Galileo's high-accuracy service is not free, and the constellation is not yet fully deployed.

Without active sequencing, the traffic-efficiency gains promised for self-driving cars are unlikely to be achieved in the next decade or two. That does not mean that some of the derivative technology can't be used in standard vehicles: even real-time advice on the right speed to maintain to avoid creating traffic jams would be a good step (provided that everyone had it and used it, which is an educational as well as a technological problem).
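As a rough illustration of what such speed advice might amount to, here is a small sketch of one possible advisory rule: nudge the follower's speed up or down so its gap to the car ahead relaxes toward a target spacing over a chosen horizon. The rule, the parameter values, and the function name are assumptions made for illustration, not any deployed standard.

    # Hypothetical speed-advisory rule for de-clumping traffic. Parameters and
    # the simple proportional correction are illustrative assumptions.
    def advisory_speed(lead_speed_mps, gap_m, target_gap_m=30.0,
                       horizon_s=10.0, max_speed_mps=30.0):
        """Speed that closes (or opens) the gap error over roughly horizon_s."""
        correction = (gap_m - target_gap_m) / horizon_s
        return max(0.0, min(max_speed_mps, lead_speed_mps + correction))

    # A car 50 m behind a 25 m/s leader is advised to run slightly faster (27 m/s)
    # until the gap shrinks to the 30 m target; a car only 20 m behind is advised
    # to ease off to 24 m/s.
    print(advisory_speed(25.0, 50.0))   # 27.0
    print(advisory_speed(25.0, 20.0))   # 24.0

The catch, as noted above, is that advice like this only smooths traffic if nearly everyone follows it, which is as much a problem of driver education as of technology.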

Would a self-driving car free up the former driver to do some work while driving? Perhaps, if the former driver were not nervous and not constantly scanning the horizon. But it might also take away great pleasures that any driver can enjoy now: listening to music or podcasts and keeping an eye on the outside world. One of the amazing things about today's drivers is how instinctively they multitask, sometimes at the risk of their lives (as in texting or talking on cell phones). But there is good multitasking and bad. The good multitasking is for the driver's mind to have free time to think and wonder. Will we lose that with self-driving cars? The jury is out.
