Monday, March 1, 2010

Terrors of Complexity

It's sometime in the 70's and I am driving down out of the Santa Cruz mountains with my '67 Ford Galaxie filled to capacity with friends. As we run the switchbacks towards Los Gatos, I notice my brakes getting weaker, and I can smell the acrid odor of imminent disaster in the form of toasted brake linings.

When the city streets come into view I am standing on the brakes with both feet, praying I will stop the car before we ram through the barriers and into the crowded art festival filling the streets. Stunned passengers are eerily silent. I am the only one screaming. I press with all my strength (questionable) and weight (considerable), my head screwed into the lining of the car top for maximum leverage, and we slowly come to a stop just inches from the barrier. No immediate harm done, but if I only live to 80 instead of 90, this was why.

A near disaster - now almost totally removed from possibility by technology. In the same situation, modern braking systems would never fade or falter; they would let me stop without any drama or risk to innocent art geeks.

But can technology create risks as well as remove them?


Let's imagine I am tooling along Sunset Boulevard in a 2010 Prius. A woman runs across the road towards a bus she doesn't want to miss, unaware of my semi-electric approach. My eagle eyes spot her, though, and my panther-like reflexes stomp the brake pedal in plenty of time. But just then I hit a bumpy section of tarmac, something goes ga-ga in the electronic brain, and the car's brakes stop braking. Just for a moment before stoppage resumes, but long enough to flood both me and the unscathed but now wide-eyed pedestrian with an unwanted dose of adrenaline.

Now I don't own a 2010 Prius, just a less-trick but apparently safer 2006 model, but the scenario described can happen, according to the latest recall news. And this woe is in addition to the unintended acceleration issues that some believe are also due to software glitches and not just floor mats or poorly shaped gas pedals.
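
To make the failure mode concrete, here is a toy sketch in Python - emphatically not Toyota's actual control code, with function names and every number invented for illustration - of how a hand-off between regenerative and hydraulic braking could briefly leave the driver with less stopping power than the pedal is asking for:

    # Toy model of blended braking. Regenerative braking cuts out the moment
    # wheel slip is detected (say, from a bump), and hydraulic pressure takes
    # a few control cycles to make up the difference. All numbers are made up.
    def total_brake_force(pedal, cycles_since_slip):
        regen = 0.6 * pedal if cycles_since_slip >= 3 else 0.0
        needed = pedal - regen
        # Hydraulic pressure ramps up over a few cycles rather than instantly.
        hydraulic = min(needed, 0.25 * pedal * cycles_since_slip)
        return regen + hydraulic

    cycles_since_slip = 10
    for cycle in range(8):
        if cycle == 2:  # hit the bumpy tarmac: a momentary wheel-slip signal
            cycles_since_slip = 0
        force = total_brake_force(pedal=1.0, cycles_since_slip=cycles_since_slip)
        print(f"cycle {cycle}: braking force {force:.2f}")
        cycles_since_slip += 1

Run it and the braking force drops to zero for a couple of cycles right after the bump, then climbs back - exactly the kind of momentary gap that feels like an eternity when a pedestrian is crossing in front of you.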

I suppose these problems are inevitable results of progress. After all, it's hard to remove all the bugs from complex software, and it's equally hard to test every scenario in which that software will be used. Computer systems experts have been working on this for decades and have yet to reach perfection - if perfection can be reached at all. The best we have are quality standards and practices that reduce the likelihood of problems, but don't eliminate them completely.

Which is a real drag, since computers can make ever more capable vehicles safer and easier to use. Just look at modern racing motorcycles, which, through sensors and software controls, let their riders go beyond previous limits of traction at full lean, with reduced risk of 'high-siding' themselves into low-earth orbit (although it can still happen). And consider military jets, whose control surfaces are so complex and whose dynamics so unstable that they couldn't be flown at all without computer aids.

By turning ultimate control of our vehicles over to computers, we become safer by giving away our ability to do something stupid. We can't command planes, cars, or motorcycles to exceed safe limits, because sensors and software will detect our dangerously excessive inputs and limit or block them. But at the same time we give up some of our ability to control our own destiny. And we put our lives in the hands of computers and software made by our fellow, fallible humans.
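
Here's what that looks like in miniature - a toy, traction-control-style throttle clamp, with thresholds and gains I simply made up, nothing like the real algorithms on a racing motorcycle:

    # Toy throttle clamp in the spirit of traction control: if the rear wheel
    # spins too much faster than the front, the software, not the rider,
    # decides how much throttle actually reaches the engine.
    def limited_throttle(rider_throttle, front_speed, rear_speed, max_slip=0.10):
        if front_speed <= 0:
            return rider_throttle
        slip = (rear_speed - front_speed) / front_speed
        if slip <= max_slip:
            return rider_throttle  # the rider's command passes through untouched
        excess = slip - max_slip
        # Cut throttle in proportion to the excess wheel spin.
        return max(0.0, rider_throttle * (1.0 - 4.0 * excess))

    # Full throttle at full lean, rear wheel turning 20% faster than the front:
    print(limited_throttle(1.0, front_speed=40.0, rear_speed=48.0))  # -> 0.6

The rider asks for everything and the computer hands the tire sixty percent - which is usually exactly the stupidity-prevention we want.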

Complex technology can prevent frightening incidents like my brakeless ride downhill towards that 1970s art festival, but it can be a terror of its own when things go wrong.
