BlueDogDiaries

They’re coming, we’re told daily, and the arrival horizon keeps shrinking from decades to years to months. And I need to talk about it; I need to talk about autonomous vehicles—or “self-driving” vehicles, if you prefer—and the reason for that is, at my urging, my father recently bought the closest thing currently available on the market: a Buick Envision. The guy’s 92. He’ll be 93 by the time you read this, and his skills behind the wheel remain respectable. They used to be brilliant, but, hey, he’s 93.

So a car that did a lot of the thinking for him seemed like the wise answer to the diminution of his driving alertness: automatic collision avoidance, blind-spot alerts, maintenance of following distance and lane position so he can’t drift across a line on the pavement, parking assist, voice controls, OnStar oversight and a bunch of other stuff that remains inscrutable but is, I assume, valuable. It has umpteen cameras and sensors.

And yet even as the self-driving vehicle hype machine hits high gear (autonomously, we assume), there remains a stubborn caveat in the pitch: “Hey, not to worry if something goes haywire. There’s still a human in the vehicle in case, you know, the technology’s not all it’s cracked up to be.” And it emphatically is not.

Therein lies the problem, and it’s a problem we have already encountered with the explosive growth of GPS guidance. As you may well have experienced yourself, the more you come to rely on the “infallibility” of these systems, the more your innate sense of direction and spatial orientation atrophy. The same goes for basic driving awareness and reflexes in autonomous vehicles, and the upshot is that when the human fallback the self-driving pitch assumes is actually called into play, the skills probably aren’t there anymore. You forget how to drive, just as you’ve forgotten how to read a map or navigate by the position of the sun. I’ve seen it. It happened to me.

After 3,000 miles of semi-autonomous driving in the high-zoot Buick Envision, I found that when I got behind the wheel of a more archaic vehicle, say, a 2005 Buick LeSabre, I instinctively continued to rely on the fail-safe functions I’d grown lazily comfortable with in the Envision: in matters of vehicle proximity, following distance and lane-position maintenance—especially while trying to figure out the functions on the console’s computer screen (never did)—or simply parking in a tight lot or parallel parking between two other vehicles.

That scares me. But not nearly as much as the possibilities of disaster inherent in the new fleets of self-driving tractor-trailers being rushed into service. The more nefarious and paranoid possibility is that a truck can be hacked. It’s been proven not just possible but actually fairly simple for the techies out there with mischief on their minds. Or the Russians, for that matter. And we’ve seen graphically in the last week how a jackknifed rig in a blizzard can cause extensive carnage and endless traffic snarls.

At this point we can assume that these recent calamities weren’t caused by hacked rigs. But there’s another, even likelier prospect that would bring the same result: self-driving semis are virtually blind in a blizzard, or a dense fog, or a sudden downpour, all of which demand the immediate attention of that vestigial human in the cab, you know, the one who forgot how to drive.

As if those weren’t red flags enough to slow the mad dash into autonomous vehicles, another factor—one that uniquely threatens motorcyclists, bicyclists, and pedestrians—is becoming an urgent hot-button issue.

It’s not a technological factor. It’s worse. It’s an ethical factor. Specifically, “The Lifeboat” ethical dilemma, which can be summed up tidily as: When is it OK to cannibalize your boat mates?

See, the decision-making algorithms of self-driving vehicles come with hard moral decisions already encoded. The MIT Technology Review has headlined the issue thusly: “Why Self-Driving Cars Must Be Programmed to Kill.” Calling these algorithms “a fiendishly complex moral maze,” the Review posits the following question:

“Is it acceptable for an autonomous vehicle to avoid a motorcycle by swerving into a wall, considering that the probability of survival is greater for the passenger of the car than for the rider of the motorcycle? Should different decisions be made when children are on board, since they both have a longer time ahead of them than adults, and had less agency in being in the car in the first place? If a manufacturer offers different versions of its moral algorithm, and a buyer knowingly chose one of them, is the buyer to blame for the harmful consequences of the algorithm’s decisions?”

Notice how a motorcycle is used in the example. Motorcycles are also used in an example published by Slate.com in June of this year, to wit:

“You’re cruising along in your car and the sun is shining. The windows are down, and your favorite song is playing on the radio. Suddenly, the truck in front of you stops without warning. As a result, you are faced with three, and only three, zero-sum options.

In your first option, you can rear-end the truck. You’re driving a big car with high safety ratings so you’ll only be slightly injured, and the truck’s driver will be fine. Alternatively, you can swerve to your left, striking a motorcyclist wearing a helmet. Or you can swerve to your right, again striking a motorcyclist who isn’t wearing a helmet. You’ll be fine whichever of these two options you choose, but the motorcyclist with the helmet will be badly hurt, and the helmetless rider’s injuries will be even more severe. What do you do? Now imagine your car is autonomous. What should it be programmed to choose?”

To that I would add: And what if you suddenly notice that one of the riders is your spouse? Will your car recognize that?

A truly thorny issue, no? Not for Mercedes-Benz, however; they have no ethical qualms at all, and have readily acknowledged that the decisions of the auto’s computer will exclusively prioritize the safety of the car’s passengers—at the expense of the safety of other players in the collision scenario.

So mow down a pack of a half-dozen bikers in your Benz, or risk harm to yourself, and maybe a passenger? It’s literally a no-brainer. But then, hell, you bought the car. It’s yours. It wasn’t cheap. And all the bikers bought was the farm.
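
For the technically inclined, here’s a bare-bones sketch of what an occupant-first “moral algorithm” might reduce to, using the Slate scenario above. Fair warning: the options, the injury scores, and the tie-breaking rule are all my own invention for illustration; no manufacturer has published its actual code.

```python
# A deliberately crude sketch of an "occupant-first" crash-choice policy.
# Scenario options and injury scores are invented for illustration only;
# this is not any manufacturer's actual algorithm.

from dataclasses import dataclass

@dataclass
class Option:
    name: str
    occupant_harm: float  # harm to the car's own passengers (made-up 0-10 scale)
    other_harm: float     # harm to everyone else, same made-up scale

# The Slate scenario, scored on that invented scale
options = [
    Option("rear-end the truck",           occupant_harm=2, other_harm=0),
    Option("swerve left (helmeted rider)", occupant_harm=0, other_harm=7),
    Option("swerve right (no helmet)",     occupant_harm=0, other_harm=9),
]

def occupant_first(options):
    # Minimize harm to the car's own passengers first; harm to
    # everyone else only matters as a tie-breaker.
    return min(options, key=lambda o: (o.occupant_harm, o.other_harm))

print(occupant_first(options).name)
# -> swerve left (helmeted rider)
```

Run it and the “car” chooses to hit a motorcyclist rather than accept a fender-bender for its own occupants, and between the two riders it picks the one wearing the helmet, penalizing, in effect, the rider who did the responsible thing.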

It’s all right here in the diaries. 
