A Point of View: The ethics of the driverless car
Driverless cars are being heralded as the answer to all our motoring problems. But long-term backseat driver Adam Gopnik has a few moral questions to raise.
I do not know how to drive a car.
There - it's out. In Britain, I think this is merely a little unusual. In the US, it is positively shaming. People give you strange looks when you confess this, as though you had confessed to not being able to perform some other, wholly natural function.
Like all people with a guilty secret, I have a perfectly good explanation. I grew up within a couple of blocks of the university where my pedestrian parents both taught, and I eventually went to school there, and then right out of university I went to New York, where no one has a car, and have lived here ever since (plus a few years in Paris, where no one in their right mind would try and drive).
My wife, fortunately, grew up in a Canadian suburb and learned to drive there. She is a wonderful driver, and when we go up to Cape Cod in August for our annual three weeks by the beach, she drives the family up, and then around. And the sad truth is that by now no one wants me to drive a car - my reflexes are too aberrant, my tendency to daydream too marked. My 14-year-old daughter is firm: "I'm never getting in a car if you're driving," she says grimly. "You would be thinking about something you're writing, and then bang, it's over for us all."
But the blow to my masculinity is real. I sense that I am, even in this properly post-feminist age, in the wrong seat. Not the one (the right front in your country, the left front in ours) where generations of fathers have sat, pressing down on pedals, and cursing the competition on the road. Instead, I occupy the traditional mother's seat and fill her role - shushing the children when the driver is tired, or changing the music on the radio as the one listenable station fades out into static.
I feel, I'm afraid, the insult to my masculinity so much that when a cop or a garage attendant approaches the car and gives me what I take to be a slightly puzzled, pitying look, I immediately slouch down and scowl resentfully in an impressive impersonation of a veteran driver whose licence has been taken away after a lifetime of high-speed, recklessly entertaining, "Dukes of Hazzard"-style driving.
"Cursing the competition?" my wife just said, reading over my shoulder.
"The other cars on the road aren't competitive. And is that why you get the weird look on your face? I can't believe that your concept of masculinity involves that much petty vanity and pointless displays of competitive ego in some... self-invented contest," she concludes - not seeing that if it were not for petty vanity and pointless displays of competitive ego, mostly in meaningless self-invented contests, we would have no concept of masculinity at all.
So you can easily imagine how excited I was when I first read that Google, the great, good search engine company out west, is many years, and many hundreds of millions of dollars, into the process of developing, road-testing and, some day soon, selling the thing in life I most desire - the self-driving car. And Google isn't alone in the pursuit. Many companies are engaged in it. You will programme your destination when you set out, and the car will do the rest, even on the busiest motorway - find the exit, make the turn, maintain the speed, avoid the... well, the competition, and turn the fog lights on to penetrate the mist.
You can sit behind the wheel, if you like, and pantomime the act of driving, but the car will do all the work itself. Since self-driving cars never get tired, drunk, or distracted by their husbands trying to find a decent jazz station on the radio, Google and the other companies promise to bring road fatalities down to near-zero.
There is a problem, though, I've discovered, reading eagerly on. It is that human drivers are engaged every day not just in navigating roads, but also in making ethical decisions as they drive, and these too will somehow have to be programmed into the software of the self-driving car. Each self-driving car will have to have its own ethical engine.
Drivers, for instance, know that it is right to swerve to avoid an animal racing across the road, though not at any risk to their passengers. But they are also prepared to take a little more risk with the passengers to avoid a cat or a dog - which we instantly recognise as pets with human owners - than to avoid, say, a squirrel or a raccoon.
Even graver ethical choices, often studied by philosophers and psychologists, regularly arise. What to do when faced with a choice between, say, mowing down a couple of bystanders and ploughing into a school bus packed with children? We compute these ethical costs and choices in an eye blink, and not just the choices but the moral reasoning behind them would have to be programmed into the self-driving car. And should there be a different module that switches on if the bus is packed not with children but with, say, ailing nonagenarians from a nearby hospice? And there are even simpler but still real ethical dilemmas that human drivers understand - say, that a speed limit of 50mph (80 km/h) on a fine day is really 60mph (96 km/h), while on a wet and foggy day, really 45mph (72 km/h). How do we programme this kind of flexibility into a machine?
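To see how crude even the simplest case looks once it is written down, here is a purely illustrative sketch in Python - my own invention for this column, not anything Google, Volvo or the EU has published - of that "flexible speed limit" rule. The function name, the weather labels and the multipliers are all made up for the example.

```python
# Illustrative only: a toy encoding of the "flexible speed limit" a human
# driver applies without thinking. The function name, labels and multipliers
# are invented for this example, not taken from any real system.

def effective_speed_limit(posted_limit_mph: float, conditions: str) -> float:
    """Return the speed a human driver actually treats as the limit."""
    adjustments = {
        "fine": 1.2,            # a fine day: 50mph quietly becomes 60mph
        "wet_and_foggy": 0.9,   # wet and foggy: 50mph really means 45mph
    }
    factor = adjustments.get(conditions, 1.0)  # anything else: just obey the sign
    return posted_limit_mph * factor

print(effective_speed_limit(50, "fine"))           # 60.0
print(effective_speed_limit(50, "wet_and_foggy"))  # 45.0
```

Even this trivial table begs the harder question: someone, somewhere, has to decide what goes into it.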
It will not surprise the euro-sceptics among you that the European Union, in its own parallel self-driving car programme, is trying to solve this dilemma through a system of bureaucratically imposed obedience. What is called, almost unbelievably, the Sartre project - a joint research mission between the EU and, among others, Ricardo UK and Volvo - works on the convoy or "road train" model: a single truck with a human driver leading the way and up to five computerised self-drive cars following sheep-like behind. "Because they're all taking the same orders," the engineer explains, "the cars can travel just a few metres apart." Sartre is an acronym for "Safe Road Trains for the Environment", but it is a perfect tribute to the great French philosopher, who ran his own ethical cafe-convoy, leading his zombie-like followers from absurdity to absurdity over many decades.
But why only Sartre? It occurs to me that, given the huge market for customised niche products these days, there should be a variety of ethical engines to install in your self-driving car. There would be many ethical apps to develop and download into the software of your self-driving Volvo. You could choose, say, a Nietzschean engine, which would drive right over everything - why not? God is dead anyway. Or the Albert Camus model, which would stall and pause in the middle of the highway while the traffic backs up behind - and then suddenly shoot off, bang, because the existential leap must be made, and some pedal struck.
There would be an Ayn Rand model ethical engine, named after the Russian-American free market fanatic, which would use chip technology to scan the bank account of each pedestrian, calculating their net worth, swerving to miss the makers, and mowing down a taker or two - who needs 'em? And there would be its technical relation, the Richard Dawkins model, which would use portable MRIs to heat-seek and discover which pedestrians you distantly share genes with, while steering you directly into the ones who are, alas, no relation. There could even be a Woody Allen ethical engine, which would start apologising as you press on the gas, and continue all the way home, and a Ludwig Wittgenstein model, which would announce wearily that there is no motor in the car anyway - all there is, is the activity of driving.
Yet the one thing that all philosophers and engineers are agreed on is that nothing is yet nearly as good, as flexible, as vigilant - not to mention as perpetually self-justifying - at these things as people are. We are our own best ethical engines. And who more expert than those of us, that small persecuted class, the non-drivers, who have been watching the road without the distraction of actual driving for years?
And here, I realise, is where I could really cash in. Instead of developing those ethical apps, I could become one myself. I will hire myself out as a full-time, on-call ethical chauffeur, the moral rule-maker within your self-driving car. I will sit behind the wheel, just like a real driver, but making philosophical judgments rather than right turns - this raccoon lives, this bug dies, miss the school bus, run over these oldsters. I might even enforce more aesthetic ethical injunctions - say, to stop at every lookout on a scenic road, simply to admire the view.
There I will be, at last - right front or left front, depending on the country. For the first time, the guy inside, clutching the wheel - promoting the beautiful, saving the vulnerable, dooming the deserving, almost like a God... almost, for that matter, well, almost, like a man.