On Thursday The New York Times Magazine published a big story about Tesla's self-driving technology — the unreliable, sometimes deadly autonomous driving system — and what the technology's successes and failures can reveal about Elon Musk. Which, up front, I have to say strikes me as a strange framing: Of all the questions raised by turning unproven autonomous vehicles loose on public roads, where they show an alarming interest in mowing down pedestrians and plowing into things, "the personality and character of Elon Musk" is surely among the least important, and the least interesting.
For all that, though, it's a fun read. The journalist, Christopher Cox, goes on a surreal and darkly funny ride-along with some sweaty Tesla enthusiasts in California, whose evangelism for the life-changing promise of self-driving cars is occasionally complicated by the urgent need to wrestle control back from the self-driving car before it causes carnage.
A minute later, the car warned Key to keep his hands on the wheel and his eyes on the road. "Tesla's kind of a nanny about that now," he complained. If Autopilot had once been dangerously permissive of inattentive drivers — allowing them to doze off behind the wheel, even — that flaw, like the stationary-object flaw, had been fixed. "Between the steering wheel and the eye tracking, that's a solved problem," Key said.
[…]
Afterward, Key told FSD to take us back to the cafe. But as we began a left turn, the steering wheel jerked and the car braked hard. Key muttered nervously, "Okay …"
A moment later, the car stopped in the middle of the road. A line of cars was bearing down on us. Key hesitated again but then quickly took over and completed the turn. "It probably could have made it," he said, "but I wasn't willing to chance it." If he had been wrong, of course, there was a good chance he would have had his second A.I.-involved crash on the same short stretch of road.
There's also some good reporting in there about Tesla's and Musk's shifting habits of talking about what the company's cars can do — Musk being, by light-years, the world's least careful promiser — versus what those cars have actually done. In particular, Tesla compares its technology's crash statistics to those of human-driven cars in ways that seem designed to mislead, propping up false claims that its AI drives better and more safely than humans do. It's bad.
The Times story raises questions of utilitarianism and risk-reward calculation. You can see how it gets there: Just about the only thing Elon Musk does with any consistency is retreat into half-baked utilitarianism and messianic futurism when confronted with his cruelty to others, or with his cars killing people. The fact that, well, his cars do kill people lends the material a tragic weight. And so, right on cue, here comes Peter Singer, the favorite philosopher of every boring genius on the internet, to weigh in on Musk's desire to flood the streets with cars that sometimes decide it's time to put a toddler down:
Singer told me that even if Autopilot and human drivers turn out to be equally deadly, we should prefer the A.I., provided that the next software update, informed by data from crash reports and near misses, will make the system safer. "It's a bit like surgeons doing experimental surgery," he said. "Probably the first few times they do the surgery they'll lose patients, but the argument is that they will save more patients in the long run." It was important, though, Singer added, that the surgeons obtain informed consent from patients.
And here, at the top of the very next paragraph, is the exact moment my hair caught fire:
Does Tesla have informed consent from its drivers?
That's a fine question, as far as it goes: Tesla oversells its cars' capabilities and fudges its safety record; most Tesla drivers—or, uh, "drivers"?—probably don't know exactly what they're buying, or what they're getting into. It is also entirely the wrong question.
The most obvious question to ask about unproven autonomous vehicles roaming public roads is not whether Tesla's drivers know enough to accept the risk to their own safety; the "patients," in Singer's metaphor – whether he or Cox realizes it or not – are not Tesla drivers, but everybody else out there using public roads and streets and sidewalks, any of whom can be killed or maimed at any moment an unproven technology is being tested on them without their knowledge or consent. An experimental surgery cannot randomly kill an innocent bystander minding their own business elsewhere in the building – but a self-driving car carrying even the most informed and enthusiastic Tesla evangelist can abruptly kill just about anyone it encounters. In fact, this has happened many times. The Times story recounts one such case, in which a Tesla driving itself ran a red light in the Los Angeles area and slammed into a human-driven Honda, killing the Honda's occupants – who had no say whatsoever in the Tesla driver's decision to engage the Autopilot system that killed them.
Which is to say that, in Singer's analogy, Tesla owners driving their own autonomous vehicles are not the patients but the surgeons. And not even ordinary surgeons, but genuine Human Centipede types, performing experimental surgery on randomly selected strangers. Who gives a fuck whether that guy has consented to the danger he poses to everybody else, given that everybody else gets no choice in the matter?
This is like framing a discussion of gun violence around the danger to an AR-15 owner who is blissfully unaware that his rifle might blow up in his hands while he's using it to spray bullets at schoolchildren. It's like wringing your hands over an anti-vaxxer's right to decide what goes into their own body while ignoring that this amounts to granting them unilateral power over what goes into everyone else's body. You keep expecting the story to get around to considering the consent of the public – the people who might reasonably expect to be asked whether they want to share their morning commute with a psychotic two-ton killbot, exempted from normal safety checks and road-tested at high speeds by untrained volunteers—but it never does.
In this way, intentionally or not, Cox makes the mistake of adopting Musk's and Tesla's libertarian, or sociopathic (granting, generously, that these are not the same thing), premise: that the open roads, and the unwitting public using them, are a legitimate testing ground for risky and unproven experimental technology, and for building—through trial and sometimes fatal error—the data that might someday allow that technology to live up to its manufacturer's boasts. Beyond that: that unverifiable claims about what some future generation of smarter self-driving cars might be able to do lend the testing itself an air of legitimacy and public service—like making Big Macs an official part of every school lunch because the CEO of McDonald's says he dreams that one day a Big Mac will prevent cancer.
Lurking behind all of this, unexamined in the Times story, is the question of who, or what, bears the ultimate responsibility for protecting the public from the dangers that unaccountable hypercapitalists and their deranged obsessions can impose on the rest of us. Maybe that fight has already been lost: Tesla's deficient cars, after all, are already out there on the roads, braking and swerving without warning, striking pedestrians, running innocent drivers off the road, catching fire for no reason, and there is seemingly neither the means nor the political will to just…get them off the road. Nor, for that matter, the widely shared sense that doing so is something any institution ought to do.
Still. Imagine the ideal world conjured by the Times story's concern with informed consent for Tesla drivers whose cars might kill other people: They would know the risks they're taking when they engage the self-driving technology in their new car. Great. And the rest of us, I suppose, would just have to accept the possibility of being punted a mile down the road by someone's rogue robot car as the price of getting from here to there. Does that seem like the way things ought to be? At least we'd be informed!
Dressing those grim stakes up in hand-wringing and chin-stroking about thorny problems or necessary moral trade-offs, however well-intentioned the examination, just gives the game away. It's simple enough to say that these things are defective machines with no business being on the road, and that in even a minimally functional society the decision of whether to put them there would not rest with a shitposting imbecile like Elon Musk. That the press will dance around every corner of that plain truth rather than state it seems… well, let's just say it isn't encouraging.