As a British reader I am wowed by exotic tales of Robotaxis and heartwarmed to read "Play Gooseberry" and "Snogging" from an American writer. . . . Well done.
Kevin
I’m still trying to get past one of the gooseberry definitions… 💩
That was new to me; I'm more familiar with the term "dingleberry."
I can imagine feeling romantically inclined towards my attractive middle-aged spouse on the way to the airport—perhaps we are headed to a Viking river cruise—but I'm not sure I'd be tempted to get up to anything in the back seat of a driverless car that I wouldn't do if there were a driver. (I mean, I understand that the idea might be that we wouldn't have to make small talk with a chatty driver, except that the image suggests something more . . . intimate.)
Robotaxis have a built-in algorithm that places their passengers' safety last in any accident that involves other people.
Because of this, I would NEVER ride in a driverless vehicle!!
Hiya. Do you have a source for this info? I can't find an explicit statement anywhere about the choice between passenger safety and the safety of others.
https://www.scientificamerican.com/article/driverless-cars-will-face-moral-dilemmas/
https://ww2.aip.org/inside-science/the-moral-dilemmas-of-self-driving-cars
https://www.forbes.com/sites/naveenjoshi/2022/08/05/5-moral-dilemmas-that-self-driving-cars-face-today/
https://hai.stanford.edu/news/designing-ethical-self-driving-cars
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9665398/
I guess I'm not getting a clear statement from these articles that self-driving (SD) cars have been programmed to, in effect, resolve the trolley problem; mostly they seem to outline the ethical dilemma and note that SD-car programmers will have to deal with the issue.
"[I]n such cases, who is responsible for the death(s) caused? Is it the self-driving car manufacturer? Is it the software programmer? Or is it the car itself? There is no clear and right answer to this question. Thus, most people feel that accidents should happen naturally rather than letting a robot or software decide who’ll live and die."
On a separate but related issue, if an SD car is considerably less likely to get into a crash in the first place than a human-piloted one, does that factor into a societal weighing of the ethical issue? Right now, ~40,000 people a year die in crashes in the U.S.
Perhaps we are missing the proverbial elephant in the room. If prospective buyers (or passengers) knew that, in the case of an impending accident, an SD vehicle was programmed to prioritize the lives and safety of others before those of its passengers, then no one would buy, or consent to ride in, such a vehicle.
While the story as usual was beautifully written, my thought was about the third wheel. Beyond the term's origin, I have been all too aware of the times I was said wheel and, sadly, never really found the need to jettison someone else from the event in question.
I had never heard of "playing gooseberry" before. Leave it to the Brits to come up with something MUCH more colorful than "third wheel."
Amazing information as always!
I goofed and neglected to include a link to this Waymo review by Harry McCracken in Fast Company:
https://www.fastcompany.com/91150764/google-waymo-one-self-driving-cars-san-francisco
A snippet:
"The closest thing to an exciting moment during my two trips today was a minor, only-in-San-Francisco traffic disruption. On the way back from my appointment, my Waymo encountered a stalled cable car. Its gripman was crouched in the street, apparently performing some maneuver to get it going again. The Waymo knew something was amiss, displayed a message telling me to remain seated and belted, and waited to proceed until the cable car was moving again. Faced with the same situation an Uber driver couldn’t have performed any better, and would probably have grumbled more."