Coalition of the Confused

Hosted by Jenifer (Zarknorph)

Confused malcontents swilling Chardonnay while awaiting the Zombie Apocalypse.

  • 1127 members
  • 60932 messages
  • 21 posts today

Discussions

The catastrophe that EVERYONE saw coming (America - all of it)

Started 3/19/18 by Jenifer (Zarknorph); 2485 views.
In reply to Re: msg 2

From: Jenifer (Zarknorph)

3/20/18

3. Is there a moral code we can all agree on?

People often talk about self-driving cars taking a utilitarian approach on the road: the car acts in a way that maximises benefit for the greatest number of people.

But there's also the negative approach: you may want the car to drive defensively and minimise risk for the occupant only, even if that increases risk for everyone else.

"Let's call it an egoist," Dr Evans said.

Or think about tailgating.

Under Australian law, if you're cut off, slam on your brakes, and someone following too closely rear-ends you, we typically don't think you're at fault.

If a car was programmed along utilitarian lines, it might brake slowly so as not to cause the tailgater to crash, but this shifts more of the risk to the front, onto its own occupant.

But if you took a more deontological approach — one that was focused primarily on what your duty was — things would be different.

"A deontological car might say, it might be really nice for you to protect the tailgater, but you don't have a duty to protect the tailgater," Dr Evans said.

"They distinguish between what your obligations are, and what they call supererogatory — above the call of duty — actions."

4. Or should we choose our own car's moral code?

Car manufacturers need to decide whether they want to reveal how their cars are ethically programmed, and whether their customers should even have a choice.

For example, you could push a button to tell the car it's important to get swiftly from point A to B, so it should drive more aggressively.

Or what if your autonomous car started running red lights every time it appeared safe to do so?

"I think we would rightly say no, that's not OK to have that setting on your vehicle," Dr Danks said.

Alternatively, regulators might demand that all cars have a similar decision-making process.

This might be safer, but not necessarily better.

"It might not be the way we want society to run, because we're never on time for anything or we never get where we're going, or we're always stuck in traffic," Dr Evans said.

The government might have a role in cutting off extremes, however.

"We don't want Machiavelli's car out on the road, always tricking people. Or Ayn Rand's car out on the road running people off the road anytime she can get away with it," he said.

Continued

In reply to Re: msg 3

From: Jenifer (Zarknorph)

3/20/18

5. Can we ever learn to trust our self-driving cars?

According to Dr Danks, learning to trust our self-driving cars may be a confusing journey.

One kind of trust is built around predictability and reliability, he suggested.

"When I go out and turn the key in my car in the morning, I trust that it will start," he said.

"It's through seeing how the system behaves in various contexts, so you know the next time you're in that context, it's probably going to do something pretty similar."

In contrast, there is trust based on knowing how or why something works.

This might be very personal: "I understand how and why my wife makes the decisions she does," Dr Danks explained.

"I understand her beliefs, I understand how she thinks about the world, even though by any statistical measure, she is less predictable than my car."

This is our theory of mind: we can interact with other people and figure out their ways of thinking and goals.

The way self-driving cars are designed, we can't currently interpret them as if they were human, but neither are they predictable.

"Precisely because they are these autonomous systems, they actually are capable of much more sophisticated and nuanced reactions to the environment than most machines," Dr Danks said.

This might make them more difficult to trust. You can't say, "if I've seen it once, I've seen it one thousand times" when driverless cars make intricate calculations in novel circumstances.

Ultimately, however, this might come full circle.

"At some point, the sophistication of these systems will reach a point that in fact they are behaving in ways that we can sort of interpret as human-like," Dr Danks said.

When self-driving cars feel like they have a true cognitive state, will we learn to trust again?

http://www.abc.net.au/news/science/2018-03-21/self-driving-autonomous-cars-five-ethical-questions/9567986

From: Dan (DANCULBERSON)

3/21/18

Jenifer (Zarknorph) said...

Uber self-driving car kills pedestrian in Arizona

And the thread is titled "The catastrophe that EVERYONE saw coming"?

Apparently not the woman pedestrian who was killed. [RIM SHOT]

From: Jenifer (Zarknorph)

3/21/18

Oh Heavens to Betsy!

I don't know what is worse!

The fact you made that terribly offensive joke, or the fact that I'm trying to type while laughing!

I think we'll just have to agree that both our karma is in the toilet.

In reply to Re: msg 6

From: Jenifer (Zarknorph)

7/5/18

Contribute to the Ethical Debate - vote on who lives or dies!


Driverless cars are coming... but would you want to be in one that would drive you into a wall to save a texting jaywalker?

In reply to Re: msg 7

From: Jenifer (Zarknorph)

10/26/18

Who do we want self-driving cars to spare on the road?

Self-driving cars may soon have to make troubling, very human choices. Their software could need a programmed answer to the question: if an accident is inevitable, who or what is more valuable?

A human or a pet? Passengers versus pedestrians? The young or the elderly?

Researchers from the Massachusetts Institute of Technology built an online game called the Moral Machine to test how people around the world would answer those questions.

Players were shown unavoidable accidents with two possible outcomes, depending on whether the car swerved or stayed on course, and were then asked to choose the outcome they preferred. The research gathered nearly 40 million responses from people in 233 countries and territories.

The results, published today in the journal Nature, are just one step towards finding a social consensus around how we expect driverless cars to act, given it will be humans who write the code.

While there were intriguing trends from country to country, globally Moral Machine players showed a preference for sparing babies, little girls, little boys and pregnant women.

Of course, humans don't always make clear, thought-out decisions when faced with a road accident, and it's not clear if driverless cars will do any better.
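
As a toy illustration of the kind of aggregation such a game allows, here is a short Python sketch that tallies how often each character category is spared. The records and categories below are invented, and the real analysis in Nature is far more sophisticated than a raw tally:

    from collections import Counter

    # Each invented record: (category the player spared, category sacrificed).
    responses = [
        ("child", "elderly"), ("pregnant woman", "pet"),
        ("child", "pet"), ("elderly", "pet"), ("child", "elderly"),
    ]

    spared = Counter()
    appeared = Counter()
    for saved, sacrificed in responses:
        spared[saved] += 1
        appeared[saved] += 1
        appeared[sacrificed] += 1

    # Share of appearances in which each category was chosen to be spared.
    for category in appeared:
        print(category, ":", round(spared[category] / appeared[category], 2))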

See the Chart HERE! It's kinda sexist, really.

From: Di (amina046)

10/27/18

I am not going out again without strapping a pillow to my waist under the dress!

I am going to look like I'm 18 months pregnant with triplets!

From: Jenifer (Zarknorph)

10/27/18

Or you could dress like a doctor... just make sure you look like a man.

In reply to Re: msg 10

From: Jenifer (Zarknorph)

Apr-19

Two men die when a Tesla vehicle, believed to be operating without anyone in the driver's seat, crashes into a tree and bursts into flames.

From: adwil

Apr-21

Jenifer (Zarknorph) said:

Two men die when a Tesla vehicle, believed to be operating without anyone in the driver's seat, crashes into a tree and bursts into flames.

Strange. Further investigation by police and Tesla is needed. This particular vehicle did not have the self-drive package, and even if it had, it should not have operated without someone at the steering wheel. Perhaps the car had been unofficially modded. Interesting.
