
Science and Technology News

Dedicated to the wonder of discovery


What might sheep and driverless cars have in common? Following the herd

March 1, 2021 by Editor

Psychologists have long found that people behave differently on their own than they do after learning of their peers’ actions.

A new study by computer scientists found that when participants in an experiment about autonomous vehicles learned that their peers were more likely to sacrifice their own safety, programming their vehicle to hit a wall rather than hit pedestrians at risk, the percentage of participants willing to make the same sacrifice increased by approximately two-thirds.

As computer scientists train machines to act as people’s agents in all sorts of situations, the study’s authors argue that this social component of decision-making is often overlooked.

This could be of great consequence, the paper’s authors note. They argue that the trolley problem, long the go-to scenario for moral psychologists, is itself problematic: it fails to capture the complexity of how humans actually make decisions.

Jonathan Gratch, one of the paper’s authors, the project’s principal investigator, and a computer scientist at the USC Institute for Creative Technologies, says existing models assume that in high-stakes life-and-death decisions people think differently than they actually do. There are no moral absolutes in human decision-making, he says; rather, “it is more nuanced.”

The researchers conducted four separate simulation experiments to understand how people might process and act on the moral dilemmas they would face as an operator of a driverless car.

The first three experiments focused on how people behave when faced with risk to themselves and to others in a negative scenario: the vehicle must be programmed either to hit a wall or to hit five pedestrians.

The authors found that participants used the severity of injury to themselves and the risk to others as guideposts for decision-making: the higher the risk to pedestrians, the more likely people were to sacrifice their own health.

In addition, the level of risk to pedestrians did not have to be as high as the risk to the operator before the operator of the autonomous vehicle would sacrifice their own well-being.

In the fourth experiment, the researchers added a social dimension by telling participants what their peers had opted to do in the same situation. In one simulation, knowing that peers had chosen to risk their own health changed participants’ responses: the share willing to risk their health rose from 30 percent to 50 percent.
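That jump is consistent with the “approximately two-thirds” increase reported earlier. A quick arithmetic check, using only the article’s own figures (30 percent baseline, 50 percent after peer information):

```python
# Relative-increase check on the figures quoted in the article.
baseline = 0.30   # share willing to self-sacrifice before peer information
informed = 0.50   # share after learning what peers chose

relative_increase = (informed - baseline) / baseline
print(f"{relative_increase:.2f}")  # 0.67, i.e. roughly a two-thirds increase
```

The 20-percentage-point rise is large in relative terms precisely because the baseline willingness was low.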

But this can cut both ways, cautions Gratch. “Technically there are two forces at work. When people realize their peers don’t care, this pulls people down to selfishness. When they realize they care, this pulls them up.”

The research has implications for autonomous vehicles, including drones and boats, as well as for robots that are programmed by humans.

The authors suggest that it is important for manufacturers to have an awareness of how humans actually make decisions in life or death situations.

In addition, the authors suggest that transparency about how machines are programmed, and giving human drivers the ability to change their vehicle’s settings before such life-and-death situations arise, are important for the public.

They also suggest it is important for legislators to be aware of how vehicles might be programmed.

Lastly, given the human tendency to conform to social norms, the authors believe that public campaigns publicizing how peers have programmed their autonomous vehicles for self-sacrifice could influence future owners to change their own settings toward protecting others from injury, even at some cost to themselves.

