The appearance of robots affects our perception of the morality of their decisions

February 24, 2021 by Editor

Moralities of Intelligent Machines is a project that investigates people’s attitudes towards moral choices made by artificial intelligence.

In the latest study completed under the project, participants read short narratives in which either a robot, a somewhat humanoid robot known as iRobot, a strongly humanoid robot called iClooney, or a human being encounters a moral problem along the lines of the trolley dilemma and makes a specific decision.

The participants were also shown images of these agents, after which they assessed the morality of their decisions. The study was funded by the Jane and Aatos Erkko Foundation and the Academy of Finland.

The trolley dilemma is a problem in which a person sees a trolley careening along the tracks, with no one in control, towards five people. The person can either do nothing or divert the trolley onto another track, saving the five people but killing one individual standing on that track.

Attitudes more negative towards humanoid robots

According to the study, people consider the choice made by the humanoid robots iRobot and iClooney less ethically sound than the same decision made by a human or by a robot with a traditional robot-like appearance.

Michael Laakasuo, senior researcher at the University of Helsinki, project lead and principal investigator of the study, links the findings to the uncanny valley effect, which has been identified in prior research.

“Humanness in artificial intelligence is perceived as eerie or creepy, and attitudes towards such robots are more negative than towards more machine-like robots. This may be due to, for example, the difficulty of reacting to a humanoid being: is it an animal, a human or a tool?”

According to Laakasuo, the findings indicate that people do not find the idea of robots making moral decisions strange, since the decisions made by a human and by a traditional-looking robot were seen as equally acceptable. Instead, it is the robot's appearance that makes a difference to how the morality of its decisions is evaluated.

Discussion guides the regulation of AI

Laakasuo says that the number of intelligent machines making moral choices is growing in our society, with self-driving cars as an example.

“It’s important to know how people view intelligent machines and what kinds of factors affect related moral assessment. For instance, are traffic violations perpetrated by a stylish self-driving car perceived differently from those of a less classy model?”

This knowledge can influence the direction of AI and robotics development, as well as, among other things, product branding. It can also shape the political discussion on the regulation of artificial intelligence.

For example, self-driving cars could become test laboratories of sorts for private companies: in the case of accidents, the consequences could be settled with money, risking human health in the name of technological advancement while appealing to consequentialist morals.

“What kind of robots do we want to have among us: robots who save five people from being run over by a trolley, sacrificing one person, or robots who refuse to sacrifice anyone even if it would mean saving several lives? Should robots be designed to look like humans or not if their appearance affects the perceived morality of their actions?”


