People are faster at understanding human actions than robotic movements – study

A new study suggesting that people are faster at understanding human actions than robotic movements highlights potential challenges for a future in which robots are more widely incorporated into daily life, researchers have said.

Over a series of six experiments, scientists investigated whether people can understand robots’ intentions as they do those of humans.

The team compared people’s ability to ascribe intentions to humans, humanoid robots, and non-human-like animated objects by investigating how easily people understood the simple social cue of a gaze.

Psychologists led by the University of Hull explored whether people apply the same social rules to a robot’s actions as they do to humans’, asking participants to predict what a human or a robot would do by observing their gaze.

The study looked at interpretations of gaze (Alamy)

The results showed that people were faster to infer the mental content of human agents compared to robotic agents, suggesting that people process human actions differently to robotic actions.


Dr Emmanuele Tidoni, a lecturer in psychology at the University of Hull and the lead author of the report, explained: “The results of our research suggest that people are better at identifying what a human – rather than a robot – would do.

“Specifically, we found that people interpret the meaning of human actions faster than non-human actions.

“We tested how easily people understand the simple social cue of a gaze – humans predict other people’s behaviour by looking at their eyes.

“For example, if you are in a restaurant, you can easily guess if the waiter is ready to take your order by checking where they are looking.

Study leaders pointed out that society is already becoming more technological (Alamy)

“Another example is when we see a person looking at a bottle of water – we may easily guess the person is thirsty. However, such guesses may not be so automatic when we see robots – we wouldn’t think a robot is thirsty if it looks at a water bottle.”

The team says its research has implications for the future and the potential challenges we may face as robots are incorporated more into our homes and workplaces.

Dr Tidoni added: “Society is getting more and more technological. Less than 20 years ago, we saw the introduction of portable devices like smartphones that have radically changed how we communicate and interact with others, and how we perceive technology.

“Investigating how people interpret actions performed by a machine is crucial to improving future interactions between humans and robots.

“Our results suggest that people may benefit from additional information to understand what actions robots are doing. This opens new ideas for using psychological theories to find solutions to improve the development of a fast-growing market such as human–robot interaction.”

The study, “Human but not robotic gaze facilitates prediction”, was published in iScience.