Express Computer

This AI-based system can teach human intention to robots


In a bid to give machines the ability to predict intent when interacting with humans, a team at the University of New South Wales (UNSW) Sydney is developing an artificial intelligence-driven prototype human-machine interface system intended to help machines be seen not merely as tools, but as partners.

Dr Lina Yao, a senior lecturer of engineering at UNSW and principal investigator, is busy getting AI systems and human-machine interfaces up to speed with the finer nuances of human behaviour.


The ultimate goal is for her research to be used in autonomous AI systems, robots and even cyborgs, but the first step is focused on the interface between humans and intelligent machines.

“What we’re doing in these early phases is to help machines learn to act like humans based on our daily interactions and the actions that are influenced by our own judgment and expectations – so that they can be better placed to predict our intentions,” Yao said in a university statement.

At the moment, AI can do a plausible job of detecting another person's intent after the fact.

It may even have a list of predefined responses a human might make in a given situation. But when an AI system or machine has only a few clues or partial observations to go on, its responses can sometimes be a little robotic.

Dr Yao is working on integrating less obvious cues of human behaviour into AI systems to improve intent prediction.

These include gestures, eye movement, posture, facial expression and even micro-expressions, the tell-tale physical signs that appear when someone reacts emotionally to a stimulus but tries to keep it hidden.
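Purely as an illustration of the idea, the sketch below fuses several such behavioural cues into a probabilistic intent predictor. The feature names, labels and data are invented for the example and are not from the UNSW work; a real system would extract these cues from video, eye trackers and similar sensors.

```python
# Hypothetical sketch: predict intent from a handful of behavioural cues.
# All features and labels here are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

# Each row: [gaze_on_object, lean_forward, smile_intensity, micro_expr_flag]
X = rng.random((200, 4))
# Toy label: 1 = "intends to grasp the object" (an invented rule for demo data)
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# With only partial observations, the model still returns a probability
# for each intent rather than a hard, "robotic" yes/no answer.
partial = np.array([[0.9, 0.2, 0.5, 0.0]])
print(model.predict_proba(partial))
```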


“We can learn and predict what a human would like to do when they’re wearing an EEG [electroencephalogram] device,” said Yao.

While the person is wearing one of these devices, whenever they make a movement their brainwaves are collected, and researchers can then analyse them.

“Later we can ask people to think about moving with a particular action – such as raising their right arm. So not actually raising the arm, but thinking about it, and we can then collect the associated brain waves,” said Yao.

Recording this data has the potential to help people unable to move or communicate freely due to disability or illness.

Brain waves recorded with an EEG device could be analysed and used to move machinery such as a wheelchair, or even to communicate a request for assistance.
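As a rough illustration of the kind of pipeline described here, the sketch below classifies imagined movements from EEG band-power features. The sampling rate, channel count and the synthetic data are assumptions for the example, not details from Yao's study; a real system would use recorded EEG epochs and map the predicted class to a command such as steering a wheelchair.

```python
# Minimal, hypothetical motor-imagery classifier: band-power features + LDA.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
FS = 250          # sampling rate in Hz (assumed)
N_CHANNELS = 8    # number of EEG electrodes (assumed)
N_EPOCHS = 120    # labelled trials: 0 = "imagine raising right arm", 1 = "rest"

# Synthetic stand-in for epoched EEG: (epochs, channels, samples)
X_raw = rng.standard_normal((N_EPOCHS, N_CHANNELS, 2 * FS))
y = rng.integers(0, 2, N_EPOCHS)

def band_power(epochs, fs, band=(8.0, 30.0)):
    """Log of mean power in the mu/beta band (8-30 Hz) per channel,
    the range where motor imagery typically modulates EEG."""
    freqs, psd = welch(epochs, fs=fs, nperseg=fs, axis=-1)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.log(psd[..., mask].mean(axis=-1))  # shape: (epochs, channels)

features = band_power(X_raw, FS)
clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, features, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

On this random placeholder data the accuracy will hover around chance; the point is the shape of the pipeline: epoch the signal, extract band-power features, train a classifier, then map its output to an action.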

According to Yao, after observing our behaviour, autonomous AI systems and machines may one day assign us to one of three categories: peer, bystander or competitor.

“While this may seem cold and aloof, these categories may dynamically change from one to another according to their evolving contexts.”

At any rate, she said, this sort of cognitive categorisation is actually very human.

