February 12, 2015

Mapping emotions for collaborative robots

For reasons of safety, industrial robots tend to be segregated from humans, which can restrict the types of tasks that robots perform. This has led robot technologists to search for solutions that set robots free from their cages. If this can be achieved, the dexterity and problem-solving skills of humans can be fully combined with the precision and strength of robots.

Pictured left: Sara Baber Sial, who spent a year researching how to programme emotional responses into robots

In the Department of Design, Engineering and Mathematics at the University of Middlesex, a master’s degree project has begun to look at ways to allow robots and humans to work more closely together. “There’s an increasing move to get robots and humans to work together to achieve a joint goal on production lines,” Dr Aleksander Zivanovic of the University of Middlesex explains. “One of the important questions that needs answering is: how can robots and humans communicate using gestures and movements that convey their intentions, without the need for text messages, alarms or flashing lights?”

“Because we have an affinity with robots, as they are animal-like, we have an instinctive way of interpreting their intentions. If someone looks in a certain direction, their attention is focussed there and they are more likely to move that way. If a robot does that, it should be a clue as to its intention. If, before a robot moved to pick up an object, it could glance in the direction that it was going to move, it would add to our awareness of what it was planning,” adds Dr Zivanovic.

Programming emotional responses

The initial research was carried out by master’s degree student Sara Baber Sial, who spent a year studying how to programme emotional responses into robots. Could you make a robot look depressed, excited, happy or sad just by the way it moves, without any facial expressions?

“The problem was that with most robots you cannot control them at a very low level; you have to work through the manufacturer’s control system,” explains Dr Zivanovic. “We were looking for a system where you had control at a very low level, because Sara was looking to control each of the joints.”

robolink from tribopolymer specialist Igus provided the simple solution they needed – a multi-axis joint for humanoid robots and lightweight automation applications. It is a complete modular system, combining enormous design freedom with ease of use, and is particularly well-suited where mass is to be kept as low as possible.

At the heart of the modular system are the lightweight, maintenance- and corrosion-free joints with tribologically optimised plastic bearings, which are driven via wires and can rotate and pivot freely. To articulate the multi-axis joints, igus developed a range of flexible Bowden cables with high-performance polymer jackets that combine low friction values with a long service life. The cables have extremely small bending radii, making highly flexible movements possible, and are suitable wherever frequent relative movements take place.

“robolink was ideal because it was up to us to install a control system, and we used National Instruments’ CompactRIO for that, with LabVIEW to control it,” Dr Zivanovic continues. “The robolink arm is very simple – it’s just rods with joints. There is no suggestion of a big robot arm, which is ideal because we wanted to express emotions by movement rather than by its look.”

The team did consider using an industrial robot, but it was the ability to control movement at a local level that led them to choose robolink. “Having the ability to control the stepper motors individually was, as far as we could find, a unique feature,” says Dr Zivanovic.
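The article does not reproduce the team’s LabVIEW code, but the idea of commanding each joint’s stepper motor individually can be sketched in a few lines. The snippet below is a hypothetical Python stand-in for the real CompactRIO hardware calls; the class, names and parameters are illustrative only.

```python
# Hypothetical sketch of per-joint control, for illustration only.
# The actual project ran LabVIEW on a CompactRIO; this stand-in simply
# shows the idea of driving each joint's stepper motor individually.
import time

class JointStepper:
    """One wire-driven robolink joint, modelled as a stepper axis."""
    def __init__(self, name, steps_per_degree):
        self.name = name
        self.steps_per_degree = steps_per_degree
        self.position_deg = 0.0

    def move_to(self, target_deg, speed_deg_s):
        """Step towards target_deg at the requested joint speed."""
        step_deg = 1.0 / self.steps_per_degree
        direction = 1 if target_deg > self.position_deg else -1
        while abs(target_deg - self.position_deg) > step_deg:
            self.position_deg += direction * step_deg
            # send_step_pulse(self.name, direction)  # hardware call would go here
            time.sleep(step_deg / speed_deg_s)

# Each joint gets its own controller, so speed and acceleration can be
# shaped independently per joint, which is the feature the team needed.
joints = [JointStepper("shoulder", 40), JointStepper("elbow", 40), JointStepper("wrist", 40)]
joints[1].move_to(30.0, speed_deg_s=10.0)
```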

The project was a success, with Sara Baber Sial achieving her master’s degree. She was able to programme different expressions into the movements of the robot. The movement profiles of conventional robotic arms are often trapezoidal, because in the production world they need to move as quickly and efficiently as possible. This makes the arm movement look very robotic.
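For readers unfamiliar with the term, a trapezoidal profile simply ramps the velocity up at constant acceleration, cruises at peak velocity, then ramps back down. The short sketch below uses arbitrary values, not the project’s own parameters, to illustrate the shape.

```python
# Minimal illustration of a trapezoidal velocity profile: constant
# acceleration, a cruise phase at peak velocity, constant deceleration.
def trapezoidal_velocity(t, t_total, v_max, t_ramp):
    """Velocity at time t (seconds) for a simple trapezoidal profile."""
    if t < t_ramp:                    # accelerating
        return v_max * t / t_ramp
    if t > t_total - t_ramp:          # decelerating
        return v_max * (t_total - t) / t_ramp
    return v_max                      # cruising

# Sample the profile over a 2-second move at up to 50 deg/s.
profile = [trapezoidal_velocity(0.1 * i, t_total=2.0, v_max=50.0, t_ramp=0.4)
           for i in range(21)]
```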


Sara developed a smoother profile, which would make the movement look more natural. “By stretching and compressing that profile, Sara was able to create different ways of moving the robot’s arm,” Dr Zivanovic says. “To test this, she invited volunteers into the studio, showed them a range of the robot’s different movements and asked them to map the emotions being conveyed for analysis.” Sara found that most people recognised the emotion that she was aiming for. Slow movements with low velocity and low acceleration are seen as sad, while high speeds communicate excited or stimulated emotions, as you would expect.
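The article does not give the exact profile Sara used, so the sketch below assumes a minimum-jerk shape, a common choice for smooth, natural-looking motion, and uses invented scaling factors purely to show how stretching or compressing a profile changes the character of a movement.

```python
# Assumed minimum-jerk velocity profile, stretched or compressed in time
# to suggest different emotions. The mapping below is invented for
# illustration: slower, gentler motion reads as sad; faster reads as excited.
def minimum_jerk_velocity(t, duration, distance):
    """Bell-shaped velocity of a minimum-jerk move covering 'distance'."""
    tau = t / duration
    return (distance / duration) * (30 * tau**2 - 60 * tau**3 + 30 * tau**4)

EMOTION_TIME_SCALE = {"sad": 2.5, "neutral": 1.0, "excited": 0.4}

def emotional_profile(emotion, base_duration=2.0, distance=90.0, samples=20):
    duration = base_duration * EMOTION_TIME_SCALE[emotion]  # stretch or compress
    return [minimum_jerk_velocity(duration * i / samples, duration, distance)
            for i in range(samples + 1)]

sad_move = emotional_profile("sad")          # low peak velocity and acceleration
excited_move = emotional_profile("excited")  # high peak velocity and acceleration
```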

“These are the first steps towards industrial applications: understanding whether a worker standing next to a robot can tell what is happening just by the way the robot moves, the hypothesis being that this will make it easier to work together. If the robot is moving in an excited or stimulated manner, you might wait to see what it is going to do, almost a warning to step back.”

Sara’s successful completion of her master’s degree is not, however, the end of the work for robolink at the university. The plan is to extend research in this area and look at things such as directing the attention of the robot. The next step will be to mount a simple head unit that pivots towards the direction the arm is about to move before the movement begins, communicating the robot’s intentions and goals. It is research that may prove vital to the future of industrial automation and to helping robots break out of their cages.
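As a purely illustrative sketch of that idea, the snippet below computes the pan angle from the robot’s base towards a target so a head unit could glance that way before the arm moves. The coordinates and the commented-out hardware calls are hypothetical, not part of the project described above.

```python
import math

def pan_angle_towards(base_xy, target_xy):
    """Pan angle in degrees from the robot's base towards a target point."""
    dx = target_xy[0] - base_xy[0]
    dy = target_xy[1] - base_xy[1]
    return math.degrees(math.atan2(dy, dx))

# Glance towards the object before moving, giving nearby workers an
# advance cue of the robot's intention.
base, target = (0.0, 0.0), (0.4, 0.3)
glance = pan_angle_towards(base, target)   # roughly 37 degrees
# head.pan_to(glance)   # hypothetical head-unit command
# arm.move_to(target)   # hypothetical arm move
```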

Visit the Igus website for more information.

