Defining an emotion model that enables Loomo to express emotions. Master project by Maximilian Feigl
In times of a shortage of care personnel, robotic counterparts are an interesting and thriving research topic. To improve the coexistence of robots and humans, it is important for robots to be able to display emotions. Because developing a specific model for each robot is very expensive, the objective of this thesis is to develop a universally usable model that can automatically display emotions on robots with different capabilities and limitations. The first step is therefore to survey the literature on emotions in order to map specific movements and facial expressions to specific emotions. Afterwards, the requirements such a model must fulfil to enable as many robots as possible to display emotions are reviewed. This review shows that feature models already provide most of the needed notation, which leads to them being used as the basis of the emotion models introduced here. To check whether these models can produce recognizable emotions, an Android library is created and used to enable Loomo, a robot by Segway Robotics, to display emotions. The performed user study shows that basic emotions are easily distinguishable, reaching an 80% recognition rate, while other emotions lagged behind because of the missing context needed to identify them.
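The core idea of pairing a feature-model-style description of emotions with per-robot capabilities can be sketched in plain Java. This is a minimal illustration only: the names `Capability`, `Emotion`, and `canExpress` are assumptions for this sketch, not the thesis's actual library API.

```java
import java.util.EnumSet;
import java.util.Set;

// Hypothetical sketch of a capability-gated emotion model.
// All identifiers here are illustrative, not the thesis's real API.
public class EmotionSketch {

    // Hardware features a robot may or may not offer.
    enum Capability { HEAD_TILT, SCREEN_FACE, WHEEL_MOTION }

    // Each emotion declares which capabilities its expression requires,
    // mirroring how a feature model constrains valid configurations.
    enum Emotion {
        JOY(EnumSet.of(Capability.SCREEN_FACE)),
        SADNESS(EnumSet.of(Capability.HEAD_TILT, Capability.SCREEN_FACE));

        final Set<Capability> required;

        Emotion(Set<Capability> required) {
            this.required = required;
        }
    }

    // A robot can express an emotion only if it offers every required feature.
    static boolean canExpress(Set<Capability> robot, Emotion emotion) {
        return robot.containsAll(emotion.required);
    }

    public static void main(String[] args) {
        // A Loomo-like robot with a screen face, a tiltable head, and wheels.
        Set<Capability> loomoLike = EnumSet.of(
                Capability.HEAD_TILT,
                Capability.SCREEN_FACE,
                Capability.WHEEL_MOTION);

        System.out.println(canExpress(loomoLike, Emotion.JOY));     // true
        System.out.println(canExpress(loomoLike, Emotion.SADNESS)); // true
    }
}
```

The same emotion definitions could then drive robots with fewer capabilities: a robot without a tiltable head would simply be rejected for `SADNESS` and fall back to whatever emotions its feature set supports.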
You can download the full thesis as a PDF file (3 MB) here.