Computer Scientist David Gollasch, M.Sc.

Student Projects


Generating Emotion Models for Loomo

  • Posted by David
  • Categories Student Projects
  • Date 22nd February 2018
Copyright Notice: The content of this page is provided by Maximilian Feigl. All rights reserved.

Project Documentation

This project defines an emotion model that can be applied to Loomo to express emotions. Master project by Maximilian Feigl.

Abstract

At a time when care personnel are in short supply, their robotic counterparts are an interesting and thriving research topic. To improve the coexistence of robots and humans, it is important for robots to be able to display emotions. Because developing a specific model for each robot is very expensive, the objective of this thesis is to develop a universally usable model that can automatically display emotions on robots with different capabilities and limitations. The first step is therefore a review of the literature on emotions in order to map specific movements and facial expressions to specific emotions. Afterwards, the requirements such a model must meet to enable as many robots as possible to display emotions are examined. This review shows that feature models can already express most of the required notation, which leads to their use as the basis of the emotion models introduced here. To check whether these models can produce recognizable emotions, an Android library is created and used to enable the Loomo robot by Segway Robotics to display emotions. The user study performed shows that basic emotions are easily distinguishable and reach an 80% recognition rate, while other emotions lag behind because of the missing context that is needed to identify them.
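The core idea of the abstract — declaring each robot's expressive features and selecting, per emotion, only the cues that robot can perform — can be illustrated with a minimal sketch. This is not the thesis code (which is an Android library); the emotion names, cue names, and feature set for Loomo below are all illustrative assumptions.

```python
# Hypothetical feature-model-style emotion mapping (illustrative only).
# Each emotion is described as a list of cues, each tied to an expressive
# feature (e.g. face, head, base). A robot declares which features it has,
# and only the cues backed by those features are selected for playback.

EMOTION_CUES = {
    "joy":     [("face", "smile"), ("head", "tilt_up"), ("base", "spin")],
    "sadness": [("face", "frown"), ("head", "tilt_down")],
}

def expressible_cues(emotion, robot_features):
    """Return the cues for `emotion` that the robot's feature set supports."""
    return [(feature, cue)
            for feature, cue in EMOTION_CUES.get(emotion, [])
            if feature in robot_features]

# Assumed capability set for Loomo (screen face, head unit, mobile base):
loomo_features = {"face", "head", "base"}
print(expressible_cues("joy", loomo_features))

# A robot without a movable base would simply express the same emotion
# through the remaining cues:
print(expressible_cues("joy", {"face", "head"}))
```

The point of the feature-model approach is exactly this graceful degradation: the emotion definition stays universal, while each robot's feature set filters it down to what that robot can physically display.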

Download Thesis

You can download the full thesis as a PDF file (3 MB) here.

Language Notice: This document is available in German only.
Open PDF

Tag: Social Assistance Robots


Previous post

Z-Wave-Based Smart Home Sensor and Actuator Communication
22nd February 2018

Next post

Microsoft Voice Client Implementation on Loomo
10th June 2018

You may also like

Age-Specific Strategies for Multimodal CUI Implementation
26 January, 2022
Context-Related Support for Elderly People During Activities of Daily Living
19 December, 2021
Developing an Online/Offline Voice Assistant for Android
14 September, 2021



© 2008-2021. Computer Scientist David Gollasch, M.Sc. All Rights Reserved.