Expressing Emotions with a New W3C Markup Language, EmotionML

W3C has published the first public working draft of the Emotion Markup Language (EmotionML), a language meant to express emotions in today’s computer-based communication in three main ways: annotating data, recognizing emotion-related states, and generating emotion-related system behavior.

According to the language’s authors, EmotionML can have applications in a variety of fields, such as:

  • Opinion mining / sentiment analysis in Web 2.0, for example to automatically track customers’ attitudes toward a product across blogs;
  • Affective monitoring, such as ambient assisted living applications for the elderly, fear detection for surveillance purposes, or using wearable sensors to test customer satisfaction;
  • Character design and control for games and virtual worlds;
  • Social robots, such as guide robots engaging with visitors;
  • Expressive speech synthesis, generating synthetic speech with different emotions, such as happy or sad, friendly or apologetic;
  • Emotion recognition (e.g., for spotting angry customers in speech dialog systems);
  • Support for people with disabilities, such as educational programs for people with autism.

An EmotionML document is built around the <emotion> element, which holds one or more of the following child elements: <category>, <dimension>, <appraisal>, and <action-tendency>. Several emotion category sets exist; the shortest is Paul Ekman’s, containing six basic emotions with distinct facial expressions: anger, disgust, fear, happiness, sadness, and surprise. Other sets are more elaborate, such as Fontaine, Scherer, Roesch and Ellsworth’s, which has 24 categories. The <dimension> element can likewise take different value sets depending on the researchers who defined them, one example being pleasure, arousal, and dominance. The same goes for appraisal and action-tendency, both detailed in the draft.
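
For illustration, here is a minimal sketch of what such an annotation could look like. The markup follows the syntax and vocabulary URIs of the later EmotionML 1.0 specification rather than the exact first-draft syntax, so details may differ; the confidence and dimension values are invented:

    <emotionml version="1.0" xmlns="http://www.w3.org/2009/10/emotionml"
               category-set="http://www.w3.org/TR/emotion-voc/xml#big6"
               dimension-set="http://www.w3.org/TR/emotion-voc/xml#pad-dimensions">
      <!-- One emotional state, described both by an Ekman "big six" category
           and by pleasure/arousal dimensions -->
      <emotion>
        <category name="happiness" confidence="0.9"/>
        <dimension name="pleasure" value="0.8"/>
        <dimension name="arousal" value="0.7"/>
      </emotion>
    </emotionml>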

The EmotionML draft contains several examples, including annotating an image:

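A reconstruction of what such an annotation might look like, using the <reference> element to link the emotion description to the picture (again in EmotionML 1.0 syntax; the file name is hypothetical):

    <emotion xmlns="http://www.w3.org/2009/10/emotionml"
             category-set="http://www.w3.org/TR/emotion-voc/xml#big6">
      <!-- Emotion expressed by the person shown in the photo -->
      <category name="sadness"/>
      <!-- Point the annotation at the media it describes -->
      <reference uri="photo.jpg" role="expressedBy"/>
    </emotion>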

Another example shows how a system can express the information collected by three different affective sensor devices:

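A possible shape for that markup is sketched below, with one <emotion> element per sensor and a confidence attached to each estimate (the modalities, category, and numbers are invented for illustration; EmotionML 1.0 syntax):

    <emotionml version="1.0" xmlns="http://www.w3.org/2009/10/emotionml"
               category-set="http://www.w3.org/TR/emotion-voc/xml#big6">
      <!-- Estimate from a camera-based facial expression analyser -->
      <emotion expressed-through="face">
        <category name="anger" confidence="0.6"/>
      </emotion>
      <!-- Estimate from a voice analysis component -->
      <emotion expressed-through="voice">
        <category name="anger" confidence="0.4"/>
      </emotion>
      <!-- Estimate from a physiological sensor such as skin conductance -->
      <emotion expressed-through="physiology">
        <category name="anger" confidence="0.5"/>
      </emotion>
    </emotionml>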

Yet another example demonstrates the third intended usage of EmotionML, generating system behavior: a robot whose batteries are running low looks for a power outlet and avoids picking up boxes, which would drain the batteries even further:

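Such behavior maps onto the <action-tendency> element; a sketch of the robot’s state could look as follows (the vocabulary URI and tendency names are hypothetical, defined by the application rather than by the specification):

    <emotion xmlns="http://www.w3.org/2009/10/emotionml"
             action-tendency-set="http://www.example.com/robot-tendencies.xml#low-battery">
      <!-- Strong tendency to approach a power outlet and recharge -->
      <action-tendency name="approach-power-outlet" value="0.9"/>
      <!-- Tendency to avoid picking up boxes, which would drain the batteries -->
      <action-tendency name="avoid-lifting-boxes" value="0.8"/>
    </emotion>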

EmotionML can be used in conjunction with other markup languages such as EMMA, the Extensible MultiModal Annotation markup language, and SSML, the Speech Synthesis Markup Language.
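
As a rough sketch of how such a combination might look, an emotion annotation could travel as application-specific payload inside an EMMA interpretation; the structure below is purely illustrative and not taken from either specification:

    <emma:emma version="1.0" xmlns:emma="http://www.w3.org/2003/04/emma">
      <emma:interpretation id="turn1" emma:medium="acoustic" emma:mode="voice"
                           emma:confidence="0.7">
        <!-- EmotionML payload describing the caller's detected state -->
        <emotion xmlns="http://www.w3.org/2009/10/emotionml"
                 category-set="http://www.w3.org/TR/emotion-voc/xml#big6">
          <category name="anger" confidence="0.7"/>
        </emotion>
      </emma:interpretation>
    </emma:emma>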
