
Expressing Emotions with a New W3C Markup Language, EmotionML

by Abel Avram on Jul 30, 2010

The W3C has published the first public working draft of the Emotion Markup Language (EmotionML), a language meant to express emotions in today’s computer-based communication in three main ways: annotating data with emotional information, recognizing emotion-related states, and generating emotion-related system behavior.

According to the language authors, EmotionML can have applications in various fields like:

  • Opinion mining / sentiment analysis in Web 2.0, to automatically track customers’ attitudes toward a product across blogs;
  • Affective monitoring, such as ambient assisted living applications for the elderly, fear detection for surveillance purposes, or using wearable sensors to test customer satisfaction;
  • Character design and control for games and virtual worlds;
  • Social robots, such as guide robots engaging with visitors;
  • Expressive speech synthesis, generating synthetic speech with different emotions, such as happy or sad, friendly or apologetic;
  • Emotion recognition (e.g., for spotting angry customers in speech dialog systems);
  • Support for people with disabilities, such as educational programs for people with autism.

The basis of an EmotionML document is the <emotion> element, whose children are drawn from the following elements: <category>, <dimension>, <appraisal>, and <action-tendency>. There are various emotion category sets, the shortest being Paul Ekman’s, which contains six basic emotions with distinct facial expressions: anger, disgust, fear, happiness, sadness, and surprise. Other sets are more elaborate, such as Fontaine, Scherer, Roesch and Ellsworth’s, which has 24 categories. The <dimension> element can likewise take different value sets depending on the author(s) studying it, one example being pleasure, arousal, and dominance. The same goes for <appraisal> and <action-tendency>, which are detailed in the document.
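For illustration, an <emotion> carrying one of Ekman’s six categories might look roughly like the following. This is a non-normative sketch: the namespace and the category-set URI follow the style of later EmotionML revisions, and the working draft’s exact attribute names may differ.

```xml
<!-- Sketch only: annotating a state with an Ekman "big six" category.
     The vocabulary URI is illustrative, not normative. -->
<emotion xmlns="http://www.w3.org/2009/10/emotionml"
         category-set="http://www.w3.org/TR/emotion-voc/xml#big6">
  <category name="fear"/>
</emotion>
```

The category-set attribute points at the vocabulary being used, so tools can tell Ekman’s six categories apart from richer sets such as the 24-category one.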

The EmotionML draft contains several examples, including annotating an image:

[Image: EmotionML markup annotating an image with an emotion category]
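The draft’s image-annotation example can be sketched roughly as follows. The file name is hypothetical, and the <reference> syntax follows later EmotionML drafts, so details may differ from the screenshot in the draft:

```xml
<!-- Sketch only: attaching an emotion annotation to an external image.
     "photo.jpg" and the vocabulary URI are illustrative. -->
<emotion xmlns="http://www.w3.org/2009/10/emotionml"
         category-set="http://www.w3.org/TR/emotion-voc/xml#big6">
  <category name="happiness"/>
  <reference uri="photo.jpg"/>
</emotion>
```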

Another example shows how a system can express the information collected by three different affective sensor devices:

[Image: EmotionML markup combining readings from three affective sensor devices]
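A rough sketch of such sensor output might use one <emotion> per device, with a confidence attribute indicating how reliable each reading is. The dimension set, modality names, and values below are illustrative, not taken from the draft:

```xml
<!-- Sketch only: two of three hypothetical sensor readings expressed as
     emotion dimensions; confidence marks the reliability of each reading. -->
<emotion xmlns="http://www.w3.org/2009/10/emotionml"
         dimension-set="http://www.w3.org/TR/emotion-voc/xml#pad-dimensions"
         expressed-through="face">
  <dimension name="arousal" value="0.7" confidence="0.9"/>
</emotion>
<emotion xmlns="http://www.w3.org/2009/10/emotionml"
         dimension-set="http://www.w3.org/TR/emotion-voc/xml#pad-dimensions"
         expressed-through="physiology">
  <dimension name="arousal" value="0.8" confidence="0.6"/>
</emotion>
```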

Yet another example demonstrates the third intended usage of EmotionML, generating system behavior: a robot whose batteries are running out looks for a power outlet while avoiding picking up boxes, which would drain the batteries even further:

[Image: EmotionML markup describing the robot’s action tendencies]
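Such behavior could be sketched with <action-tendency> elements, roughly as below. EmotionML lets authors define their own action-tendency vocabularies, so the set URI and tendency names here are hypothetical:

```xml
<!-- Sketch only: a low-battery robot's action tendencies.
     The action-tendency-set URI and names are made up for illustration. -->
<emotion xmlns="http://www.w3.org/2009/10/emotionml"
         action-tendency-set="http://www.example.com/robot-tendencies.xml">
  <action-tendency name="approach" value="0.9"/>  <!-- seek the power outlet -->
  <action-tendency name="avoidance" value="0.8"/> <!-- avoid picking up boxes -->
</emotion>
```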

EmotionML can be used in conjunction with other markup languages, such as EMMA, the Extensible MultiModal Annotation markup language, and SSML, the Speech Synthesis Markup Language.
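As a sketch of such a combination, an emotion recognizer’s result could be carried inside an EMMA interpretation, along these lines (the structure is illustrative, assuming EMMA 1.0’s namespace; the ids and values are hypothetical):

```xml
<!-- Sketch only: an EmotionML annotation embedded in an EMMA result,
     e.g. output of a speech-based emotion recognizer. -->
<emma:emma version="1.0" xmlns:emma="http://www.w3.org/2003/04/emma">
  <emma:interpretation id="interp1" emma:confidence="0.8">
    <emotion xmlns="http://www.w3.org/2009/10/emotionml"
             category-set="http://www.w3.org/TR/emotion-voc/xml#big6">
      <category name="anger"/>
    </emotion>
  </emma:interpretation>
</emma:emma>
```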


Any examples for use in RDF or RDFa? by Jay Myers

This could be useful in Semantic applications :-)

InfoQ.com and all content copyright © 2006-2013 C4Media Inc.