Expressing Emotions with a New W3C Markup Language, EmotionML

by Abel Avram on Jul 30, 2010 |

The W3C has published the first public Working Draft of the Emotion Markup Language (EmotionML), a language meant to serve three main purposes in today's computer-based communication: annotating data with emotion-related information, recognizing emotion-related states, and generating emotion-related system behavior.

According to the language authors, EmotionML can have applications in various fields like:

  • Opinion mining / sentiment analysis in Web 2.0, to automatically track customers' attitudes regarding a product across blogs;
  • Affective monitoring, such as ambient assisted living applications for the elderly, fear detection for surveillance purposes, or using wearable sensors to test customer satisfaction;
  • Character design and control for games and virtual worlds;
  • Social robots, such as guide robots engaging with visitors;
  • Expressive speech synthesis, generating synthetic speech with different emotions, such as happy or sad, friendly or apologetic;
  • Emotion recognition (e.g., for spotting angry customers in speech dialog systems);
  • Support for people with disabilities, such as educational programs for people with autism.

The basis of an EmotionML document is the <emotion> element, which takes as children one or more of the following elements: <category>, <dimension>, <appraisal>, and <action-tendency>. There are various emotion category sets, the shortest being Paul Ekman's six basic emotions with distinctive facial expressions: anger, disgust, fear, happiness, sadness, and surprise. Other sets are more elaborate, such as Fontaine, Scherer, Roesch and Ellsworth's, which contains 24 categories. The <dimension> element can likewise take different value sets depending on the authors studying it, one example being pleasure, arousal, and dominance. The same goes for <appraisal> and <action-tendency>, both detailed in the draft.

The EmotionML draft contains several examples, including annotating an image:
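An annotation of this kind might look roughly like the following sketch. It uses the namespace and category-set URI from the later EmotionML vocabulary work; the exact attribute names may differ in the first draft, and the image file name is a hypothetical placeholder:

```xml
<!-- Annotating a photograph with the emotion it expresses -->
<emotion xmlns="http://www.w3.org/2009/10/emotionml"
         category-set="http://www.w3.org/TR/emotion-voc/xml#big6">
  <category name="sadness"/>
  <!-- hypothetical image file being annotated -->
  <reference uri="face_picture_01.jpg"/>
</emotion>
```

The <reference> element points at the media item being annotated, while the category set URI declares which vocabulary the name "sadness" is drawn from.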


Another example shows how a system can express the information collected by three different affective sensor devices:
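A sketch of what such a fused reading could look like, assuming a pleasure/arousal/dominance dimension set; the sensor mapping and the numeric values are invented for illustration and are not taken from the draft:

```xml
<emotion xmlns="http://www.w3.org/2009/10/emotionml"
         dimension-set="http://www.w3.org/TR/emotion-voc/xml#pad-dimensions">
  <!-- arousal estimated from a (hypothetical) skin-conductance sensor -->
  <dimension name="arousal" value="0.8" confidence="0.9"/>
  <!-- pleasure estimated from facial-expression analysis -->
  <dimension name="pleasure" value="0.2" confidence="0.3"/>
  <!-- dominance estimated from voice analysis -->
  <dimension name="dominance" value="0.5" confidence="0.6"/>
</emotion>
```

Each dimension carries its own confidence, so readings from less reliable sensors can be weighted accordingly by a consuming system.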


Yet another example demonstrates the third intended usage of EmotionML, generating system behavior: a robot whose batteries are running out looks for a power outlet while avoiding picking up boxes that would drain the batteries even further:
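Such a state might be sketched with action tendencies along these lines, assuming a Frijda-style action-tendency vocabulary; the set URI and values are illustrative assumptions rather than the draft's own example:

```xml
<emotion xmlns="http://www.w3.org/2009/10/emotionml"
         action-tendency-set="http://www.w3.org/TR/emotion-voc/xml#frijda-action-tendencies">
  <!-- strong tendency to approach the power outlet -->
  <action-tendency name="approach" value="0.9"/>
  <!-- strong tendency to avoid picking up energy-draining boxes -->
  <action-tendency name="avoidance" value="0.8"/>
</emotion>
```

A robot controller could map such tendencies directly onto behaviors: high "approach" drives navigation toward the outlet, high "avoidance" suppresses the box-handling task.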


EmotionML can be used in conjunction with other markup languages such as EMMA, the Extensible MultiModal Annotation markup language, and SSML, the Speech Synthesis Markup Language.
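For instance, an <emotion> element can in principle be embedded in an SSML document to request an emotional speaking style. The following is only a sketch of how such an embedding might look, assuming a synthesizer that accepts the foreign EmotionML namespace; it is not an example from either specification:

```xml
<speak version="1.0" xml:lang="en-US"
       xmlns="http://www.w3.org/2001/10/synthesis">
  <!-- EmotionML element in its own namespace, annotating the utterance -->
  <emotion xmlns="http://www.w3.org/2009/10/emotionml"
           category-set="http://www.w3.org/TR/emotion-voc/xml#big6">
    <category name="happiness"/>
  </emotion>
  Nice to see you again!
</speak>
```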


Any examples for use in RDF or RDFa? by Jay Myers

This could be useful in Semantic applications :-)
