Amazon Lex Now Generally Available to Enable Conversational Interfaces

Amazon Lex, the platform behind Amazon Alexa, is now generally available for creating voice-powered chatbots as well as conversational mobile, web, and desktop apps.

Amazon Lex was introduced as a preview at the last AWS re:Invent conference to let developers start experimenting with embedding conversational interfaces into their apps. Lex provides both automatic speech recognition (ASR) and natural language understanding (NLU) within the context of a bot, which is the framework through which they are accessed. A bot includes:

  • Intents, which represent the goals the user wants to achieve. This could consist of getting the answer to a question, carrying out an action on some remote service, and so on.
  • Utterances, which are phrases associated with the various intents. An utterance can be seen as a sentence template optionally containing placeholders, called slots, whose values are supplied in the concrete utterances provided by the users.
  • Slots, which provide, as mentioned, a mechanism to represent inputs for an utterance. Each slot represents a specific type of information, such as numbers, years, countries, cities, etc. Custom slot types can also be defined to handle sets of related inputs, e.g., a list of actions, colors, etc.
  • Prompts, which are questions that Lex can ask users so they can supply a required piece of information that was missing from the initial utterance. Prompts are a fundamental building block for real conversations that span multiple verbal exchanges between the user and a Lex-powered bot.
  • Fulfillment, which is the name Amazon chose for the AWS Lambda-based service responsible for fulfilling user intents. This business logic can rely on Lex to provide the intent it recognized from the user’s utterance, along with the actual values for the slots in that utterance; a minimal sketch of such a handler appears after this list.
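
As an illustration, fulfillment code running on AWS Lambda receives the recognized intent and its slot values in the invocation event, and returns a response telling Lex how to conclude the dialog. The following Python sketch shows the idea, assuming a hypothetical flower-ordering bot with a FlowerType slot; the names and response wording are illustrative, not part of the announcement:

# Minimal sketch of a Lex fulfillment function on AWS Lambda (Python).
# The bot, intent, and slot names ("OrderFlowers", "FlowerType") are hypothetical.
def lambda_handler(event, context):
    intent_name = event['currentIntent']['name']   # intent recognized by Lex
    slots = event['currentIntent']['slots']        # slot values extracted from the utterance

    flower = slots.get('FlowerType') or 'flowers'

    # Tell Lex the intent was fulfilled and what message to return to the user.
    return {
        'dialogAction': {
            'type': 'Close',
            'fulfillmentState': 'Fulfilled',
            'message': {
                'contentType': 'PlainText',
                'content': 'Your order for {} has been placed.'.format(flower)
            }
        }
    }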

Developers who have created a skill for Alexa will recognize a framework very similar to the one used for Alexa. The two major differences are that a bot is, in Alexa parlance, a skill, and that Alexa skills are not constrained to AWS Lambda and can use any remote endpoint for the back-end implementation.

At the time of the preview announcement, Amazon showed a Facebook Messenger chatbot. Support for more services has been added since, including Slack and Twilio. More importantly, the AWS SDKs now include support for creating iOS and Android apps, as well as web and desktop apps in a number of languages, that integrate Lex to interact with their users. The Lex console provides a number of facilities to define utterances and associate them with the intents that compose a bot, as well as to monitor utterances that were not recognized and did not trigger any intent.
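
As an illustration of this runtime integration, a client application could send one turn of user input to a Lex bot through the AWS SDK. The sketch below uses the Python SDK (boto3) and a hypothetical bot named OrderFlowers; the bot name, alias, and user input are assumptions for the example:

import boto3

# Lex runtime client from the AWS SDK for Python (boto3).
lex = boto3.client('lex-runtime', region_name='us-east-1')

# Send one turn of user text to a hypothetical 'OrderFlowers' bot.
response = lex.post_text(
    botName='OrderFlowers',   # hypothetical bot name
    botAlias='prod',          # hypothetical bot alias
    userId='demo-user-42',    # identifies the conversation session
    inputText='I would like to order three roses'
)

print(response['intentName'])   # intent Lex recognized
print(response['slots'])        # slot values extracted from the utterance
print(response['message'])      # next prompt or fulfillment message to show the user
print(response['dialogState'])  # e.g. 'ElicitSlot', 'ReadyForFulfillment', 'Fulfilled'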
