Amazon Lex Now Generally Available to Enable Conversational Interfaces

by Sergio De Simone on Apr 24, 2017. Estimated reading time: 2 minutes

Amazon Lex, the technology behind Amazon Alexa, is now generally available, enabling developers to create voice-powered chatbots and conversational mobile, web, and desktop apps.

Amazon Lex was introduced as a preview beta at the last AWS re:Invent conference to allow developers to start experimenting with embedding conversational interfaces into their apps. Lex offers both automatic speech recognition (ASR) and a form of natural language understanding (NLU) within the context of a bot, which acts as the framework for accessing them. A bot includes:

  • Intents, which represent the goals the user wants to achieve. This could consist of getting the answer to a question, carrying out an action on some remote service, etc.
  • Utterances, which are phrases associated with the various intents. An utterance can be seen as a sentence template optionally containing placeholders, called slots, whose values are supplied in the concrete utterances spoken or typed by users.
  • Slots, which provide, as mentioned, a mechanism to represent inputs for an utterance. Each slot represents a specific type of information, such as numbers, years, countries, cities, etc. Custom slot types can also be defined to handle sets of related inputs, e.g., a list of actions, colors, etc.
  • Prompts, which are questions that Lex can ask users to obtain a required piece of information that was missing from the initial utterance. Prompts are a fundamental building block for real conversations that span multiple exchanges between the user and a Lex-powered bot.
  • Fulfillment, which is the name Amazon chose for the AWS Lambda-based service responsible for fulfilling user intents. Lex passes this business logic the intent it recognized from the user’s utterance, along with the actual values for the slots in that utterance.
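To make the fulfillment step concrete, the sketch below shows what a minimal Lambda fulfillment handler could look like, assuming a hypothetical flower-ordering intent (the intent and slot names here are illustrative, not from the article): Lex delivers the recognized intent and slot values in the event, and the handler returns a dialog action telling Lex how to continue the conversation.

```python
# Sketch of a Lex fulfillment handler (hypothetical "OrderFlowers" intent).
def lambda_handler(event, context):
    intent = event["currentIntent"]
    slots = intent["slots"]  # slot values Lex extracted from the utterance

    # Real business logic would go here, e.g. placing the order on a
    # remote service using slots["FlowerType"].
    message = "Thanks, your order for {} has been placed.".format(
        slots["FlowerType"]
    )

    # The response tells Lex how to proceed: here, close the conversation
    # and report the intent as fulfilled.
    return {
        "dialogAction": {
            "type": "Close",
            "fulfillmentState": "Fulfilled",
            "message": {"contentType": "PlainText", "content": message},
        }
    }
```

If a required slot were still missing, the handler could instead return an `ElicitSlot` dialog action, which is how prompts drive multi-turn conversations.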

Developers who have created a skill for Alexa will recognize a very similar framework here; the two major differences are that what Lex calls a bot is, in Alexa parlance, a skill, and that Alexa skills are not constrained to AWS Lambda and can use any remote endpoint for the back-end implementation.

At the time of the preview announcement, Amazon showed a Facebook Messenger chatbot. Support for more services has been added since, including Slack and Twilio. More importantly, the AWS SDKs now support creating iOS and Android apps, as well as web and desktop apps in a number of languages, that integrate Lex to interact with their users. The Lex console provides facilities to define utterances and associate them with the intents that compose a bot, as well as to monitor utterances that were not recognized and so did not trigger an intent.
