When a third-party app — or skill — for Amazon’s Alexa voice assistant politely asks that you name an actor, airline, athlete, author, or some other entity in response to a question, it’s specifying a slot type — a definition that determines how that piece of data is recognized and handled. (A skill that uses the aforementioned actor slot type might query filmographies with the names of supplied actors and actresses, for example.) Slot types are meant to “save time” and provide a “more consistent” experience across skills, according to Amazon, and today, four are joining the existing list in public beta: AMAZON.CreativeWorkType, AMAZON.Food, AMAZON.Sport, and AMAZON.VideoGame.
When building an Alexa skill with Amazon’s Custom Skill API, developers define a voice interaction model that describes how users will interact with their skill. The model includes intents, which Alexa uses to map users’ spoken words to actions they’d like the skill to take, and slots, which act as variables within an intent. Amazon’s predefined slot types spare developers from having to enumerate values for common use cases themselves.
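To make that concrete, here is a minimal sketch of the JSON interaction model a developer submits with the Custom Skill API. The invocation name, intent name, slot name, and sample utterances are hypothetical; only the AMAZON.Food slot type comes from Amazon’s built-in list.

```json
{
  "interactionModel": {
    "languageModel": {
      "invocationName": "recipe helper",
      "intents": [
        {
          "name": "LookUpRecipeIntent",
          "slots": [
            { "name": "dish", "type": "AMAZON.Food" }
          ],
          "samples": [
            "find a recipe for {dish}",
            "how do I make {dish}"
          ]
        }
      ]
    }
  }
}
```

Because the slot references a built-in type, the developer doesn’t need to supply a list of food values — Amazon’s curated catalog fills that role.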
The new slot types work in all Alexa locales and can be used in skills published to the Alexa Skills Store. They encompass a representative list of values that Amazon says it continuously updates, and which it says it uses to train Alexa’s voice recognition machine learning models.
Here’s a description of each:
- AMAZON.CreativeWorkType captures types of creative works, such as “soundtrack,” “book,” and “album.”
- AMAZON.Food captures food items, such as “bacon,” “scrambled egg,” and “lemon juice.”
- AMAZON.Sport captures the names of sports, such as “gymnastics,” “basketball,” and “team handball.”
- AMAZON.VideoGame captures video game titles, such as “Doom Two,” “Lemmings,” and “The Sims.”
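When Alexa matches an utterance to an intent, the skill’s backend receives an IntentRequest whose JSON carries each slot’s captured value. The sketch below, in plain Python without the ASK SDK, shows how a handler might pull a slot value out of that request; the intent name (`LookUpRecipeIntent`) and slot name (`dish`) are hypothetical, and the sample event is trimmed to the fields relevant here.

```python
# Minimal sketch: reading a slot value from a raw Alexa IntentRequest payload.
# Intent and slot names are hypothetical; the event is trimmed for illustration.

def get_slot_value(event, slot_name):
    """Return the captured value of a slot from an Alexa request event, or None."""
    intent = event.get("request", {}).get("intent", {})
    slot = intent.get("slots", {}).get(slot_name, {})
    return slot.get("value")

# A trimmed example event, shaped like Alexa's IntentRequest JSON:
sample_event = {
    "request": {
        "type": "IntentRequest",
        "intent": {
            "name": "LookUpRecipeIntent",  # hypothetical intent
            "slots": {
                # "dish" would be declared with type AMAZON.Food in the model
                "dish": {"name": "dish", "value": "scrambled egg"}
            },
        },
    }
}

print(get_slot_value(sample_event, "dish"))  # → scrambled egg
```

A skill would then use the extracted value — here, a food item recognized via the AMAZON.Food type — to drive its lookup logic.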
The four slot types join built-in slots for capturing date, time, numbers, and more in the Alexa Skills Kit (ASK), as well as seven slot types Amazon launched in beta in early October: AMAZON.Color, AMAZON.Country, AMAZON.DayOfWeek, AMAZON.Genre, AMAZON.Language, AMAZON.Month, and AMAZON.Room.
These additions follow on the heels of Alexa’s Reminders API, which enables skill developers to alert users to upcoming events and chores, and calendar availability, which allows Alexa to tap into a connected calendar and surface free times. Alexa Smart Scheduling Assistant, another recently added feature, schedules one-on-one meetings based on the availability of both meeting participants, and the Room Booking API lets third-party meeting schedulers like Joan and Robin reserve conference rooms and check room availability.
Oh, and Alexa can now talk about the midterm elections.
Amazon’s voice assistant has been gaining new capabilities at an accelerated clip ever since September, when the company debuted 11 new and refreshed Alexa-enabled devices including the Echo Sub; the 2018 Echo Dot, Echo Dot Plus, and Echo Show; and the Amazon Smart Plug. Just last week, it made the Music Skill API publicly available, allowing developers to stream songs from online services to Alexa devices and Amazon Echo speakers in the U.S. Earlier this fall, it launched the Alexa Presentation Language (APL) — a suite of tools designed to make it easier for developers to create “visually rich” skills for Alexa devices with screens such as Amazon’s Echo Show, Fire TV, Fire Tablet, and Echo Spot — in beta. And in October, it unveiled APIs to connect smart cameras and doorbells to Echo devices.