
In Natural Language Understanding (NLU), an intent is a group of user phrases that carry similar meanings. Google has made dozens of built-in intents (BIIs), covering a wide variety of request types, available with App Actions. For example, Assistant is trained to associate the phrases "Order a pizza" and "show me the dessert menu" with the ORDER_MENU_ITEM BII. Fulfillment definitions specify which parameters are expected from the user query and how those parameters should be encoded into the launch instructions.

Key term: Capability elements are templates you define in shortcuts.xml to declare the kinds of actions users can take to launch your app and jump directly to a specific task.

When a user asks Assistant to perform a task using your app, Assistant matches their query to an App Actions capability defined in your app's shortcuts.xml resource.
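As an illustrative sketch of what such a capability declaration can look like, the fragment below registers the START_EXERCISE BII and maps its exercise.name parameter onto an intent extra. The package name, activity class, and parameter key are hypothetical placeholders, not values from this codelab:

```xml
<!-- res/xml/shortcuts.xml -->
<shortcuts xmlns:android="http://schemas.android.com/apk/res/android">
  <!-- Declares that this app can fulfill the START_EXERCISE built-in intent. -->
  <capability android:name="actions.intent.START_EXERCISE">
    <intent
      android:action="android.intent.action.VIEW"
      android:targetPackage="com.example.fitapp"
      android:targetClass="com.example.fitapp.StartExerciseActivity">
      <!-- Maps the BII parameter "exercise.name" from the user's query
           onto the intent extra key "exerciseType". -->
      <parameter
        android:name="exercise.name"
        android:key="exerciseType" />
    </intent>
  </capability>
</shortcuts>
```

When Assistant matches a query such as "start my run" to this capability, it launches the target activity with the recognized exercise name delivered under the declared key.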
This codelab covers beginner-level concepts for developing with App Actions. You should have prior experience with developing Android apps and Android intents to follow this codelab. If you're new to Android development, you may instead want to get started with one of the codelabs for Android developer fundamentals. In this codelab, you'll add two App Actions built-in intents (BIIs) to a sample fitness Android app, enabling users to start and stop an exercise timer by using their voice. You'll learn how to use BIIs from the Health and Fitness category to extend Assistant to an Android app, and how to use the Google Assistant plugin for Android Studio to test your BIIs. Capabilities let Assistant know which app features support user voice requests, and how you want those requests fulfilled.
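When Assistant fulfills a capability, it typically launches the app with a deep-link URL whose query string carries the matched BII parameters. As a minimal, hedged sketch of the parsing step (in a real Android activity you would read `getIntent().getData()` as an `android.net.Uri`; here plain `java.net.URI` stands in so the logic is self-contained, and the `exerciseType` key is a hypothetical example):

```java
import java.net.URI;
import java.util.HashMap;
import java.util.Map;

public class DeepLinkParser {
    // Splits the query string of an App Actions deep link into key/value pairs.
    // Illustrative only: an Android app would use android.net.Uri's
    // getQueryParameter() on the launching intent's data instead.
    public static Map<String, String> parseParams(String deepLink) {
        Map<String, String> params = new HashMap<>();
        String query = URI.create(deepLink).getQuery();
        if (query == null) {
            return params; // no parameters were passed with the launch
        }
        for (String pair : query.split("&")) {
            String[] kv = pair.split("=", 2);
            params.put(kv[0], kv.length > 1 ? kv[1] : "");
        }
        return params;
    }

    public static void main(String[] args) {
        Map<String, String> p =
            parseParams("https://example.com/start?exerciseType=running");
        System.out.println(p.get("exerciseType")); // prints "running"
    }
}
```

The app would then use the extracted value (for example, the exercise name) to start the matching timer screen directly, which is what lets a voice request skip the app's navigation entirely.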

With App Actions, you can use Google Assistant to jump directly into app features and complete tasks using your voice. As an Android developer, you implement capability elements to add App Actions.
