During today’s event, Google talked plenty about the Google Assistant, the AI bot now found in both the Pixel phones and Google Home. While there weren’t many stand-out moments where we learned something totally unexpected about the software, we did hear about a feature that’ll surely please developers: starting in December, devs can begin creating ‘Actions on Google’ that integrate with Assistant for richer interactions with the bot. Let me explain…
See, there are two ways developers will go about these Actions: by creating Direct Actions and by creating Conversation Actions. The former is a one-shot approach to voice interaction with Assistant and is something we’re already used to with the likes of Siri, Cortana, and even Google Now. If you ask a question, you get a response. If you wanna play a song, just ask to play a song and it’ll start. If you wanna know the weather, just ask “What’s the weather?” and you’ll hear the current conditions where you are. You’ll be able to make similar single-request interactions with developers’ apps once they’re integrated as Direct Actions.
On the flip side, Conversation Actions are a little more complex. These interactions tend to be more on the conversational side of things, hence the name. Take Uber, for example: if you wanna request transportation, all you need to do is say, “I need an Uber,” and the Assistant will connect you with Uber. Once you’re greeted by the app, you’ll need to tell Uber where you’re headed and what size car you’ll need. Once that’s done, Uber will let you know who’s coming your way and in what model vehicle. Requesting a car this way is a back-and-forth exchange, so just pretend you’re having a conversation with someone when using this feature. The below GIF (via The Verge) demos this integration in use.
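To make that back-and-forth concrete, here’s a minimal sketch of how a multi-turn dialog like the Uber example might be modeled as a simple state machine. Everything here is hypothetical — the class name, the prompts, and the flow are illustrative only, since Google hasn’t published the actual Conversation Action APIs yet (they arrive in December).

```python
class RideAction:
    """Toy state machine for a ride-request dialog, modeled on the
    Uber example above. Purely illustrative -- not Google's API."""

    def __init__(self):
        self.state = "start"
        self.destination = None
        self.car_size = None

    def respond(self, user_says):
        # Each turn reads the user's utterance and advances the dialog,
        # mirroring the conversational back-and-forth of the feature.
        if self.state == "start":
            self.state = "ask_destination"
            return "Where are you headed?"
        if self.state == "ask_destination":
            self.destination = user_says
            self.state = "ask_size"
            return "What size car do you need?"
        if self.state == "ask_size":
            self.car_size = user_says
            self.state = "done"
            return (f"Booking {self.car_size} to {self.destination}. "
                    "Your driver is on the way.")
        return "Your ride is already booked."
```

A full exchange then looks like a conversation: the first `respond("I need an Uber")` call returns the destination prompt, the next turn supplies the destination, and the final turn picks the car size and confirms the booking — exactly the kind of stateful exchange a Direct Action can’t do in a single request.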
To accomplish this kind of integration, developers will need to add a level of intelligence to their apps so each function works well. Ideally, since Assistant is launching on a variety of devices, devs should be able to code their AI once and have it show up everywhere from smartphones to Google Home. According to Google, we’ll learn more about Actions on Google this December when the proper APIs become available for development. For now, check out Google’s Developers page for additional info.