Just say it: The future of search is voice and personal digital assistants

Blog
Christi Olson
Bing Evangelist
Microsoft

In the 2013 drama Her, actor Joaquin Phoenix plays a lonely writer named Theodore who falls in love with his AI digital assistant, “Samantha.” Samantha, it seems, knows Theodore better than anyone.

And while real-world, voice-search-enabled digital assistants like Apple’s Siri, Microsoft’s Cortana and Amazon’s Alexa may not yet have had anyone confess their undying love, we do know that they are quickly becoming the go-to search mode for consumers everywhere. In fact, ComScore predicts that by 2020, 50 percent of all searches will be voice searches.

Today’s digital assistants are going beyond voice input: they are evolving to understand user intent and behavior from the data available to them so they can help consumers take action.

With voice search and the adoption of personal assistants coming on strong, here are some trends and tips for digital marketers.

Voice search is increasingly mobile – but also integrated into products you use every day

Voice search tends to be more mobile and locally focused because it’s embedded into many mobile apps and devices. However, several digital assistants are integrated into products that are part of your everyday life, both at work and at home. For instance, Microsoft has integrated Cortana into Windows 10 to power both text and voice search on all devices, from your PC to your phone to the Xbox One. Amazon’s Echo is like having the Star Trek computer at the command of your voice, ready to answer your questions, play your music or audiobooks, and even control other devices in your home through connected smart-home technology. These devices are powered by data gathered across platforms and the Internet of Things, and are no longer solely tethered to mobile devices and applications.

Using voice search and digital personal assistants is becoming second nature as they are integrated into the products I use regularly, and they help me get tasks done more quickly every day. For example, it’s common for me to use voice search in my car with Ford Sync technology: I can send a text message while driving to let someone know I’m running late, or use Cortana on my PC to add an appointment to my calendar. When I’m at home I can speak to my Xbox to switch between my favorite TV channel and Netflix, and then, during the movie, ask Echo to add ice cream to my AmazonFresh shopping cart because I’m almost out.

It’s about the conversation – and using data and signals to predict user intent

Over the last 15 years, we’ve learned to express ourselves succinctly, in very few words, to traditional search engines. With voice search, spoken language connects people to what they’re searching for with an immediacy, convenience and intimacy that text-only search could never provide. Because of this, the artificial intelligence that powers voice search grows smarter with every interaction, and its understanding of user intent improves as search becomes more conversational in nature. The AI is developing a model of conversational language that understands intent and context, builds on previous queries, can work through multi-step requests, and is oriented toward actions, tasks and transactions.

For example: if I live in Seattle but have travelled to Munich, and I ask my digital assistant for directions to a hotel and then for a cab company, it should anticipate that when I ask for local restaurant recommendations, the results should be in Munich, not Seattle, based on my current geolocation and the context of my recent searches.

The search engines powering voice search and personal assistants are focusing on new ways that machine learning and artificial intelligence can analyze all of the available signals – not only data from their own products and search knowledge graphs, but also how a current query pertains to previous queries, geolocation, and your own connected data as signals of intent. As more devices and the Internet of Things feed data to the artificial intelligence platforms that power voice search, they can understand who you are, what you are doing and where you are, and infer the context of what you want from your behavior patterns and preferences. This makes voice search somewhat predictive in nature, with the ability to understand intent and anticipate what the searcher’s upcoming needs may be.
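
To make the idea concrete, here is a minimal sketch of how recent location and query history might be combined to resolve a follow-up request. It is purely illustrative: the class, rules and names below are assumptions for this example, not how Cortana, Siri or Alexa are actually built.

    # Illustrative sketch only - every name below is hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class SessionContext:
        home_city: str
        current_city: str                    # e.g. from device geolocation
        recent_queries: list = field(default_factory=list)

    def resolve_local_query(query: str, ctx: SessionContext) -> str:
        """Attach a location to a query with local intent, using session context."""
        has_local_intent = any(word in query.lower()
                               for word in ("restaurant", "hotel", "cab", "near me"))
        ctx.recent_queries.append(query)
        if not has_local_intent:
            return query
        # Prefer the traveller's current location over their home city,
        # mirroring the Seattle-versus-Munich example above.
        return f"{query} in {ctx.current_city}"

    ctx = SessionContext(home_city="Seattle", current_city="Munich",
                         recent_queries=["directions to my hotel", "cab company"])
    print(resolve_local_query("restaurant recommendations", ctx))
    # -> restaurant recommendations in Munich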


Voice searches are different – and the results should be, too

We unconsciously change our behavior when using voice search. When you are searching for a restaurant on your desktop or phone, you might type in “Best Brunch in Los Angeles.” But when you use voice search, you ask a question instead, like “What restaurant has the best brunch in Los Angeles?” or “What restaurants are open for brunch now?”

 


 

As a result, voice search queries are longer than their text counterparts – they tend to be three to five keywords in length, and they tend to explicitly ask a question, characterized by words like who, how, what, where, why and when, with the expectation that the search engine will provide an answer back.

 


 

The choice of words used in the question provides more context about user intent, which in turn can provide advertisers with more insight into where the consumer is within the purchase funnel. Are they simply researching, or ready to purchase?

For example, if a consumer asks, “What is the difference between an infant car seat and a convertible car seat?” they are likely just researching. But if they ask, “How much is a Mesa car seat?” or “Where can I buy a Mesa car seat?” they are much closer to taking action.

 


 

And remember, not all question queries are created equal. “What price” shows more intent than “what is” and sits further down the purchase path. Create and develop content that answers specific questions and matches the right level of user intent.
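
As a rough illustration, a marketer could bucket question-based queries by the patterns they contain. The patterns and funnel labels below are assumptions made up for this sketch, not a real intent model.

    # Illustrative heuristic only; real intent models are far more nuanced.
    def classify_intent(query: str) -> str:
        q = query.lower()
        transactional = ("where can i buy", "how much is", "what price", "best price")
        if any(pattern in q for pattern in transactional):
            return "ready to purchase"
        informational = ("what is", "what are", "how do", "why")
        if any(q.startswith(pattern) for pattern in informational):
            return "researching"
        return "unclear"

    for query in (
        "What is the difference between an infant car seat and a convertible car seat?",
        "How much is a Mesa car seat?",
        "Where can I buy a Mesa car seat?",
    ):
        print(query, "->", classify_intent(query))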

Understanding the nuances between conversational search queries can help you discern consumer intent and make sure that your website contains the right content to adapt to voice search.

Top tips for tapping into voice search

The continued rise of both personal assistants and voice search means that it’s more important than ever to understand how search engines provide predictive answers to our questions. It’s necessary to adapt our marketing strategies to be visible and provide answers to questions across all formats from text search to voice search.

How do savvy marketers adapt to voice search? Here are a few tips.

  1. Write content in a natural, conversational voice that answers the questions your consumers are asking: Website content in the era of voice search isn’t about keywords, it’s about semantic search and building the context related to answering a question.
  2. Build out your view of user intent based on the types of question-based queries: It’s essential to understand the intent behind a query and how both you and the search engine can deliver more accurate results based on the anticipated context. Develop content and expand your paid search keyword lists to include longer-tail phrases so you reach users at each stage of intent, based on the types of questions they are asking.
  3. Make sure structured data (schema mark-up) is integrated into your website: Structured data becomes more important because it is one of the signals used to power search results and the “cards”, or direct answers. Using structured data and schema mark-up helps the search engines crawl and understand your content more efficiently, so your pages can become part of the available knowledge graphs. A minimal example follows this list.
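
For instance, a restaurant answering the brunch questions above might embed schema.org mark-up like the following. The business details are invented for illustration; the sketch simply prints a JSON-LD block for embedding in a page.

    # A sketch of schema.org structured data for a restaurant page, emitted as
    # JSON-LD for embedding in a <script type="application/ld+json"> tag.
    # The business details are invented for illustration.
    import json

    structured_data = {
        "@context": "https://schema.org",
        "@type": "Restaurant",
        "name": "Example Brunch Spot",
        "address": {
            "@type": "PostalAddress",
            "streetAddress": "123 Main St",
            "addressLocality": "Los Angeles",
            "addressRegion": "CA",
        },
        "servesCuisine": "Brunch",
        "openingHours": "Sa-Su 09:00-14:00",
        "telephone": "+1-555-555-0100",
    }

    print('<script type="application/ld+json">')
    print(json.dumps(structured_data, indent=2))
    print('</script>')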

Personal assistants are getting smarter with every interaction, learning from our speech patterns and personal preferences to gain context, help us in our daily lives and let us focus on what matters most.

Voice search is not a thing of the future – it’s here today, and consumers are quickly adopting it. When they fall in love with their digital assistant of choice, make sure you’re there, ready to take advantage of the promise voice search and digital assistants can deliver.
