New study pans the intelligence of voice assistants

Natural language holding back true digital assistance


A new study of voice assistants has tested the capacity of Amazon Alexa, Google Assistant and Siri to help satisfy people’s queries – and the results are not brilliant.

The Nielsen Norman Group (NNG) has performed an extensive – if less than scientific – test, measuring not only the assistants’ responses but also the expectations of users. It gathered 17 participants from New York and San Francisco, all of whom were experienced voice assistant users, put them in a lab to carry out a number of smart speaker tasks, and interviewed them about their expectations and the quality of the results.

While it rated voice input as a good experience for users, and noted that support for accents was improving, it rated most other elements as “bad”. Those include using natural language, the voice output of voice assistants, and the ability to tap into users’ services (calendars, email, to-do lists) and pull out useful information.

The study found that natural language was one of the biggest barriers, with voice assistants regularly tripped up by sentences with multiple clauses:

“The majority of the participants felt that complex, multiclause sentences (such as “What time should I leave for Moss Beach on Saturday if I want to avoid traffic?” or “Find the flight status of a flight from London to Vancouver that leaves at 4:55pm today”) were unlikely to be understood by the assistants,” the study reads.

Getting the language right was a recurring theme of the study, and Alexa’s skills came under fire from the participants. One major criticism was that users must remember the exact name of a given skill; another was the rigid phrasing that commands require.

“The majority of the Alexa users did not know what skills were; some had encountered them before, installed one or two, and then completely forgotten about their existence.”

While the NNG study is far from scientific, and doesn’t generate any clear-cut conclusions, it’s refreshing for a study to put value on the expectations of users. Voice assistant tests usually mean parroting the same questions at speakers, but this one is designed as a frank assessment of the challenges Amazon, Google and Apple face in producing a true digital assistant.
