Artificial intelligence is going to do some crazy things, and acting and sounding more like a human is one of them. Google has even demonstrated how close that is, with a feature it’s working on in Google Assistant. And wow, is it crazy real.
There are a lot of buzzwords and bits of jargon that get passed around, but one of the more common ones popping up lately is “artificial intelligence”.
Also known as “AI”, artificial intelligence is what most of us think of as computers that think: more specifically, computers able to learn, thanks to complex algorithms that help them crunch through problems and, in turn, help regular humans.
It is intelligence that has been programmed to work on technology, and it has the potential to do great things.
In fact, in 2018, artificial intelligence is already doing some useful things, from the predictive text in phone keyboards, to helping you work out what movies to watch on Netflix, to the suggestions in the Music Genome Project operated by Pandora. It’s in use on your phone in all of the major virtual assistants (Amazon’s Alexa, Apple’s Siri, and Google’s Assistant), and it’s in use by major retailers to work out how to better sell products to you.
Machines that can use these complicated algorithms to sort through vast quantities of data are what make AI super cool, and super customised, too. And this week at Google I/O, when it wasn’t showing off all the great things Android P will do, Google was talking up some of the amazing ways it is making AI special this year.
One of them was truly amazing, as it made Google Assistant into a real assistant, calling up for appointments and reservations on your behalf, and actually talking to real people on the other end of the phone so you don’t have to.
It’s the sort of technology that stops you in your tracks and drops your jaw to the floor, and not just because it can make the call at all, but because Google’s latest project has found a way to evolve the AI of the Google Assistant so that it sounds closer to human than we’ve ever managed, close enough that most people would have trouble telling the difference.
Google’s improvements to its Assistant technology are part of the next phase of the project, an approach designed to make the Assistant “naturally conversational”, able to pick up on the nuances of conversation. Forget the staccato, minimalist delivery of an artificial intelligence that talks to you in segments, as if it were reading poorly off a script, because the next phase of Google Assistant will speak to you the way you speak, with the occasional “umm” and “hmm”, just like we do.
The approach will also be able to analyse phrases and statements made by the real-life people the Assistant talks to, and is powered by a technology Google calls “Duplex”, which enables Google Assistant to work out regular human speech, even when that speech moves quite fast. That makes it possible for the Assistant to talk to people in real-time and understand their intent, effectively turning the Google Assistant into a real assistant, albeit a digital one.
Google’s new Assistant will also come with the ability to talk to it without saying “Hey Google” each time, making your conversations with it feel more normal, especially when combined with its regular contextual understanding.
Though one of the other exciting changes isn’t about how you talk to it, but rather how it talks to you.
Thanks to advances in its artificial intelligence, Google Assistant will gain six new voices, with recording artist John Legend set to become one of them, too.
Legend’s voice and the rest of these features should arrive later, but for now, immerse yourself in the crazy cool abilities of Google Duplex helping Google Assistant bring its conversations to life for real in the video above. It’s just that impressive.