"Open the pod bay doors, Viv"
Hey Siri, remind me to call mum when I get home. OK Google, wake me up at 7am. More and more devices are listening to our commands. The technology has come a long way but is still far from perfect. Sure, there are a number of ways to make the same request. We can say “show me tomorrow’s weather”, “what will tomorrow’s weather be like?”, or even “will I need an umbrella tomorrow?”, but it still doesn’t feel natural and the services are fairly limited. The creators of Siri have just demonstrated a new personal assistant, Viv, that they claim is a paradigm shift for the technology.
Viv is an artificial intelligence (AI) platform that will be available on a host of devices. During the announcement, Viv and Siri co-founder Dag Kittlaus demonstrated the tech in action at TechCrunch Disrupt and gave a behind-the-scenes look at how it works. The aim is to provide a single personal assistant for all our tech that scales well and makes it easier than ever to get things done just by talking.
“Will it be warmer than 70 degrees near the Golden Gate Bridge after 5pm the day after tomorrow?”
Kittlaus started off with some standard questions about the weather that Siri or Google could handle without breaking a sweat. He then made the questions progressively more complicated, to the point where he was asking “Will it be warmer than 70 degrees near the Golden Gate Bridge after 5pm the day after tomorrow?”, and Viv was responding correctly within seconds. The novel aspect was how Viv works in the background.
Viv read the request to understand the intent and, from there, wrote its own program for getting the necessary information — in milliseconds. It recognised “Golden Gate Bridge” as a place of interest and used the appropriate services to find its location, then combined weather services with the requested time to work out the answer. All within seconds, and by figuring out how to solve the problem itself.
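Viv’s actual system wasn’t shown in detail, but the idea described — parse an intent, then compose the services needed to answer it — can be sketched in a few lines. Everything below is illustrative: the function names, the canned 68 °F forecast, and the lookup table are stand-ins, not Viv’s real services or API.

```python
# Hypothetical sketch of "dynamic program generation": resolve an
# entity, then chain stand-in services to answer the weather question.

def locate(place):
    """Stand-in geocoding service: place name -> (lat, lon)."""
    known = {"golden gate bridge": (37.8199, -122.4783)}
    return known[place.lower()]

def forecast_temp(coords, day_offset, hour):
    """Stand-in weather service: returns a forecast temperature (F)."""
    return 68  # canned value for this sketch

def answer(place, day_offset, hour, threshold):
    """Compose the two services into a tiny 'program' answering:
    'Will it be warmer than <threshold> near <place> ...?'"""
    coords = locate(place)
    temp = forecast_temp(coords, day_offset, hour)
    return temp > threshold

print(answer("Golden Gate Bridge", day_offset=2, hour=17, threshold=70))
# With the canned 68 °F forecast, this prints False
```

The interesting claim is that Viv assembles this chain of calls on the fly from the parsed question, rather than relying on a hand-written handler for every phrasing.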
The ultimate goal is for Viv’s symbol to become as recognisable as the Bluetooth or Wi-Fi symbols: if a device has the symbol, you can talk to it. Kittlaus talked a lot about how the technology scales. If everyone is making chat bots and different devices have different personal assistants (Google vs Apple), then we have to decide which assistant to speak to for a specific task, and each assistant has to relearn things about us every time we switch to a new device. He wants Viv to be the go-to assistant, available on all devices and connected to all services.
By making it easier for third-party developers to integrate their own services, Viv becomes smarter and more capable. It already works with Uber and a number of other services. During the presentation, Kittlaus chatted naturally to make transactions. Highlights included “Send Adam 20 bucks for the drinks last night”, “send my mum flowers for her birthday”, and “I need a ride for 6 people from my office to Madison Square Garden”. All of these were instantly answered by Viv with a confirmation to go ahead with the transaction. As simple as that.
Siri does work. We can control our music while running or set a timer while cooking. But when it gets confused, it just complains or falls back to a web search in the hope of finding something relevant. Viv should have access to more services and intelligently assemble programs to solve problems. If Kittlaus gets his wish and Viv becomes the default personal assistant for all devices, we could be using it to talk to our smartphones, smart-home appliances, cash machines, and more.
Viv will begin rolling out towards the end of the year and developers will get their hands on it next year to incorporate their own services.
Main image: Viv Labs