Made By Google: Google Aims to Lead the "AI First World" with Google Assistant

Nathan Anaman


Last week, Google hosted an event for their new Made by Google initiative. Anyone who watched Google I/O 2016 will recognize many common themes, but the day's primary focus was the hardware behind Google Assistant. So what's new from Google? They started off their presentation with a simple statement:

It’s time to move from a mobile first world to an AI first world.

For a while mobile was king of emerging tech, but it’s become too restrictive. Virtual assistants such as Siri and Alexa are rising up to make interacting with services easier and more modular. Rather than requiring users to fill out a form and push a button, for example, we’ll be having conversations through text or speech on many different platforms using artificial intelligence.

Google’s pushing forward with their own virtual assistant, called Google Assistant, and it’s looking very strong right now. They’re trying not only to get a foothold alongside Apple, Amazon, and Microsoft but also to build the most cohesive, comprehensive experience yet in this area. Let’s dig into Made by Google and see how they plan to do that.


Machine Learning

There wasn’t a lot of new information here. Google is going to continue making this one of their main talking points because it is at the core of almost all of their latest technologies. Virtual assistants would be useless without it.

At the start of the presentation, they made clear how much progress they’re making and how all of it will translate into better Google products. Google Photos filtering, natural language understanding, translation, text to speech, and speech to text are all built on top of machine learning. The more Google invests in it, the better their products in these areas become.

The company that makes the most progress in the area of machine learning will have a major advantage in the AI first world.



Pixel

Google jumps into the deep end of the hardware pool with Pixel, the latest Android phone. The difference this time is that they built the hardware themselves, just as Apple has been doing with the iPhone for almost a decade. It’s a beautiful phone, with the best smartphone camera on the planet, refined style, and a quick-charge capability that can give the battery seven hours of juice in just fifteen minutes. The phone’s specs were not the focus, though, as Google discussed the first part of their plan to welcome you to the “AI first” world.

The Pixel is the first phone with Google Assistant built in, which is not surprising since Google Assistant hasn’t been around for very long. Virtual assistants have, however, and it’s easy to see the difference first-party hardware can make with a feature that should be available anywhere on the phone. Look at Siri usage on the iPhone compared to the Cortana app, for example. Deep integration with the system is paramount so assistants can access all relevant services and give the user one point of interaction.

This was fully apparent during their demo. They used the Google Assistant to search for restaurants, make reservations using OpenTable, and call an Uber. The assistant was context aware and readily available, usable from any screen, including within a text message conversation. A new feature on display called on the assistant to make recommendations based on what was currently on the screen.

The Pixel will be a cornerstone of the Google Assistant ecosystem, one Google continued to build out with their in-home hardware updates.

Google Home

Google Home is voice interaction for the Google Assistant. It just happens to live in a box. It has impressive speakers and an even more impressive microphone setup, with a touch-interactive top panel and a clean design. Several features come built in: My Day summarizes your calendar and any other information relevant to your day, such as driving times and weather, and a shopping list can be accessed from the Google Assistant across any of your devices.

Ultimately, Google Home is not groundbreaking on paper. It’s another piece of the Google Assistant experience, and its success depends on the effectiveness of the ecosystem. Chromecast and the newly announced Chromecast Ultra are the same: impressive, affordable hardware that is part of Google’s ultimate vision for a contextual experience that spans many devices.

Google Assistant

With the announcement of the Pixel and Home, Google is making moves to elevate themselves above their competitors. Apple has a solid on-phone experience with Siri. Amazon has established Alexa as the premier in-home voice assistant. Chat services such as WhatsApp and Slack have welcomed chatbots into our workflows, allowing us to interact with our favorite services through conversation. That is the AI first world.

Google brings it all together by providing the in-phone experience, the in-home experience, and the foundation for both to be successful. Google Search will provide the best knowledge base for assistant queries. Google Calendar, Contacts, and Gmail are already some of the most used services on the planet, and these are the services people would want their assistant to work with on a daily basis. Chromecast allows control over the television and streaming services. Integration between Home and Pixel would mean potentially being able to access all of your phone’s functionality from anywhere in the house.

All these pieces come together to form an ecosystem that is fully integrated, extensively tested, and built on the bleeding edge of artificial intelligence - Made by Google.