The Top 5 Announcements from Google Cloud Next '18

From bringing intelligence to the enterprise to uniting the Cloud and On-Prem, here are the five biggest announcements to come out of Google Cloud Next '18.

1. AI and the Enterprise

Google Cloud Platform’s artificial intelligence (AI) capabilities took center stage at Next '18. If your organization already uses G Suite, new AI-powered features will be coming in a few weeks to your favorite collaboration tools such as Hangouts, Docs, and Gmail. The three most anticipated features are Smart Reply, which can recognize urgent Hangouts messages and propose personalized responses; Smart Compose, which intelligently autocompletes emails using common phrases; and Grammar Suggestions, which builds on the spell-check function by identifying grammatical errors and suggesting corrections. Existing Google Voice users will also be happy to gain access to beta features that integrate Google AI and machine learning (ML) built specifically for the enterprise. Voicemail transcription and spam filtering mark a major step toward building automated platforms that can handle customer requests. Contact Center AI, a new scalable Google solution for connecting businesses and customers, is still in its infancy, but further developments in AI are inevitable. While these advances offer Google customers the benefits of AI, their adoption depends on how seamlessly they are woven into the digital experience.


2. Cloud Build

Cloud Build is Google’s answer to AWS CodeBuild and Azure’s Team Services: a managed service for build, test, and deployment pipelines. With Cloud Build, a business can leverage Google Compute Engine to run robust pipelines that automatically scale with demand. To speed things up, Cloud Build spins up new resources as existing ones complete their jobs, eliminating the bottleneck caused by traditional build queues. Cloud Build also supports custom workers for businesses confined to on-premises build environments. Google Cloud Build offers the versatility of the cloud to empower application development and meet the growing needs of DevOps teams.
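To make this concrete, here is a minimal sketch of submitting a build programmatically with the google-cloud-build Python client. The project ID, repository, image name, and build steps are hypothetical placeholders, and most teams would express the same pipeline declaratively in a cloudbuild.yaml file fired by a trigger on each commit; this is just the programmatic equivalent.

```python
# Minimal sketch: submit a two-step build (build + push a container image)
# with the google-cloud-build Python client. Project, repo, and image names
# are hypothetical placeholders.
from google.cloud.devtools import cloudbuild_v1

client = cloudbuild_v1.CloudBuildClient()

build = cloudbuild_v1.Build(
    # Source pulled from a (hypothetical) Cloud Source Repositories repo.
    source=cloudbuild_v1.Source(
        repo_source=cloudbuild_v1.RepoSource(
            repo_name="my-app", branch_name="master"
        )
    ),
    steps=[
        # Each step runs in its own container; Cloud Build provisions the
        # underlying Compute Engine workers and scales them as needed.
        cloudbuild_v1.BuildStep(
            name="gcr.io/cloud-builders/docker",
            args=["build", "-t", "gcr.io/my-project/my-app:latest", "."],
        ),
        cloudbuild_v1.BuildStep(
            name="gcr.io/cloud-builders/docker",
            args=["push", "gcr.io/my-project/my-app:latest"],
        ),
    ],
    images=["gcr.io/my-project/my-app:latest"],
)

# create_build returns a long-running operation; result() blocks until the
# build finishes (or fails).
operation = client.create_build(project_id="my-project", build=build)
result = operation.result()
print(result.status)
```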


3. Cloud AutoML Vision, Natural Language, and Translation

With the introduction of Cloud AutoML, Google lowers the barrier to bringing AI into the enterprise. Cloud AutoML can help businesses solve machine-learning problems in three areas: Vision, Natural Language, and Translation.

Google has invested capital and countless hours in building a cloud environment that scales as a business grows, and along the way it has amassed enormous amounts of data. With that data and infrastructure, Google has built and trained industry-leading ML models at a scale most startups can’t match. Now that Google has done the hard part, companies can customize Cloud AutoML without reinventing the wheel. For example, with AutoML Vision you provide only a set of images labeled for your use case, and Google Cloud takes care of the rest, leaving you with a trained computer-vision model ready for any application.
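As a rough illustration of how little ML plumbing is left for the customer, the sketch below sends one image to an already-trained AutoML Vision model using the google-cloud-automl Python client (the v1beta1 surface available around the time of Next '18). The project, region, model ID, and file name are hypothetical.

```python
# Minimal sketch: classify one image with a custom AutoML Vision model.
# Assumes a model has already been trained from your labeled images;
# the project, location, and model ID below are hypothetical placeholders.
from google.cloud import automl_v1beta1 as automl

prediction_client = automl.PredictionServiceClient()

# Fully qualified resource name of the trained model.
model_name = "projects/my-project/locations/us-central1/models/ICN1234567890"

with open("shirt.jpg", "rb") as image_file:
    content = image_file.read()

# The service handles preprocessing, inference, and scoring.
payload = {"image": {"image_bytes": content}}
response = prediction_client.predict(name=model_name, payload=payload)

for annotation in response.payload:
    print(annotation.display_name, annotation.classification.score)
```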

This frees companies to deliver best-in-class AI while focusing on what they do best. For example, research facilities studying skin-cancer detection or retail shops selling merchandise from pictures can concentrate on designing beautiful user experiences, web apps, and mobile apps; they don’t need to hire a team of data scientists.


4. Istio and Google Cloud Services Platform

With Cloud Services Platform, businesses can take advantage of Google’s years of experience running reliable applications at production scale. The Cloud Services Platform’s core components, Kubernetes (for container orchestration) and Istio (for monitoring and securing microservices), are complemented by innovative Kubernetes-based services like Knative (building blocks for running serverless workloads on Kubernetes) and GKE On-Prem (a GCP-managed Kubernetes cluster that runs on-premises), all tied together by Cloud Build for continuous integration. Google rounds out its suite of enterprise tools with a menu of open-source technologies, the same ones that power Google’s international fleet of production services.

Istio does not require changes to how code is managed or configured, yet it provides a whole suite of monitoring and troubleshooting tools that are easy to implement and make it quick to track down problems. Istio enables security best practices for services running in a cluster and automatically provides traffic splitting, monitoring, and application logs and traces; it lets you see how services communicate and drill down to problem areas. Built-in tools based on the SRE practice help teams set service-level objectives (SLOs) by tracking service-level indicators (SLIs) at the application level, things like uptime or CPU load. Building on these metrics, Istio also supports error budgets to track how well an application is performing relative to its defined objectives.
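Traffic splitting is a good example of how little Istio asks of the application itself. Below is a minimal sketch that uses the kubernetes Python client to create an Istio VirtualService sending 90 percent of traffic to one version of a hypothetical reviews service and 10 percent to a canary. The service, namespace, and subset names are illustrative, a DestinationRule defining the v1 and v2 subsets is assumed to already exist, and in practice most teams would apply the same manifest with kubectl.

```python
# Minimal sketch: weighted traffic splitting with an Istio VirtualService,
# created through the Kubernetes custom-objects API. Service, namespace,
# and subset names are hypothetical placeholders.
from kubernetes import client, config

config.load_kube_config()  # uses the current kubectl context
api = client.CustomObjectsApi()

virtual_service = {
    "apiVersion": "networking.istio.io/v1alpha3",
    "kind": "VirtualService",
    "metadata": {"name": "reviews", "namespace": "default"},
    "spec": {
        "hosts": ["reviews"],
        "http": [
            {
                "route": [
                    # 90% of requests keep hitting the stable version...
                    {"destination": {"host": "reviews", "subset": "v1"}, "weight": 90},
                    # ...while 10% are shifted to the canary.
                    {"destination": {"host": "reviews", "subset": "v2"}, "weight": 10},
                ]
            }
        ],
    },
}

api.create_namespaced_custom_object(
    group="networking.istio.io",
    version="v1alpha3",
    namespace="default",
    plural="virtualservices",
    body=virtual_service,
)
```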

Microservice adoption, shorter release cycles, and getting features into production faster are business priorities, and Google knows that in such a fast-paced delivery environment a business cannot realistically maintain 100 percent uptime. By adhering to the SRE philosophy, though, teams can set and meet realistic reliability goals.
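To make those reliability goals concrete, here is a back-of-the-envelope calculation (plain arithmetic, not a specific Istio or Stackdriver API) of how much downtime different SLO targets actually leave a team to spend in a 30-day window:

```python
# Back-of-the-envelope error-budget math: an SLO below 100% uptime leaves a
# concrete amount of "allowed" downtime per 30-day window.
MINUTES_PER_30_DAYS = 30 * 24 * 60  # 43,200 minutes

for slo in (0.99, 0.999, 0.9999):
    error_budget_minutes = (1 - slo) * MINUTES_PER_30_DAYS
    print(f"SLO {slo:.2%}: ~{error_budget_minutes:.0f} minutes of downtime per 30 days")

# SLO 99.00%: ~432 minutes of downtime per 30 days
# SLO 99.90%: ~43 minutes of downtime per 30 days
# SLO 99.99%: ~4 minutes of downtime per 30 days
```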


5. Google Kubernetes Engine On-Premises

GKE On-Prem is Google’s next foray into the hybrid-cloud space. A clear majority of enterprise customers migrate to the cloud with multi-cloud and hybrid-cloud deployments, and Google notes that while server costs are decreasing, administration costs for hybrid deployments are increasing dramatically. Google calls the problem a “false dichotomy between on-premises and cloud”: nothing customers do in the cloud is fundamentally different from what they do on-prem. They still establish security policies, monitor the health of their setups, debug operational problems, and more. Google’s rebuttal is simple: customers shouldn’t have to do all of this twice.

The solution is straightforward: let customers run the desired Google Cloud product (GKE) on premises with a consistent deployment, monitoring, and administration experience. GKE On-Prem revolves around customer experience: provision GKE on top of the widely used VMware vSphere, and it behaves as if it were running in any Google Cloud Platform zone. The GKE On-Prem cluster appears on the GKE dashboard in Stackdriver and uses the same (wonderful) APIs and consistent primitives you expect from any Google Cloud product.

Google also brings sanity to DevOps. Engineers can now maintain fewer Terraform modules across the hybrid cloud platform. They can also use GKE to solve “edge-compute” problems like simplifying deployments to on-site retail stores just as they would roll out a containerized service update to any cloud cluster. (Watch the Google Next breakout session “Kubernetes, Kubernetes, As Far As the Eye Can See!” to learn how Target uses Kubernetes to manage its deployments to 2,800+ retail stores.)
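As a sketch of what that consistency buys, the snippet below uses the kubernetes Python client to roll the same container image out to a cloud GKE cluster and an on-prem cluster simply by iterating over kubectl contexts. The context names, namespace, deployment, and image are hypothetical placeholders.

```python
# Minimal sketch: push the same image tag to a cloud GKE cluster and a
# GKE On-Prem cluster by iterating over kubectl contexts. Context names,
# namespace, and deployment name are hypothetical placeholders.
from kubernetes import client, config

NEW_IMAGE = "gcr.io/my-project/store-app:v2"
CONTEXTS = ["gke_my-project_us-central1-a_cloud", "gke-on-prem-store-042"]

for context in CONTEXTS:
    # Load credentials for this cluster from the local kubeconfig.
    api_client = config.new_client_from_config(context=context)
    apps = client.AppsV1Api(api_client)

    # Patch only the container image; the rest of the Deployment is untouched.
    patch = {
        "spec": {
            "template": {
                "spec": {"containers": [{"name": "store-app", "image": NEW_IMAGE}]}
            }
        }
    }
    apps.patch_namespaced_deployment(
        name="store-app", namespace="default", body=patch
    )
    print(f"Rolled {NEW_IMAGE} out to {context}")
```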

Because Google Cloud Platform excels at managed containerization, IAM and security, and hybrid network connectivity, and now offers these capabilities to businesses managing private data centers, the walls between the cloud and on-premises gardens are sure to get blown open very soon.

Watch a video here of Ryan Maguire, our Director of AI, speaking at Google Cloud Next about hybrid machine learning.

This post was co-authored by Brian Anderson, Matt Griffin, Don Johnson, and Raj Singh.