Google Cloud Next 2018 Conference Wrap Up


Last week, Google Cloud held its latest Next conference in San Francisco, unveiling new AI features, partnerships, tools and technologies to build out its ecosystem and make its services more attractive to developers. The conference was Google Cloud’s largest to date with over 20,000 registered attendees.

Alongside Amazon Web Services and Microsoft Azure, Google Cloud is already one of the top three cloud providers. From its beginnings, however, Google has specialized in working with smaller companies and startups; with its latest serious investments in infrastructure and its push into deepening its big data and machine learning capabilities, the company is now set to equal its rivals as a contender for medium and large enterprise clients.

Diane Greene, Google Cloud’s CEO, gave the day 1 keynote, focusing on the ways in which the cloud is changing business practices (“everybody is going to move to the cloud… just know, we’re here to partner with you – to help you disrupt in a non-disruptive way”) and outlining Google’s latest major developments in security and AI. As Greene phrased it, “Security is the number one worry, and AI is the number one opportunity.”

Almost a dozen new security features and tools were unveiled across the conference, following on from the twenty announced in March. Greene also used the keynote to discuss some of the fields Google intends to expand into over the coming months and years, healthcare in particular: she announced that Google is partnering with the Broad Institute on genome-processing tools and with the National Institutes of Health to help analyze huge biomedical data sets.

Cloud Services Platform

The Cloud Services Platform will be available in alpha this fall. It bundles many of the new features unveiled at Next, including GKE On-Prem, Istio, observability tooling and centralized policy enforcement. AI will play an increasingly central role across the platform; Google already uses it to power Smart Compose, check grammar in Google Docs and detect spam in Gmail.

Some of the other most intriguing new features announced at Next include:

Edge TPUs and Google Cloud IoT Core

Google made a surprising IoT hardware announcement, introducing its new Edge TPU (Tensor Processing Unit), an AI-accelerator ASIC (application-specific integrated circuit) developed specifically for neural network machine learning.

“The new Edge TPU is so small four of them can fit on top of a penny and can fit in your smallest sensors,” said Injong Rhee, Google’s Vice President of IoT Cloud. “We designed this to be highly focused on performance per dollar and performance per watt. It brings a brain to your endpoint devices at an extremely low cost.”

The new Edge TPUs will work in conjunction with Cloud IoT Core, Google’s fully managed service for developers to connect, manage and ingest data from geographically dispersed IoT devices. The service was first introduced to select users in alpha in May 2017, then expanded to a beta available to all users as of September 2017. It functions as part of Google’s broader Cloud IoT service, enabling both the ingestion of IoT data and its analysis.
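To give a sense of the device side of this flow, here is a minimal sketch of how a sensor might publish telemetry to Cloud IoT Core over its MQTT bridge, using the paho-mqtt and PyJWT libraries; the project, region, registry, device ID and key file below are hypothetical placeholders.

```python
import datetime

import jwt                      # PyJWT, with the 'cryptography' backend for RS256
import paho.mqtt.client as mqtt

# All identifiers below are hypothetical placeholders.
PROJECT_ID = "my-project"
REGION = "us-central1"
REGISTRY_ID = "my-registry"
DEVICE_ID = "my-sensor"

# Devices authenticate to the MQTT bridge with a short-lived JWT signed by the
# device's private key; IoT Core verifies it against the registered public key.
now = datetime.datetime.utcnow()
token = jwt.encode(
    {"iat": now, "exp": now + datetime.timedelta(minutes=60), "aud": PROJECT_ID},
    open("rsa_private.pem").read(),   # hypothetical device key file
    algorithm="RS256",
)

client = mqtt.Client(
    client_id=(f"projects/{PROJECT_ID}/locations/{REGION}"
               f"/registries/{REGISTRY_ID}/devices/{DEVICE_ID}"))
client.username_pw_set(username="unused", password=token)  # username is ignored
client.tls_set()                                           # TLS is required
client.connect("mqtt.googleapis.com", 8883)

# Messages published to the device's events topic are forwarded to Cloud Pub/Sub.
client.publish(f"/devices/{DEVICE_ID}/events", '{"temperature": 21.5}', qos=1)
client.loop(2)  # give the network loop a moment to flush the publish
```

Telemetry published this way lands in a Cloud Pub/Sub topic, from which downstream analytics pipelines can consume it.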

In February, Google acquired Xively, LogMeIn’s IoT device management platform, in order to bolster its capabilities in that space. The new Edge TPU builds on this acquisition as a way of bringing more processing to the edge, closer to the end user. At Next, Rhee shared numerous potential use cases, including “thousands of these in a city in traffic cameras” connected to the Google Cloud Platform, which could be used to analyze traffic.
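The Edge TPU is designed to execute quantized TensorFlow Lite models on-device. As a rough illustration of that inference flow, here is a sketch using the standard TensorFlow Lite interpreter; the model file and dummy input are hypothetical, and this runs on the CPU interpreter rather than actual Edge TPU hardware.

```python
import numpy as np
import tensorflow as tf

# Load a quantized TensorFlow Lite model (the model file here is hypothetical);
# recent TensorFlow releases expose the interpreter as tf.lite.Interpreter.
interpreter = tf.lite.Interpreter(model_path="traffic_classifier.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a single dummy frame shaped to the model's input; quantized vision
# models typically take uint8 tensors such as [1, height, width, 3].
frame = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()

scores = interpreter.get_tensor(output_details[0]["index"])
print(scores)  # per-class scores for the frame
```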

Google’s Cloud Tensor Processing Unit is an ASIC designed from the outset for machine learning, powering multiple major Google products, including Search, Assistant, Gmail and Translate. Cloud TPU offers the same accelerator service to external businesses, helping them “speed up their machine learning workloads on Google Cloud”.
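As a sketch of what that acceleration looks like in practice, here is a minimal training job using the TPUEstimator API from the TensorFlow 1.x releases of the period; the TPU name, GCS bucket and toy model are hypothetical, and real workloads would substitute their own model_fn and input pipeline.

```python
import tensorflow as tf  # TensorFlow 1.x

def model_fn(features, labels, mode, params):
    # A toy one-layer classifier; real workloads supply their own model.
    logits = tf.layers.dense(features["x"], 10)
    loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits)
    optimizer = tf.train.GradientDescentOptimizer(0.01)
    # CrossShardOptimizer aggregates gradients across the TPU's cores.
    optimizer = tf.contrib.tpu.CrossShardOptimizer(optimizer)
    train_op = optimizer.minimize(loss, global_step=tf.train.get_global_step())
    return tf.contrib.tpu.TPUEstimatorSpec(mode=mode, loss=loss, train_op=train_op)

def input_fn(params):
    # TPUEstimator injects the per-shard batch size; shapes must be static.
    batch = params["batch_size"]
    ds = tf.data.Dataset.from_tensor_slices((
        {"x": tf.random_uniform([1024, 784])},
        tf.random_uniform([1024], maxval=10, dtype=tf.int32),
    ))
    return ds.repeat().batch(batch, drop_remainder=True)

resolver = tf.contrib.cluster_resolver.TPUClusterResolver(tpu="my-tpu")  # hypothetical
config = tf.contrib.tpu.RunConfig(
    cluster=resolver,
    model_dir="gs://my-bucket/model",  # hypothetical bucket
    tpu_config=tf.contrib.tpu.TPUConfig(iterations_per_loop=100),
)
estimator = tf.contrib.tpu.TPUEstimator(
    model_fn=model_fn, config=config, use_tpu=True, train_batch_size=128)
estimator.train(input_fn=input_fn, max_steps=500)
```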

Enhancements to G Suite, including Google Voice Enterprise Version

G Suite’s applications have 1.4 billion monthly active users. At Next, the company announced several new features for the popular suite, including new functionality in Cloud Search, which will help companies securely index third-party data beyond G Suite, whether in the cloud or on-prem. The general availability of “New Gmail” was also announced, offering customers new security warnings along with features such as smart reply, offline access and email snoozing (temporarily hiding a message so that it pops back up later, when you have time to tackle it). A new tool panel inside Gmail also offers easy access to the revamped Tasks, Keep and Calendar, and further add-ons will be included soon. To switch to the new Gmail, users simply need to click the gear button and then select “Try the new Gmail”.

Garrick Toubassi, VP of engineering for Google, announced the addition of an enterprise version of Google Voice on day 2. “Google Voice is already beloved by millions of consumers,” he said.

The enterprise-ready iteration is available via an Early Adopter program. It takes telephony beyond just audio, using context and intelligence to provide features such as voicemail transcription and call filtering. Previously, Google depended on third parties to offer telephony to its G Suite customers.

Toubassi also announced that Google Drive Enterprise can now be purchased as a standalone offering; he said it would “immediately allow employees to be more productive and collaborative without disrupting mail or calendar or other legacy tools”.

Shielded VMs

Part of Google’s rollout of new cloud security features is Shielded Virtual Machines (VMs): “hardened virtual machines”, offered as part of its Cloud Platform, that protect against numerous types of cyberattack, including the installation of persistent malware such as rootkits.

The Shielded VMs feature was launched in beta at the Next conference. Shielded VMs employ a cryptographically protected baseline measurement of the VM’s image, which offers a means of “tamper-proofing” virtual machines, along with an alert system that notifies owners of any change in their VMs’ runtime state. Shielded VMs additionally help stop a virtual machine from being booted in a different context than its original deployment, i.e. preventing theft of VMs through “snapshotting” or other means of duplication.

Outright remote hacks of virtual machine instances on the major cloud provider platforms are rare; theft of administrative credentials via spear-phishing attacks, however, is an easier route in.

The major cloud providers have been attempting to reduce threats to virtual machines and cloud application containers in various ways, for example through hardened operating system images for virtual machines and “confidential computing” models that prevent a compromise of the machine’s operating system from granting access to application data.

Google and Microsoft have each previously launched confidential computing technologies aimed at keeping user data secret, even from the cloud providers themselves: Google introduced its Asylo framework for building “enclaved” apps for the cloud in May 2018; Microsoft introduced Azure Confidential Computing the previous September. Both platforms run application containers in enclaves, i.e. “trusted execution environments”, which prevent the data within from being read by anything running on the underlying operating system or virtual environment.

As Chris Vickery, director of cyber-risk research at cloud security firm UpGuard, told Ars Technica, human error and system misconfiguration often leave the door wide open for an attacker. “A more common situation would be that someone left AWS credentials in a Github repo that was exposed to the public and forgot to limit the permissions on the credentials in the first place,” Vickery said. With access to the credentials, an attacker could make a snapshot of virtual machines or storage “and then migrate the snapshots over to an account owned by [the attacker] for pilfering,” he said. They could also potentially gain access to the virtual machine itself and drop rootkits or other malware providing persistent access.

Shielded VMs use a mix of firmware-based UEFI Secure Boot and a vTPM, a virtual Trusted Platform Module that can generate and store “sealed” encryption keys. The keys are used for (i) Secure Boot, which guarantees that the VM only runs authenticated software and prevents malicious code from being loaded early in the boot sequence; and (ii) Measured Boot, which checks the VM’s configuration against earlier baselines to offer greater control over the VM’s integrity ahead of launch.
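As an illustration of how these protections are requested at instance-creation time, here is a hedged sketch using google-api-python-client against the Compute Engine API; the project, zone, instance name and image are placeholders, and the options are shown under the shieldedInstanceConfig field of the GA API (the beta surface at launch may have differed).

```python
# Requires google-api-python-client and Application Default Credentials.
from googleapiclient import discovery

compute = discovery.build("compute", "v1")

body = {
    "name": "shielded-demo",  # hypothetical instance name
    "machineType": "zones/us-central1-a/machineTypes/n1-standard-1",
    "disks": [{
        "boot": True,
        "autoDelete": True,
        "initializeParams": {
            # Shielded VMs need an image that supports UEFI boot.
            "sourceImage": "projects/ubuntu-os-cloud/global/images/family/ubuntu-1804-lts",
        },
    }],
    "networkInterfaces": [{"network": "global/networks/default"}],
    "shieldedInstanceConfig": {
        "enableSecureBoot": True,           # reject unsigned boot components
        "enableVtpm": True,                 # virtual TPM for sealed keys
        "enableIntegrityMonitoring": True,  # compare boot measurements to baseline
    },
}

compute.instances().insert(
    project="my-project", zone="us-central1-a", body=body).execute()
```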

Titan Security Key

One of the most intriguing security announcements at Next was the new Titan Security Key, a physical key that offers enhanced protection against phishing attacks, in which hackers create fraudulent websites that ask for 2-step verification codes in order to steal credentials. The security key simplifies the 2-step verification process: the user just needs to tap the button on the physical key itself instead of retyping codes into their devices. No phone number is required, unlike in other 2-step verification methods. The Titan key is particularly aimed at admins and other “high-value users” who have access to sensitive data and systems within G Suite, Google Cloud Platform and Cloud Identity.

“We’ve long advocated the use of security keys as the strongest, most phishing-resistant authentication factor for high-value users, especially cloud admins, to protect against the potentially damaging consequences of credential theft,” Jennifer Lin, product management director at Google Cloud, said in a blog post.

In addition to offering enhanced physical security capability, Google is also giving cloud administrators more tools to set policies controlling access to different components of the cloud. Context-aware access, for instance, lets admins define and enforce access to Google Cloud Platform APIs, G Suite productivity tools, third-party SaaS apps and more, at a granular level. The context-aware capabilities are initially being made available to VPC Service Controls users, and will later be added to the Cloud Identity and Access Management, Cloud Identity-Aware Proxy and Cloud Identity services.

Google’s Enterprise Journey

Google currently holds only 6% of the total cloud market, trailing its nearest competitors, with AWS in the lead at 33% and Microsoft at 13%. Google is pulling in between $1 billion and $2 billion per quarter, a fraction of what AWS and Microsoft are taking in. However, the market remains very much open, with 80-90% of enterprises still running their workloads on premises.

As the Next conference made evident, Google Cloud intends to make serious further inroads into this space. The company is investing a huge amount of resources to change businesses’ perception of it as a provider focused on developer needs rather than on the enterprise as a whole.

Various analyst firms have recently singled out Google’s security and networking capabilities for praise: Forrester Research, for instance, named Google Cloud a leader in native security among public cloud platform providers in Q2 2018, placing it ahead of seven other vendors.

Such recognition helps in an environment in which minds still need changing before customers will embrace the kind of digital transformation required to benefit from Google Cloud’s advanced technology and AI features.

One example of this is Twitter’s recent decision to become a GCP customer. At the conference, Twitter detailed its reasons for switching from AWS to Google Cloud.

“We operate at massive scale with a relatively small team,” said Twitter CTO Parag Agrawal. Even though the social media giant has constructed a lot of its own cloud infrastructure, Agrawal said Twitter particularly stood to benefit from the ability of Google’s cloud to help its data scientists better understand the petabytes of data it rapidly collects at scale.

Agrawal furthermore cited the performance of GCP as the primary reason for the move to Google.

“We did an extensive technical evaluation, and GCP performed best for ad hoc analysis and the network performance is so good that it’s a huge advantage for us to be able to do storage and compute separately,” Agrawal said.
