
Top 3 IT Infrastructure & App Modernisation Updates from Google Cloud Next 2024

At Google Cloud Next 2024 in Las Vegas, 9-11 April, some team members of the Devoteam IT infrastructure & app modernisation tribe were present and ready to learn. Read their top 3 highlights regarding IT infrastructure and app modernisation announced at Google Cloud’s yearly flagship event, with one common denominator: the AI wave reaching the field of IT infra & app modernisation.

This article is written by Devoteam G Cloud’s Global CTO Jason Quek.

Let’s talk about the future of Google Cloud, where we can see a wave coming. That wave is the AI wave, which blends AI into our daily way of working across all the main Google Cloud domains: IT infra & app modernisation, data, and work transformation. It was very present across the entire Google Cloud Next 2024 event in Las Vegas.

1. Gemini for Google Cloud

That brings us to the first big announcement in IT infrastructure modernisation: “Gemini for Google Cloud”. This is the first step in embedding generative AI assistants platform-wide: “Gemini Code Assist” for developers, “Gemini Cloud Assist” for platform engineers, “Gemini in Security Operations” for security engineers, and “Gemini in Databases” for database engineers and migrations.

These are currently in preview, but Devoteam has also been in multiple trusted tester programs, giving feedback to the product teams to evolve the offering and acting as the voice of the customer. 

Gemini Pricing Model

The only roadblock we see is the pricing model, which has been evolving into a per-user licence fee, similar to Gemini for Workspace.

I believe that moving to the cloud has always been about a pay-for-what-you-use experience, rather than a campaign to move customers onto certain SKUs.

I would like to challenge Google to use a more innovative and fair pricing model, similar to the pay-per-query model in BigQuery: if platform engineers use the Gemini assistant, they pay per query.

That would feel fair to me: the better the assistant becomes, the more users would use it. Usage would be less predictable, but that is how BigQuery won the hearts and minds of many of its proponents, and the same goes for Kubernetes and Cloud Run: we pay for what we use.
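
To make the argument concrete, here is a minimal sketch in Python comparing the two pricing models. All prices, team sizes and query counts below are hypothetical illustrations, not actual Gemini or Google Cloud pricing.

    # Hypothetical comparison of per-seat vs. pay-per-query pricing for an AI assistant.
    # All numbers are illustrative assumptions, not actual Google Cloud prices.
    SEAT_PRICE_PER_MONTH = 20.0   # assumed flat monthly licence fee per user
    PRICE_PER_QUERY = 0.01        # assumed cost per assistant query

    def per_seat_cost(licensed_users: int) -> float:
        """Cost when every licensed user pays a flat monthly fee."""
        return licensed_users * SEAT_PRICE_PER_MONTH

    def pay_per_query_cost(queries_per_user: list[int]) -> float:
        """Cost when only actual usage is billed."""
        return sum(queries_per_user) * PRICE_PER_QUERY

    # A team of 50 platform engineers, of whom only 10 use the assistant heavily.
    usage = [300] * 10 + [0] * 40
    print(per_seat_cost(50))          # 1000.0 -> paid regardless of adoption
    print(pay_per_query_cost(usage))  # 30.0   -> scales with what is actually used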

2. Google Axion Processor

The second big announcement was the Google Axion Processor. This was actually one of the first announcements in the keynote, and it is noteworthy news.

Impact on other CPU providers for Google Cloud

Devoteam is also a partner of Intel, as we help customers move to Google Cloud VMware Engine, which is only available on Intel processors. This news might impact Intel and AMD, which currently provide the CPUs for Google Cloud.

But we think this will actually bring more price-sensitive workloads onto Google Cloud, rather than steal market share from Intel and AMD.

How to redefine your VMware Strategy?

Get inspired at our online event on May 22nd. Hear from our customer Wessling, Google Cloud and Devoteam experts.

Security considerations for custom hardware by Google

Another side effect of custom hardware by Google is security, which is built in from the ground up in all Google products. It also means that attacks targeting Intel and AMD architectures will not compromise Google's Arm-based chip architecture.

Availability of Google Axion Processor

The Google Axion Processor will be available on Google Kubernetes Engine, Dataproc, Google Compute Engine and Cloud Batch. It will become a price competitor to on-prem systems that have already been depreciated and are running on extended-life warranties.
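
Machine type names for Axion-based VMs were not yet public at the time of writing, so as an illustration, here is a minimal Python sketch that provisions an Arm-based VM on Compute Engine with the google-cloud-compute client, using the existing Tau T2A Arm series as a stand-in; an Axion-based machine type should slot into the same workflow. Project, zone and resource names are placeholders.

    # Minimal sketch: create an Arm-based Compute Engine VM with the Python client.
    # Uses the existing Tau T2A Arm series as a stand-in for a future Axion-based
    # machine type. Project, zone and names are placeholders.
    from google.cloud import compute_v1

    project, zone = "my-project", "us-central1-a"

    boot_disk = compute_v1.AttachedDisk(
        boot=True,
        auto_delete=True,
        initialize_params=compute_v1.AttachedDiskInitializeParams(
            source_image="projects/debian-cloud/global/images/family/debian-12-arm64",
            disk_size_gb=20,
        ),
    )

    instance = compute_v1.Instance(
        name="arm-demo-vm",
        machine_type=f"zones/{zone}/machineTypes/t2a-standard-4",  # Arm machine type
        disks=[boot_disk],
        network_interfaces=[compute_v1.NetworkInterface(network="global/networks/default")],
    )

    operation = compute_v1.InstancesClient().insert(
        project=project, zone=zone, instance_resource=instance
    )
    operation.result()  # block until the VM is created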

Lower carbon footprint & sustainability targets

The processor is also more energy-efficient than current-generation standard x86 instances, which helps companies that want to lower their carbon footprint hit their sustainability targets.

3. AI Anywhere on Google Distributed Cloud

The third big news for us is AI Anywhere on Google Distributed Cloud. As a Google Cloud partner, Devoteam has been a strong proponent of Anthos and GKE On-Prem from the beginning in 2019, as Google invested in running Kubernetes on VMware, bare metal, Azure and AWS, all while being monitored and operated from Google Cloud.

Also see this dedicated article by our Devoteam EMEA App Modernisation Practice Lead for Google Cloud, Matthieu Audin.

Deploying Gemma models onto Google Distributed Cloud

Devoteam has invested similarly in training our consultants on the product, which resulted in three Google Cloud Certified Fellows in Hybrid and Multi-Cloud (a programme that has since transitioned into the Champion Innovators program). As a result, in 2024, besides running applications and databases on Anthos (now repackaged as Google Distributed Cloud), customers are able to deploy Gemma models onto Google Distributed Cloud.
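
As an illustration of what such a containerised model could look like, here is a minimal Python sketch of serving code that could be packaged into a container image and scheduled on Google Distributed Cloud. It assumes the Hugging Face Transformers runtime and the openly available google/gemma-2b weights; it is not the official GDC deployment tooling.

    # Minimal sketch: load the open Gemma 2B weights and generate text.
    # Illustrative code that could be packaged into a container image and run on
    # Google Distributed Cloud; not the official GDC deployment tooling.
    # Assumes the Hugging Face transformers library and access to google/gemma-2b.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "google/gemma-2b"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    def generate(prompt: str, max_new_tokens: int = 64) -> str:
        inputs = tokenizer(prompt, return_tensors="pt")
        outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
        return tokenizer.decode(outputs[0], skip_special_tokens=True)

    print(generate("Explain what Google Distributed Cloud is in one sentence."))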

AI experience in highly regulated environments

Having a compliant, standardised surface to run containerised models that fits a strong MLOps process is one of the key requirements to deliver the same AI experience to customers who work in highly regulated fields. Think of banking and healthcare, industries where data can't be moved to the public cloud.

This truly delivers on Google's promise of democratising AI, even for hybrid customers.

Of course, the list of announcements at Google Cloud Next 2024 was much longer, but we were most excited about these 3 novelties in the field of IT infrastructure and application modernisation: the embedding of generative AI assistants across Google Cloud solutions, the new Axion processor becoming a price competitor to on-prem systems while improving security and lowering the carbon footprint, and the AI Anywhere experience on Google Distributed Cloud.

More updates about IT infrastructure and app modernisation with Google Cloud?

Join our experts for an online session on May 28th!