The UK’s Lloyds Banking Group has started using Google Cloud and its Vertex AI platform to accelerate the development of its own AI solutions. Specifically, the bank aims to speed up the creation of a new machine learning (ML) and Generative AI (GenAI) platform built on Google Cloud.
According to the bank, the initiative has already transformed its ability to deploy impactful AI use cases at pace. So far, more than 300 of its data scientists and AI developers are using the platform.
The Group also revealed that it has migrated 15 modelling systems, comprising hundreds of individual models, from its on-premises infrastructure to the new platform as part of the initiative, cutting operational CO2 emissions by 27 tons.
New Use Cases And Capabilities Are Already Emerging
The migration has also unlocked new capabilities and tooling across key business areas, including an algorithm that cuts the income verification step in customers’ mortgage applications from days to just seconds.
The bank has launched more than 80 new machine learning use cases since deploying the new platform on Vertex AI. On top of that, it has put more than 18 GenAI systems into production across the business, and it expects another 12 GenAI systems to go live by the end of June 2025.
Work is also ongoing on a new agentic AI system for customer interactions. The bank collaborated with Google Cloud on a prototype and believes the product will be ready to launch later this year.
Lloyds’ group chief data and analytics officer, Ranil Boteju, said the move to Vertex AI has been transformative for the Group, giving it the reliability and scalability to innovate with AI at pace.
“Vertex AI is enabling data scientists and AI developers across the Group to access GenAI solutions with consistent guardrails, as well as giving them the flexibility to use Large Language Models from third parties and open-source providers, as well as Google’s Gemini model,” he added.
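For illustration, the sketch below shows how a developer might call Google’s Gemini model through the Vertex AI Python SDK, the kind of access Boteju describes. The project ID, region, model version, and prompt are hypothetical placeholders, not details of Lloyds’ actual deployment.

```python
# Illustrative sketch only: calling a Gemini model via the Vertex AI Python SDK.
# The project ID, region, model version, and prompt are assumed placeholders.
import vertexai
from vertexai.generative_models import GenerativeModel

# Initialise the SDK against a (hypothetical) Google Cloud project and region.
vertexai.init(project="example-project", location="europe-west2")

# Load a Gemini model hosted on Vertex AI.
model = GenerativeModel("gemini-1.5-pro")

# Send a prompt and print the generated text.
response = model.generate_content(
    "Summarise the documents a customer typically needs for income verification."
)
print(response.text)
```

The same SDK can also serve open-source and third-party models deployed to Vertex AI endpoints, which is the flexibility the quote refers to.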