AI: From exploration to production - five case studies on GenAI in action | Computer Weekly
The launch of ChatGPT from OpenAI in late 2022 thrust years of behind-the-scenes research on artificial intelligence (AI) into the limelight. Two years after the first publicly available generative AI (GenAI) tools surfaced, the excitement about this emerging technology shows no sign of abating, particularly in blue-chip boardrooms.
The rate at which companies deploy GenAI programmes doubled between December 2023 and July 2024, according to Bloomberg Intelligence. Its survey shows AI and machine learning are still top of mind for CIOs and their C-suite peers, with companies investing in foundational models, GPUs and supporting cloud services.
However, other research suggests turning excitement for AI into something useful is much harder. Companies struggle to move GenAI projects into production, according to Deloitte. The consulting giant says 70% of business leaders have moved 30% or fewer of their AI experiments into production.
Worse still, analyst Gartner has predicted that 30% of GenAI projects will be abandoned after the proof-of-concept stage by the end of 2025. So, why are organisations struggling to make the most of AI? We spoke to five business leaders at the recent Snowflake Summit 2024 in San Francisco to ask how executives can turn AI into a competitive advantage.
Anastasiia Stefanska, data analyst for analytics and AI at TUI, is helping her organisation exploit emerging technology. The holiday firm uses Snowflake to create a platform for data-led change.
“I would say we’re full speed ahead on using AI,” says Stefanska. “We believe we can achieve a lot. We have worked in the last few quarters on identifying use cases. Some of these cases are in production, and other high-potential use cases are being worked on.”
Stefanska’s colleague, Bastian Handke, technology team lead for the enterprise data warehouse platform at TUI, says the company uses generative AI for data analysis. The company also uses chatbots in training programmes. These bots speed up the onboarding process for new staff. TUI also uses Cortex AI, Snowflake’s large language model (LLM) service.
“There are many use cases,” says Handke. “We do sentiment analysis using the functions from Cortex, like translate. Our colleagues in other countries get answers from their customers in different languages. To analyse that data with functions in Cortex is super useful.”
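TUI’s pipeline is not published, but the pattern Handke describes can be sketched in outline: Snowflake’s Cortex functions TRANSLATE and SENTIMENT are available from SQL and can be called from Python. In the illustrative snippet below, the table and column names are hypothetical and the connection details are placeholders.

```python
# Minimal sketch: translate multilingual customer feedback and score its
# sentiment with Snowflake Cortex SQL functions, called from Python.
# Table and column names are hypothetical; connection details are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="<warehouse>",
    database="<database>",
    schema="<schema>",
)

SQL = """
SELECT
    feedback_id,
    language_code,
    SNOWFLAKE.CORTEX.TRANSLATE(feedback_text, language_code, 'en') AS feedback_en,
    SNOWFLAKE.CORTEX.SENTIMENT(feedback_text)                      AS sentiment  -- score in [-1, 1]
FROM customer_feedback
WHERE received_at >= DATEADD(day, -7, CURRENT_TIMESTAMP());
"""

for feedback_id, lang, text_en, sentiment in conn.cursor().execute(SQL):
    print(f"{feedback_id} [{lang}] sentiment={sentiment:+.2f}: {text_en[:80]}")

conn.close()
```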
Stefanska says TUI has created official documentation to guide its AI explorations. These papers incorporate two principles. First, there is a focus on enablement – the company wants employees to feel comfortable to suggest use cases. The second principle ensures humans stay in the loop.
“The decision is with the human,” she says. “The governance of the models and the executions are embedded into how we do things. At TUI, we understand that the chatbots are machines. We treat them with respect due to their accumulated knowledge, but we also lay a human eye on their processes.”
Sasha Jory, CIO at insurer Hastings Direct, says her firm uses AI in several areas, including underwriting and dealing with customer queries. She says the key to success is ensuring your data is managed and understandable.
“AI loves data,” she says. “You’ve got to get everything in a format AI can read, digest and use. For example, you’ve got to turn recordings into text so that AI can read the data and give answers. Ensuring collaboration between our machine learning, our AI and our data, in the right place and format, is critical for success.”
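The article does not detail Hastings’ Azure-powered pipeline, but the “recordings into text” step Jory mentions can be illustrated with a stand-in such as the open-source Whisper model. The file path below is hypothetical.

```python
# Minimal sketch of the "turn recordings into text" step Jory describes,
# using the open-source Whisper model as a stand-in (Hastings' actual
# pipeline is Azure-powered and not detailed in the article).
# pip install openai-whisper
import whisper

model = whisper.load_model("base")  # small, CPU-friendly checkpoint

# Hypothetical path to a recorded customer call.
result = model.transcribe("calls/2024-06-01_claim_0123.wav")

transcript = result["text"]
print(transcript[:500])

# The transcript can now sit alongside structured policy data so downstream
# models can "read, digest and use" it, as Jory puts it.
```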
Jory says Hastings uses Azure-powered machine learning in underwriting. The technology has helped the company refine its pricing and risk models. “The technology is in production and doing very well,” she says.
“We have seen an improvement in speed to market by more than 100%, the number of underwriting changes we are now able to make has more than trebled, and the route to go-live is fully automated straight-through processing, making releases intraday and simple to execute.”
Hastings also uses Azure and AI to help staff write customer complaint letters. Jory says the company has used the technology to boost the readability of those letters from a score of 50 out of 100 to about 70. What’s more, the score is increasing all the time.
“That’s giving us a good outcome for our customers because they’re receiving a letter they can read,” she says. “It’s simple and means there’s less backwards and forwards with customers trying to understand their issues.”
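Jory does not name the readability measure Hastings uses. One common 0-100 scale is Flesch Reading Ease, which can be computed with the textstat library; the toy before-and-after comparison below is purely illustrative.

```python
# Illustrative only: the article does not say which readability metric
# Hastings uses. Flesch Reading Ease is one common 0-100 scale where
# higher scores mean easier reading.
# pip install textstat
import textstat

original_letter = (
    "Pursuant to the aforementioned correspondence, we hereby acknowledge "
    "receipt of your complaint and will endeavour to furnish a resolution "
    "in due course."
)
redrafted_letter = (
    "Thank you for your complaint. We have received it and will send you "
    "our answer soon."
)

for label, letter in [("original", original_letter), ("redrafted", redrafted_letter)]:
    score = textstat.flesch_reading_ease(letter)
    print(f"{label}: {score:.0f} / 100")
```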
Jory says Hastings’ initial explorations into AI show the technology can be used to make the business smarter. “It’s certainly something that we’re looking at all the time, but we’re very cautious,” she says.
“We’re not biting on the hype. We don’t think AI is the panacea for everything. AI must be used in the right areas for performance improvement and enhancement, and the technology must be implemented carefully and safely.”
Gerard Francis, firm-wide product head for data and analytics at JP Morgan Chase and head of Fusion by JP Morgan, says his early explorations into AI suggest the technology will solve several business challenges across unstructured and structured data.
“The problems are better understood in the unstructured world, whether it’s techniques like retrieval-augmented generation (RAG) over documents,” he says. “We’re doing a range of things to improve the experience with unstructured data and the use cases you can deal with, from customer service questions to legal issues.”
Francis says exploiting structured data should be seen as “the next wave” of AI. In an industry such as finance, there’s a surfeit of structured data, and people require accurate answers. Companies must consider how LLMs can help people generate this insight.
“I think that evolution will only continue,” he says. “We’ve focused on semantically correct data from the beginning. Very, very few people have semantically consistent data in the industry. And if you don’t get the semantics right, the LLM can’t work with the data.”
Francis explains semantics in more detail. Employees might use different codes, such as “MATUR” or “MAT”, for an industry term like maturity. LLMs will struggle to understand what those codes mean without direction. Definitions can also overlap.
“The same name might mean three different things depending upon who that data comes from,” he says. “If you ask a question, the LLM might translate it using a description, but it might give an incorrect answer because it has a different interpretation of that word. JP Morgan works with many customers and lots of data, so we’ve created a common definition for all the business terms that somebody would use in financial markets.”
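JP Morgan’s semantic layer is proprietary, but the principle Francis outlines, resolving source-specific codes such as “MATUR” and “MAT” to one canonical business term before an LLM ever sees the data, can be sketched as follows. All mappings and field names here are invented for illustration.

```python
# Illustrative sketch of the principle Francis describes: normalise
# source-specific codes to one canonical business term before handing
# data to an LLM. All mappings and field names are hypothetical.
from typing import Dict

# Canonical glossary: every source-specific code resolves to one term
# with one agreed definition.
SEMANTIC_MAP: Dict[str, str] = {
    "MATUR": "maturity_date",
    "MAT": "maturity_date",
    "MTY": "maturity_date",
    "CPN": "coupon_rate",
    "COUPON": "coupon_rate",
}

def normalise_record(record: Dict[str, str]) -> Dict[str, str]:
    """Rename fields to canonical business terms; keep unknown fields as-is."""
    return {SEMANTIC_MAP.get(key.upper(), key): value for key, value in record.items()}

# Two feeds describing the same bond with different field codes...
feed_a = {"MATUR": "2031-06-15", "CPN": "4.25"}
feed_b = {"MAT": "2031-06-15", "COUPON": "4.25"}

# ...normalise to the same shape, so a prompt or query built on top of them
# always refers to "maturity_date" and "coupon_rate".
assert normalise_record(feed_a) == normalise_record(feed_b)
print(normalise_record(feed_a))
```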
This semantic consistency is producing great results. “It’s a big investment by JP Morgan to build this technology. We’ve hired some of the most talented people in the industry to make it work,” says Francis.
“We exist to make our clients happy and ensure we provide them with great service. For us, AI is about making that investment with an eye to the future and continuing to grow.”
Ulf Holmström, lead data scientist at Scania Group, says his business is keen to make the most of AI. His team ensures the company focuses on the right areas.
“We have explored GenAI, but it’s not in production yet,” he says. “We have started with internal support, rather than customer-facing, processes. However, what you can do with this technology is mind-blowing.”
Holmström says Scania is exploring GenAI through Amazon Web Services (AWS) technology. The company has been running its explorations for a year and has seen some “good results”. Scania is also interested in the technologies being developed by Snowflake, particularly Cortex.
“Based on my experience, if Snowflake delivers on its promises, then we can build generative applications faster and simpler,” he says. “I am impressed because we’ve been working hard with AWS Bedrock. Snowflake is putting another abstraction layer on top of the technology from the cloud vendors.”
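Scania’s applications are not shown in the article. As a rough indication of the kind of direct Bedrock call such an abstraction layer would wrap, the sketch below uses boto3’s Bedrock runtime; the model ID, region and prompt are placeholders.

```python
# Rough sketch of a direct Bedrock call of the sort a Cortex-style
# abstraction layer would wrap. Model ID, region and prompt are placeholders.
# pip install boto3
import boto3

client = boto3.client("bedrock-runtime", region_name="eu-west-1")

response = client.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model choice
    messages=[{
        "role": "user",
        "content": [{"text": "Summarise this internal support ticket: ..."}],
    }],
    inferenceConfig={"maxTokens": 256, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])
```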
Scania uses a data mesh built on Snowflake’s platform to generate insight-led services for customers. Today, the company has around 60 Snowflake accounts. Holmström is keen to explore how this foundation can create a trusted platform for AI.
“With our mesh, we know we can trust our data,” he says. “We can focus on solving business problems and let Snowflake and AWS deal with the tech stack. It’s almost like self-serve AI.”
Miguel Morgado, senior product owner for the performance hub at Eutelsat Group, says most of the satellite company’s AI projects are focused on outage prediction and root cause analysis. One of the key use cases is weather prediction.
“Weather events, like a cyclone, storm or heavy rain, can affect satellite dishes,” he says. “All the satellite operators have mitigations, such as increasing the signal power during heavy rain. Real-time weather data can help us predict when the weather will affect user terminals in each region and provide the best customer service.”
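Eutelsat’s models are not described in detail. The toy sketch below only illustrates the shape of the mitigation Morgado mentions, flagging regions where forecast rain is heavy enough to justify raising signal power; the threshold and figures are invented.

```python
# Toy illustration of the mitigation Morgado describes: use forecast
# rain rate per region to decide where signal power should be raised.
# The threshold and data are invented for illustration only.
from dataclasses import dataclass
from typing import List

@dataclass
class RegionForecast:
    region: str
    rain_rate_mm_per_hr: float  # forecast rainfall intensity

HEAVY_RAIN_THRESHOLD = 10.0  # mm/h, hypothetical trigger for rain-fade mitigation

def regions_needing_power_boost(forecasts: List[RegionForecast]) -> List[str]:
    """Return regions where forecast rain is likely to attenuate the signal."""
    return [f.region for f in forecasts if f.rain_rate_mm_per_hr >= HEAVY_RAIN_THRESHOLD]

forecasts = [
    RegionForecast("London", 14.0),
    RegionForecast("Madrid", 2.5),
    RegionForecast("Lagos", 22.0),
]
print(regions_needing_power_boost(forecasts))  # ['London', 'Lagos']
```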
Eutelsat gives its customers service guarantees. Data and AI can help the company ensure it meets service-level agreements. Morgado explains how his organisation feeds data sources into Snowflake and runs algorithms to generate insights.
“It’s all done by us in-house,” he says. “We don’t use ChatGPT, Mistral or anything else. We plan to use Cortex for certain use cases. We’ll work on that area during the next six months. We’re already doing a proof of concept. We are testing the waters.”
Morgado says his firm’s explorations into AI suggest challenges must be overcome. One of the key issues is ensuring the business has access to a knowledge graph, a collection of interlinked descriptions of entities that puts data into context and enables analytics and collaboration.
“The problem with AI is that it’s easy to say, ‘Tell me how my satellite is doing.’ And the system will say, ‘It’s in service.’ However, it’s difficult to ask, ‘How is our network performance in London?’ That question is difficult to answer because you need a knowledge graph on top of your data and AI.”
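The article does not show Eutelsat’s graph. As a minimal illustration of why a question such as “How is our network performance in London?” needs linked entities rather than a single table, the sketch below builds a tiny graph with networkx; all entities and figures are invented.

```python
# Tiny illustrative knowledge graph: satellites serve beams, beams cover
# regions, terminals attach to beams. Answering "how is our network
# performance in London?" means traversing these links rather than
# scanning one table. All entities and figures are invented.
# pip install networkx
import networkx as nx

g = nx.DiGraph()
g.add_edge("SAT-1", "BEAM-7", relation="serves")
g.add_edge("BEAM-7", "London", relation="covers")
g.add_edge("TERM-001", "BEAM-7", relation="attached_to")
g.add_edge("TERM-002", "BEAM-7", relation="attached_to")
g.nodes["TERM-001"]["availability_pct"] = 99.2
g.nodes["TERM-002"]["availability_pct"] = 97.8

def region_performance(graph: nx.DiGraph, region: str) -> float:
    """Average terminal availability for a region, found by following graph links."""
    beams = [b for b, r in graph.in_edges(region)
             if graph.edges[b, r]["relation"] == "covers"]
    terminals = [t for b in beams for t, _ in graph.in_edges(b)
                 if graph.edges[t, b]["relation"] == "attached_to"]
    values = [graph.nodes[t]["availability_pct"] for t in terminals
              if "availability_pct" in graph.nodes[t]]
    return sum(values) / len(values) if values else 0.0

print(f"London availability: {region_performance(g, 'London'):.1f}%")
```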
Morgado says Eutelsat is working with RelationalAI to explore how its knowledge graph runs on the Snowflake platform: “One of the big challenges for us is how we can use AI and knowledge graphs – and that is what we are testing.”
https://www.computerweekly.com/feature/AI-From-exploration-to-production-five-case-studies-on-GenAI-in-action