It was a day well spent at the Real Deals Tech Innovation Conference, where a dominant theme was Generative AI (GenAI). Nearly 18 months after the launch of ChatGPT, and the wave of excitement that followed, I’ve been reflecting on the state of play with GenAI and our learnings about what makes a successful implementation.
Passing the peak of the hype cycle
While GenAI spending is projected to double in 2024, over a hundred new GenAI vendors are expected to be formed, and the capabilities of Large Language Models (LLMs) are continuing to evolve with developments like OpenAI’s Sora video generation, it does feel as though the initial frenzy has subsided a little, with businesses taking a more realistic view of LLMs’ strengths and limitations. Businesses are recognising GenAI as a valuable tool within a broader digital transformation strategy, rather than a silver bullet that will fix all problems for all people. We have moved past the point where businesses felt compelled to use AI because everyone else seemed to be doing it, without always having a clear idea of the problems they were trying to solve. While the talk around GenAI has quietened, adoption has continued to increase, helped by the launch of Microsoft Copilot, despite some teething issues with its initial release.
Keys to successful implementation
As more GenAI projects reach completion, the success factors are becoming clearer. Here are 5 key steps to successful implementation:
1. People. There are two key considerations around people. The first is having high-quality data talent in the business, which is increasingly a differentiator given the scarcity of these people. They bring the ability to continuously develop and deploy new models as business requirements change. The second is leadership. As with any change programme, leadership buy-in and sponsorship are key to driving the right culture and behaviours throughout an organisation.
2. Prompt engineering. A lot of the value in LLMs lies in the quality of the prompt. Upskilling teams through training or prompting guides is an important step towards driving up adoption and the value delivered (a brief sketch of what a structured prompt can look like follows this list). Given that in any firm you can expect at least pockets of individuals, particularly from younger generations, to be using GenAI in their day-to-day roles, company-wide training is often a sensible approach.
3. Mechanism for extracting value from insight. To make sure value is extracted from LLMs, their implementation should be treated like any other change programme, with thought given to winning hearts and minds, training, process changes and measurable success criteria. Outputs from LLMs need to provide more than just improved individual efficiency or interesting insights – deployments of these models need to be supported by fundamental changes to business operations to ensure value is being realised.
Leaving individuals to experiment with LLMs is often a good way to identify new use cases, but there needs to be a process to collate and centralise the ones that work well, so that impact is delivered across the whole organisation.
4. The right tool for the use case. Understanding the specific use case is important in choosing the right tool for the job. Are you looking to generate content? To automate repetitive admin tasks? For support with decision making? Or for insights based on business data? The tool you would use for each of these is different, and for most of them GenAI is unlikely to provide the best solution. If you do use an LLM, there are further considerations, such as whether you deploy an out-of-the-box model or train a bespoke one, which will again be driven by the use case.
5. Data assets. Finally, the data in the business needs to be accurate and accessible for the models to generate useful outputs. Rubbish in, rubbish out. In addition to this, protecting your data is an increasingly important consideration – both internal data usage policies and agreements with third parties that have access to your data are key.
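To make the point about prompt quality concrete, here is a minimal sketch of the kind of example a prompting guide might include, using the OpenAI Python client. The model name, prompt wording and customer-feedback use case are illustrative assumptions rather than anything specific to the tools mentioned in this piece.

```python
# A minimal sketch: the same request phrased as a vague prompt and as a
# structured prompt. Model name, prompt wording and use case are
# illustrative assumptions only.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY to be set in the environment

VAGUE_PROMPT = "Summarise this customer feedback: {feedback_text}"

STRUCTURED_PROMPT = (
    "You are an analyst preparing a board pack.\n"
    "Summarise the customer feedback below in three bullet points, each under "
    "20 words, ordered by commercial impact, and flag any churn risk explicitly.\n\n"
    "Feedback: {feedback_text}"
)

def summarise(prompt_template: str, feedback_text: str) -> str:
    """Send a single prompt to the model and return its reply."""
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model choice
        messages=[
            {"role": "user", "content": prompt_template.format(feedback_text=feedback_text)}
        ],
    )
    return response.choices[0].message.content

# Comparing the two outputs side by side is a simple exercise for an
# internal prompting guide or training session:
# print(summarise(VAGUE_PROMPT, feedback))
# print(summarise(STRUCTURED_PROMPT, feedback))
```

The structured version does nothing clever technically; it simply spells out the role, format and priorities, which is where much of the quality difference tends to come from.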
ECI’s approach to GenAI
At ECI we’re taking a proactive approach to GenAI, both internally and for our portfolio. Internally, we have developed and integrated an LLM into our origination tool, Amplifind, and have deployed Microsoft Copilot across all teams in the business. For our portfolio, we are hosting a Digital Summit next month, bringing together Technology, Data and Finance leaders from across the portfolio to share their experiences and learnings, with a focus on driving value from LLMs. I’m very much looking forward to it.