April 7, 2023
Building the Stack: AI for Competitive Advantage
AI adoption just reached an all-time high among financial firms. Funds are doing everything from building their own models to testing solutions up and down the stack. What does this stack look like, and where should you start? We talked to dozens of investors and solution providers to find out. Learn more in our latest blog post below.

Today’s landscape of AI tools for enterprises is vast and ever-changing. It can feel like new solutions for every imaginable industry, profession, and use case pop up every week. It’s almost certainly not worth keeping track of which model developer is winning the horse race. While some managers are eager to dive in and get their hands dirty, others continue to take more of a “wait-and-see” approach, despite the trends we wrote about in our last article.

But when it comes to the use of AI in finance professions, there is little disagreement that firms will need to adopt and invest in ML-driven processes sooner or later.

In terms of where in the stack this adoption will occur, we have heard varying responses depending on the type and size of firm we speak with. The AI “tech stack” as we think about it is relatively similar to a framework recently outlined by Andreessen Horowitz. Shown below, it spans the base layers of chips and cloud platforms powering AI computation all the way up to the applications used by, in our case, investors.

Source: a16z Enterprise

For the most part, investors tell us their initial foray into applying AI to their work will happen in the uppermost application layer.

When financial firms use these apps, it generally means purchasing existing out-of-the-box software, web-based point solutions, or horizontal chatbots and knowledge search engines. This is especially true for smaller and mid-sized companies: over half of the deal teams we survey tell us they already consult ChatGPT or Google’s Gemini somewhat regularly for industry research, source exploration, or quick fact-checks.

Teams at funds of a certain size, often “upper middle market” (firms focusing on deals of around $100M USD) or larger, mention stricter company policies limiting chatbot use in certain instances. Instead, they tell us, greater investments are being made in firm-wide initiatives: working with providers lower in the stack, often in the model and cloud layers, to combine existing data science and tool-development capabilities with proprietary data and knowledge libraries.

More sophisticated firms, especially “megacap funds” and others within the upper market, are already co-investing and co-developing their own applications. Apollo, for example, is working with VC and venture studio creator 25madison to deploy AI across internal deal teams and portfolio companies. Vista Equity Partners has partnered with Microsoft to focus its most recent global hackathon on AI, inviting developers from across its portfolio and “ecosystem” to identify new use cases with the firm’s software portfolio.

The extent to which these solutions are fine-tuned with in-house knowledge, infrastructure and processes will depend on the importance of AI to that particular firm.

Only the most advanced firms will invest in the model layer, though large, tech-forward teams may see gains from building or fine-tuning smaller, local models that give them a competitive moat. Among the most high-profile such projects, somewhat adjacent to the direct investing industry, is McKinsey’s “Lilli”, a genAI-powered knowledge management tool.

As we look two to three years into the future, most financial executives and generative AI providers seem to agree that the reality for many firms will be a patchwork of off-the-shelf and moderately customized AI applications. These systems could provide benefits ranging from accelerated research and due diligence to operational oversight. Of course, every customer will need to determine the right mode of adoption for their strategy, given the capabilities and IP proprietary to their firm.

On the surface, it may seem that the applications currently available to smaller funds offer little long-term advantage beyond productivity gains. Our view is that this is not the case. Resource constraints are only one factor behind lower- and lower-middle-market firms preferring ready-to-use applications while larger funds partner with model providers to build their own solutions. It’s true that resources and training data (mainly for fine-tuning and reinforcement learning) are more plentiful at larger funds, which are bound to use their scale to their advantage anyway.

But more specialized firms have advantages too, and trends are moving in the right direction to enable adoption of extremely high-value AI solutions even as end-user applications. Here is how we see this playing out:

1. First, the analytical capability of applications is going to improve dramatically

  • Currently, new applications tend to focus on content summarization (e.g., paraphrasing an earnings call) rather than knowledge generation (e.g., developing an investment thesis for a buyout versus a minority stake)
  • However, a new subset of solutions is pushing the boundaries of what today’s LLMs can do, combining RAG-based systems with proprietary datasets and unique analysis methods tailored to a slice of the market
  • Importantly, this will require…

2. Harnessing customization around a specialist firm’s unique set of differentiators

  • We believe that combining new reasoning capabilities with the ability to customize the analysis to the decision-making strategy of the firm will be the true value unlock for smaller firms adopting end-user applications
  • Larger funds with multiple sectors and broader approaches to the market may have a harder time partnering with an AI provider that can help build a firm-specific advantage

3. The ability to stay secure and private in the cloud is improving

  • Precision and data security are still the top two priorities fintech buyers mention when considering a purchase
  • However, with cloud security improving across the board, larger firms’ advantages from building on-premise systems may be eroding as well
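
To make the RAG pattern mentioned in point 1 concrete, here is a minimal, self-contained sketch of retrieval-augmented generation: a toy bag-of-words retriever ranks documents from an in-memory corpus by cosine similarity to the query, and the top matches are packed into a prompt for a language model. All names, the corpus, and the scoring scheme are illustrative assumptions; production systems would use real embeddings and a vector store.

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy "embedding": a bag-of-words term-frequency vector.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    # Cosine similarity between two sparse term-frequency vectors.
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = math.sqrt(sum(v * v for v in a.values())) * \
          math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def retrieve(query, corpus, k=2):
    # Return the k documents most similar to the query.
    q = embed(query)
    return sorted(corpus, key=lambda doc: cosine(q, embed(doc)), reverse=True)[:k]

def build_prompt(query, corpus):
    # Assemble retrieved context plus the question into one prompt
    # that would be sent to an LLM (the generation step is omitted).
    context = "\n".join(f"- {doc}" for doc in retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}"

corpus = [
    "Q3 earnings call: revenue grew 12 percent year over year.",
    "The fund's buyout thesis centers on software roll-ups.",
    "Office lease renewal terms for the Chicago location.",
]

print(build_prompt("What did the earnings call say about revenue?", corpus))
```

The value unlock described above comes from swapping the toy corpus for a firm's proprietary datasets and tailoring the retrieval and prompting logic to its niche; the overall retrieve-then-generate shape stays the same.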

Finally, it's worth emphasizing that not all adoption of end-user applications is equal. As we have written about before, securing early-stage partnerships at this emerging phase of the AI market can create long-term benefits that managers who "wait-and-see" may not capture. At Keye, for example, we are currently tailoring our solution specifically to the needs of our early partners. However, this may not be an approach we can take for all clients as we scale.

So while it is important to think about where in the stack to adopt AI solutions, there is no one-size-fits-all answer. Even niche funds that do not make large investments in building their own models have an opportunity to build real advantages. How should more specialized firms go about identifying the right approach? We will cover that in our next post.

