By Abraham Thomas | 15 March 2024
At Merak, we are thesis-driven investors. We are generalists and invest across a broad range of industries, business models and technology stacks. But we also believe that fortune favours the prepared mind. Therefore, we are constantly creating and evolving frameworks through which to understand the startups we invest in, and the markets they operate in. In this article, we describe the third of our investing frameworks for SaaS: the Emerging AI Stack.
Every few years, the world of technology goes through a platform shift: the emergence of new capabilities or business models that have a transcendent impact on society. These platform shifts compel every company in every industry to utterly reinvent the way they operate; some seemingly indestructible incumbents fall by the wayside, while new upstarts go from zero to world domination in the blink of an eye.
Past platform shifts have included the emergence of personal computing in the 1980s (Apple, Microsoft), the internet in the early 90s (Netscape, Amazon, eBay), cloud computing in the early 2000s (Salesforce, AWS, Google), smartphones in the late 2000s (Apple, Samsung) and social networks in the 2010s (Meta, X, TikTok). But platform shifts go back much further than that: the automobile plus mass manufacturing drove one big shift in the early 20th century; electrification drove another in the late 19th century. Some of the greatest entrepreneurs in history rode the momentum of platform shifts that in many cases they themselves created: from Thomas Edison and Nikola Tesla, through Henry Ford, Bill Gates and Steve Jobs, to Larry Page, Sergey Brin, Jeff Bezos and Mark Zuckerberg.
We’re currently in the midst of a major platform shift: the emergence of AI. As with all platform shifts, AI (more specifically, language and image models like GPT and Sora) unlocks technological capabilities that seemed science-fictional just a few years ago; companies everywhere are scrambling to incorporate AI tools into both their customer-facing products and their internal workflows; and the pace of change is faster than many observers can keep up with.
For tech founders and operators, as well as for the venture investors who back them, this presents both a challenge and an opportunity. The opportunity is obvious: the possibility of reinventing entire industries or building the infrastructure that powers a new age. The challenge is that AI is progressing so fast that it’s hard to know what that reinvention will look like; where competitive advantage resides; and who will capture the economic value of these manifold breakthroughs.
At Merak, one of our guiding principles is that fortune favours the prepared mind. We can’t predict exactly how the AI revolution will play out, but we have developed a rigorous framework for thinking about developments in this space – and the companies that play in it.
We believe that the broad universe of companies and technologies that fall under the umbrella of AI can be separated into distinct “layers” that we call the emerging AI stack. We examine each of these layers below:
The bottom-most layer is physical infrastructure that is designed from the ground up for AI use cases. (Nvidia’s AI-optimised GPUs are the perfect example of this.) Computation, energy and storage are the three key pieces of physical infrastructure for AI; there’s also an opportunity to build service wrappers around these (the AI analogues of AWS, Azure and GCP). As investors, we like this layer for two reasons. First, we think the economics for winners in the hardware layer will be strong irrespective of how the rest of the AI stack plays out. And second, we believe most VCs – especially in India – lack the ability to identify, evaluate or successfully invest in this type of hardware play. We, on the other hand, have the skills, experience and track record to do so, and indeed have been doing so for several years.
The next layer is that of foundational models. Almost all AI applications today are built on top of a handful of AI cores: GPT, Llama, Claude, Gemini, Cohere, Midjourney, Sora, DALL-E, Stable Diffusion and so on. These base models are enormously expensive to train, requiring massive amounts of compute and data, which makes them largely the preserve of incumbents like Meta and Alphabet, or hyper-funded startups like OpenAI and Anthropic. Rather than compete with those capital-intensive players, we take the opposite tack: we’re interested in low-cost disruptors at this level, most notably open-source and university research, but also small teams that punch above their weight, like Midjourney.
While some users may access foundational models directly, most high-value use cases will involve custom applications built on top of these models. This implies the next few layers of the AI stack.
The software infrastructure layer comprises the horizontal tooling, common across a wide range of applications, that enables those applications to use foundational models effectively, manage their data assets and training, and monitor their AI-based processes. This category includes AI orchestration and “glue” technologies, AI-optimised primitives like vector databases, and tools for AI-specific tasks like retrieval-augmented generation (RAG), fine-tuning and feedback loops. The economic outcomes for winners in this layer will be huge; in the most optimistic scenario, this is a complete rewrite of the world’s codebase. Naturally, this makes it an extremely competitive space (or set of spaces), and we therefore seek startups with innate or structural competitive advantages: exceptional teams, execution and distribution.
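To make this concrete, here is a minimal, hypothetical sketch of the kind of “glue” code this layer abstracts away: a toy retrieval-augmented generation loop in Python. The embed and generate functions and the in-memory list are stand-ins of our own invention for an embedding model, a foundational model API and a vector database; real orchestration frameworks and vector databases exist precisely to replace this scaffolding at production scale.

```python
# Toy RAG loop: index documents, retrieve the most relevant ones for a
# question, and stuff them into the prompt sent to a foundational model.
# embed() and generate() are HYPOTHETICAL stand-ins, not real APIs.
import hashlib
import numpy as np

def embed(text: str) -> np.ndarray:
    """Hypothetical embedding model: deterministic pseudo-embeddings."""
    seed = int.from_bytes(hashlib.sha256(text.encode()).digest()[:4], "big")
    return np.random.default_rng(seed).standard_normal(384)

def generate(prompt: str) -> str:
    """Hypothetical foundational-model call."""
    return f"[model answer grounded in a prompt of {len(prompt)} characters]"

documents = [
    "Invoice processing workflow for the finance team.",
    "Customer churn analysis from the Q3 support tickets.",
    "Onboarding checklist for new enterprise accounts.",
]
index = [(doc, embed(doc)) for doc in documents]  # toy "vector database"

def answer(question: str, top_k: int = 2) -> str:
    q = embed(question)
    # Rank stored documents by cosine similarity to the question.
    scored = sorted(
        index,
        key=lambda item: float(
            np.dot(q, item[1]) / (np.linalg.norm(q) * np.linalg.norm(item[1]))
        ),
        reverse=True,
    )
    context = "\n".join(doc for doc, _ in scored[:top_k])
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return generate(prompt)

print(answer("How do we onboard a new enterprise customer?"))
```

Even in this toy form, the pattern shows why the tooling layer matters: every application has to index its data, retrieve the right context and assemble prompts reliably, and doing that well across millions of documents and requests is exactly what the winners in this layer will sell.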
Sitting parallel to the software tooling layer is the data infrastructure layer. In addition to the planet-scale datasets that form the baseline training corpus for foundational models, it seems likely that specific use cases will require fine-tuning on application-specific datasets. This will benefit firms that own such data assets. We therefore actively seek firms that generate or capture data as part of their value-add, typically via a system of record or a data-learning loop. We believe that such firms will retain their market leverage and economics even in an AI-first world. We are also on the lookout for ‘data enablers’: software or platform solutions that help companies effectively manage and capture value from their own or third-party data.
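As a hypothetical illustration of what a data-learning loop can look like in practice, the sketch below captures every user interaction as a labelled example and periodically exports the accepted ones as an application-specific fine-tuning set. The Interaction record, the record and export_finetuning_set helpers and the JSON format are illustrative assumptions rather than a prescribed implementation; the point is that the proprietary dataset compounds simply because the product is being used.

```python
# Hypothetical data-learning loop: every interaction becomes a training
# example, so the application's proprietary dataset grows with usage.
import json
from dataclasses import dataclass, asdict
from typing import List

@dataclass
class Interaction:
    user_input: str
    model_output: str
    user_accepted: bool  # implicit feedback: did the user keep the suggestion?

captured: List[Interaction] = []

def record(user_input: str, model_output: str, user_accepted: bool) -> None:
    """Capture an interaction as a side effect of the normal product workflow."""
    captured.append(Interaction(user_input, model_output, user_accepted))

def export_finetuning_set(path: str) -> int:
    """Export accepted interactions as an application-specific fine-tuning set."""
    examples = [asdict(i) for i in captured if i.user_accepted]
    with open(path, "w") as f:
        json.dump(examples, f, indent=2)
    return len(examples)

# Usage: the more the product is used, the larger (and more defensible) the dataset.
record("Draft a renewal email for Acme Corp", "Hi Acme team, ...", user_accepted=True)
record("Summarise this contract clause", "The clause limits liability to ...", user_accepted=False)
print(export_finetuning_set("finetune_examples.json"))  # -> 1
```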
The final layer in the AI stack is that of end-user applications. Many of these take the form of “thin wrappers”: a simple UI or UX on top of a foundational model. We’re not fans of such startups; we feel they don’t add enough long-term, sustainable value and can easily be displaced. Instead, we like “thick wrappers”: applications that offer a great deal of value-add on top of their basic AI capabilities. Often this value-add is highly workflow-specific or industry-specific, incorporating the nuance and detail that solve customer problems in a very sticky way and get ever better over time.
A common feature of all these layers and approaches is that exceptional founders matter. There’s no substitute for expertise – whether it’s the technological expertise that goes into building core infrastructure, or the domain expertise that goes into building rich and powerful vertical applications, or the user expertise that goes into building products that delight customers. Couple that with rapid learning, ambitious execution, and a single-minded focus on solving problems that matter, and the sky's the limit. Platform shifts change the world!