August 11, 2023

Slicing Through AI Innovation: The Pizza and Algorithm Connection

Shyam Nallasenapathy

I write this to you knowing it can come across either as a silly comparison or as a useful way of thinking about moats in commoditized products, which is where I believe the AI race is headed. Whatever it may be, I hope you enjoy reading this. From my brain to yours.

We are witnessing a multitude of LLMs taking the stage, with billions of dollars being poured into them and a host of technical architectures taking shape to support them. For those of you who don't know what an LLM is: in simple words, it's a knowledge bank that can answer questions about all the books and documents it has read and the images it has seen.

The boundaries of expertise in one subject don't apply to LLMs, given that they can digest far more information than a normal human mind.

The above are the stacks that we will use to make our case. Yes, you are also seeing a pizza stack, and yes, it is my favorite food. And yes, we are going to compare LLMs with a pizza. The idea is not to explain every part but to give you a sense of how AI application layers differ from normal software. It's going to be amalgamations of algorithms competing with one another, rather than software offering more features for a lower price.

Breaking it down into four parts:

What is the difference between the tech stacks of an app (consider a chat app) if all of them are solving the same use case?

Let me answer this with pizzas. The base, the tomatoes, the cheese - all of them are critical to how a pizza is made and tastes. These elements are available across the globe in different forms. However, one knows that a pizza from Naples is very different from one in New York or a deep dish in Chicago. The underlying commonality is that the elements in the pizza are shared, but the way they are made and where the raw materials are sourced from is where the true differentiation and competitive advantage come into play.

Similarly, in an AI app, the use of an LLM, embedding support, cloud usage, and balancer setup will be common, but how these components interact with each other and intertwine to address the use case will be the most significant differentiating factor. There are multiple factors that go into an AI app algorithm; prompts are just the surface layer, which seems to have taken the main stage. You also need a data preprocessing layer, a feedback layer, UX to support the bot, and an efficiency layer in terms of both UX and cost. All of these will matter for an AI app, and every single one of these layers can be built differently. In my mind, the number of possible combinations is effectively infinite at this point. One can use OpenAI, Anthropic, Hakuna, or any other new model that comes into the fray, but they would have just tested the dough, not the pizza itself.

Why is the moat in the last layer (application layer), and how will companies build their layers?

Simply put, when you crave pizza, you want all the layers cooked to perfection. And once you enjoy the pizza, you'll keep coming back for more, and not solely because of a single ingredient. In the AI race, the application layer is where most of the moat sits. Let me explain why: once users experience what the app is capable of and how effectively it addresses their needs, there is a high chance that there won't be a similar recipe out there, as the intricate details operate in the shadows.

Another critical aspect is that only the application layer has access to structured feedback loops in the setup, not the LLMs or the clouds in play. This, in turn, helps fine-tune the underlying algorithm so much that the AI app gains a profound understanding of the data. As a result, any other app that comes along would face a significant learning curve and often find it not worthwhile to compete from that standpoint. Thus, the application layer becomes the stronghold, providing a competitive advantage and differentiation for companies building their AI applications.

How much power does the LLM have in this setup?

The LLM is a highly critical component, as it forms the backbone of the AI setup. The entire AI system relies on the intelligence of LLMs to perform its tasks. However, every AI app should be designed with an architecture that allows for an easy switch to different LLMs based on the specific use case. While LLMs can provide precise information, how that information is utilized is what truly matters.
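One way to get that easy switch is to have the app depend on a small interface rather than on any one vendor. The sketch below is an assumption about how such an architecture might look; the provider classes are hypothetical and do not call any real SDK.

```python
class LLMProvider:
    """Minimal interface the app codes against, instead of a specific vendor."""
    def complete(self, prompt: str) -> str:
        raise NotImplementedError

class ProviderA(LLMProvider):
    """Hypothetical stand-in for one model vendor."""
    def complete(self, prompt):
        return f"A says: {prompt}"

class ProviderB(LLMProvider):
    """Hypothetical stand-in for another vendor."""
    def complete(self, prompt):
        return f"B says: {prompt}"

def run_app(provider: LLMProvider, user_input: str) -> str:
    # The app depends only on the interface, so swapping models
    # for a different use case is a one-line change at the call site.
    return provider.complete(user_input)

print(run_app(ProviderA(), "hello"))
print(run_app(ProviderB(), "hello"))
```

Under this design, choosing the right LLM for a healthcare use case versus an entertainment one becomes a configuration decision rather than a rewrite.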

To build upon an LLM successfully, precision, speed, and context awareness are key. Although English might be the new coding language, the inherent subjectiveness of English and how applications handle this subjectiveness will determine the success or failure of the AI app.

In some scenarios, applications might even use various LLMs within the same system, but managing the complexity of such an approach will be a challenge. As Bryant H. McGill aptly said, "It is better to have a fair intellect that is well used than a powerful one that is idle." The same holds true for LLMs; the true value lies not in the number of data points they are trained on, but in how effectively they are put to use. For instance, a healthcare use case will not benefit from an LLM that has been trained on entertainment data, and vice versa. Selecting the right LLM for the specific task at hand will be critical for the success of AI applications.

How will the market look, will it be algorithms at play hereafter and not just feature sets?

It will be a war of algorithms against one another; features will, of course, play a part. Whose recipe is the best and can stay consistent across seasons (verticals) will have a higher probability of winning. It will be a long game, with more iterations, feedback loops, and numbers taking the stage.

In the application layer, the team that can be fast and nimble will likely win and take the lead. The prompts in English might have to be tweaked for every use case, or even for every client in a SaaS setup.

Moats will be in the data layer, while prompts will be protected like Coke’s formula, and IP will lie in the process rather than just in the prompts.

We are on the verge of a new business domain: AI as a service, which will be somewhat different from the typical SaaS business or a typical consumer business. While we all watch AI penetrate our lives, it surely leaves a lasting impression and provides great scope for new innovations to be built on top. Exciting times lie ahead, and it's time for algorithms to take center stage. Pizza in hand, let the silicon wars begin!
