
Commoditize your complement: Meta AI edition

Now, if you’re a student of the startup world, you’ve probably heard the phrase “commoditize your complement” thrown around in discussions of technology business strategy. Joel Spolsky wrote a famous blog post about the idea as it relates to open source, though I first came across it in Carl Shapiro and Hal Varian’s book Information Rules: A Strategic Guide to the Network Economy. Shapiro and Varian’s book is a bit too dense for a friendly neighborhood newsletter like ours, however, so let me quote Spolsky instead:

Every product in the marketplace has substitutes and complements. A substitute is another product you might buy if the first product is too expensive. Chicken is a substitute for beef. If you’re a chicken farmer and the price of beef goes up, the people will want more chicken, and you will sell more.
…A complement is a product that you usually buy together with another product. Gas and cars are complements. Computer hardware is a classic complement of computer operating systems…
All else being equal, demand for a product increases when the prices of its complements decrease.

Spolsky unearths a bunch of relevant examples from technology history as well. For example, he tells the story of how IBM designed the architecture for the PC using off-the-shelf parts (instead of custom parts) and then documented everything in the IBM-PC Technical Reference Manual to create a standard and commoditize the add-in market (which is a complement to the PC market).

This is all ancient history now. What’s the point? Well, this exact dynamic has continued up the stack and is relevant to this day:

  • Microsoft commoditized the PC market by licensing their operating system to all those PC manufacturers that IBM enabled. As PCs themselves became a commodity, the demand for the operating system skyrocketed.
  • Then the operating system market became competitive, with Apple (macOS) and Google (Chrome OS) joining the fray. Microsoft stayed in the lead for a long time until the internet (and, in particular, the browser) commoditized the operating system itself. Who cares about the OS when you’re just hanging out in the browser? 
  • Mobile was a different story. Smartphones were a brand new platform opportunity. Apple got an early lead by taking a highly integrated approach (owning both the hardware platform and the software that ran on it) which enabled them to build a far superior user experience. In response, Google released the Android mobile operating system for free, creating a competing worldwide standard in the process to commoditize the mobile device/OS layer entirely and drive more people online. (More people online means more people searching which means more ad revenue for Google.)

On desktop, the internet used to be just links and information, but in the 2010s, the internet became all about applications. And cloud applications themselves commoditized much of the technology underneath, largely due to improvements like WebGL, which enabled desktop-like application experiences in the browser. (You can see how this played out with companies like Figma eating Adobe’s lunch.)

But mobile has been stuck. There’s been no OS commoditization like on desktop, because mobile applications aren’t accessed via the browser; they’re accessed via apps. Those apps have to be approved by the app store, which is governed by the owner of the OS. And who owns the mobile OS? Apple (iOS) and Google (Android).

All of which leads me to Meta’s announcement on Thursday of their latest open source large language model: Llama 3.

Llama 3 8B and 70B are both available and surprisingly, stunningly good…beating out both open source and paid models of similar sizes. Llama 3 400B is still training, but early indications suggest it will likely be GPT-4 level or better.
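If you want to poke at the smaller models yourself, the weights are already up on Hugging Face. Here’s a minimal sketch of what that looks like with the transformers library; the model ID, the GPU/bf16 setup, and the chat-template usage are my own illustration of the typical flow, not anything from Meta’s announcement, so treat it as a starting point rather than a recipe.

# Minimal sketch: trying Llama 3 8B Instruct locally via Hugging Face transformers.
# Assumes you've accepted Meta's license on the gated model page and have the
# transformers and accelerate packages plus a GPU with roughly 16 GB of memory.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Meta-Llama-3-8B-Instruct",  # gated repo; license acceptance required
    torch_dtype=torch.bfloat16,
    device_map="auto",  # lets accelerate place the weights
)

messages = [
    {"role": "user", "content": "Explain 'commoditize your complement' in one sentence."},
]

# Build a prompt from the model's chat template, then generate only the new text.
prompt = generator.tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
output = generator(prompt, max_new_tokens=128, return_full_text=False)
print(output[0]["generated_text"])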

Why is Meta investing so heavily in these models and then just open sourcing them?

It’s useful to remember that Meta is not only an application company but, at this point, more or less exclusively a mobile application company.

And that’s why, according to Mark Zuckerberg (in this conversation with Dwarkesh Patel), Meta’s throwing billions of dollars at generative AI models it plans to release for free: Apple and Google won the mobile ecosystem and act as gatekeepers, and he doesn’t want that same dynamic to play out with AI.

That is part of it, I’m sure. (And who would deny Zuck the opportunity to stick it to Apple?) But I’m not so sure I believe him.

From my vantage point, generative AI isn’t a new platform opportunity like mobile. And I don’t think Zuck believes it is, either. If it were, he’d be investing billions of dollars to own that platform himself (just like he did with the Oculus and the “metaverse”).

But generative AI is the perfect complement to Meta’s family of apps…apps that are built around connecting with other people on the one hand, and creating content, following content creators, and sharing that content with the world on the other.

What Zuck has realized is that Meta’s products will be much more enjoyable (and more valuable) with generative AI sprinkled throughout. And he doesn’t need to own the model for that to be true. So if he can drive down the cost of inference, Meta benefits big. And he’d rather be the one that sets the global standard, because, as we’ve learned, that’s what market leaders do to their complements.

The last thing I’ll note is that Meta’s open source strategy is perhaps even more disruptive than any of us yet realize. How many companies focused on generative AI security, safety, and evaluation will go up in smoke now? OpenAI may still own a vibrant customer-facing product moving forward (“ChatGPT” does seem to have staying power), but who will pay for their models?

What is even the business model for a company like Mistral moving forward? 

As Naveen Rao, VP of Generative AI at Databricks, tweeted soon after Llama 3 was released: