Some rough thoughts on investing in generative AI

The topic du jour this week, last week, and every week of 2023 so far has been AI (see NFX’s ever-growing market map here). Large Language Models have captured the imagination of investors and entrepreneurs alike. As I’ve continued to dive into the space - meeting with founders, tinkering with the tools, and deepening my understanding of the technology - I’ve realized that it has more potential than previous hype cycles. However, I am also reminding myself to stay sober when assessing new investments.

How ChatGPT works

To make sense of it, I’ve sought to understand both the theoretical (reading) and the practical (meeting various entrepreneurs in the space). On the former, I suggest Stephen Wolfram’s deep dive on ChatGPT. To summarize it briefly:

  • ChatGPT aims to continue a text by repeatedly asking itself what the next most reasonable word(s) should be, given the text so far. These probability estimates come from training the model on billions of articles online.
  • There are roughly 40,000 commonly used words in English. By looking at a large corpus of English text (e.g. a few million books), we can estimate how common each word is. From there we could generate "sentences" where each word is independently picked at random with the same probability with which it appears in the corpus.
  • To make the output sound even more humanlike, you can use the probabilities of certain words coming together, not just of individual words (e.g. after the word "cat", a phrase like "food is smelly" often follows). ChatGPT comes up with estimates of the probabilities of whole sequences of words.
  • However, it’s not just about always picking the highest-probability word. Favoring higher-probability words makes the text sound more authoritative - useful for something like a summary of an article. Sampling lower-probability words can make a text funnier - say, a poem about Bitcoin written by Donald Trump. This ability to mimic various types of human output is why the technology has appeared so magical to many.
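The word-by-word sampling described above can be sketched with a toy bigram model. This is my own illustrative code, not ChatGPT’s actual Transformer internals (and the tiny corpus below is a stand-in for the "few million books"); the `temperature` knob mirrors the trade-off between authoritative high-probability picks and more surprising low-probability ones:

```python
import random
from collections import defaultdict

# Toy corpus standing in for "a few million books"
corpus = ("the cat food is smelly the cat sat on the mat "
          "the dog sat on the rug").split()

# Count bigram frequencies: how often each word follows each other word
bigrams = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def next_word(prev, temperature=1.0, rng=random):
    """Sample the next word given the previous one.

    temperature < 1 sharpens the distribution (safer, more 'authoritative'
    picks); temperature > 1 flattens it (more surprising picks).
    """
    counts = bigrams[prev]
    if not counts:          # dead end: this word was never followed by anything
        return None
    words = list(counts)
    # Rescale the raw counts by temperature before sampling
    weights = [counts[w] ** (1.0 / temperature) for w in words]
    return rng.choices(words, weights=weights)[0]

# Generate a short "sentence" by repeatedly continuing from the last word
random.seed(0)
sentence = ["the"]
for _ in range(6):
    w = next_word(sentence[-1])
    if w is None:
        break
    sentence.append(w)
print(" ".join(sentence))
```

Raising counts to the power 1/temperature is one simple way to sharpen or flatten the distribution; real LLMs do the analogous rescaling over logits from a neural network rather than over bigram counts.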

The technology is actually impressive and groundbreaking

First, a comment about the technology. It’s evident that LLMs are more than a “next word predictor.” The Transformer model architecture that underpins ChatGPT appears to understand the structure of language and the “meaning” of a prompt. This gives computers new superpowers to summarize, query and create text (and code). It’s likely that this affects jobs and workflows that primarily involve language. But beyond this, and despite the hyped demos, it remains to be seen whether generative AI significantly changes how knowledge work is done.

But it’s important to remember that LLMs are only a small subset of ML

Moreover, I think GPT3/4 has overshadowed some of the interesting work happening in other areas of ML that could be more impactful. ML can be used to find patterns in anything that can be represented numerically, not just written language. The applications here are numerous–from voice recognition to predictive analytics. With this framework, at 1984, we’ve backed companies like Deepscribe, an AI-powered tool that transcribes patient visits and saves doctors 3 hours a day, and Syrup, an AI-powered inventory manager that analyzes historical sales to help retailers reduce excess inventory.

Implications as an investor

As an investor who eschews hype, I’m proceeding cautiously. On the one hand, it's evident that end users of products will benefit tremendously. Content creation has been made 10x easier and there are already clear use cases across categories like generating marketing copy and summarizing content. But this unlock of content creation is the same reason why investing in the category can be dangerous. For example, in this current Y Combinator batch, there are already 5+ companies building a B2B customer support app! And 50+ of the ~250 companies in the batch are working on a generative AI product.

In a world where the hard work (of training and open sourcing the model) has already been done, competition in each category will explode. And competition isn’t limited to startups. Unlike in previous eras, the incumbents are not slow moving. In fact, players like Notion (which I’m using to write this!) have been some of the quickest to incorporate this functionality into their products. Generative AI has made for magical demos and promises to make tasks better, faster and easier. But as an investor, I’m skeptical: without another advantage (such as proprietary distribution or cheaper customer acquisition), startups are going to get lost in the crowd. Consumer surplus doesn’t necessarily translate into outsized investing returns.

*One caveat to the above: if you aren’t intending to create a venture-scale business, there is a massive opportunity for a small team (or an individual) to build a business that generates substantial revenue relative to its costs. It could be the best time ever to create a “lifestyle” business (I hate the term, but don’t know of an alternative phrase).