2023 Predictions
Last year, I focused my 2022 predictions on becoming a better decision-maker. In 2023, the theme is loss functions.
In optimization, the loss function serves as a proxy for the underlying performance measurement. It's often one of the most important components of any machine learning system.
It's also sometimes referred to as the cost function, objective function, error function, or reward function, depending on what you're doing. Those terms capture the essence of what I'd like to get out of my 2023 predictions – a measured error and a way to derive a path to improvement (i.e., a metaphorical gradient).
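To make the metaphor concrete, here's a minimal sketch of a loss function and a gradient step (the model, data, and learning rate are all illustrative):

```python
# A minimal loss function (mean squared error) for a one-parameter
# linear model, plus the gradient step that improves it.
def mse_loss(w, xs, ys):
    # Average squared error between predictions w*x and targets y.
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(ys)

def gradient_step(w, xs, ys, lr=0.01):
    # Analytic gradient of the MSE with respect to w.
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(ys)
    return w - lr * grad  # move against the gradient: lower loss

# Toy data generated by y = 2x; each step should pull w toward 2.
xs, ys = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]
w = 0.0
for _ in range(100):
    w = gradient_step(w, xs, ys)
print(round(w, 3), round(mse_loss(w, xs, ys), 6))  # w ≈ 2.0, loss ≈ 0
```

The measured error tells you how far off you are; the gradient tells you which direction to move. That's the shape I want these predictions to have.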
Private Equity comes for SaaS – For anyone watching Thoma Bravo, this is already starting to happen. SaaS multiples will continue to readjust, and many companies will be left without a clear path forward – no window to IPO, no additional capital at their current (e.g., 2021-high) valuations, but with real intrinsic value and revenue. Even where these deals do happen, it will be hard to judge in 2023 whether they were a good move (for founders or for PE).
Questions:
Will this continue the trend of mass layoffs at tech companies? How overstaffed are tech companies? With Musk's policies at Twitter and a PE wave, I imagine we'll find out.
Loss function: How many public or growth-stage companies get acquired by private equity? Is it measurably different from previous years?
A wave of new generative AI companies and the march to monetization for existing ones – ChatGPT, Stable Diffusion, and GPT-3 have created renewed excitement in AI. There has been a large amount of capital flowing to startups in these areas, so we'll see products in this space (companies funded are a leading indicator of products launched). Meanwhile, existing products that have captured attention (pun intended) but not monetization will inevitably have to monetize (OpenAI, Stability AI).
Questions:
How successful will API monetization be? We'll get data on how value accrues in the generative AI stack. My prediction: distribution always wins. APIs are easy to copy, and data is not the new oil it's made out to be (e.g., publicly scraped data, synthetic data, and more). Interfaces are more important than raw parameters (see Jasper and ChatGPT vs. raw GPT-3).
In a way, this is closely linked to the question of foundational models. Do a few models form the basis of most specific use cases (i.e., is everything simply fine-tuned on GPT-3)? My prediction: foundational models might still reign in 2023, but anything important enough might not only fork but be completely separate from the main models (i.e., train a new model from scratch that's smaller, more specific, and more cost-efficient; see the rough cost sketch after this list).
That answer will ultimately inform the AI infrastructure companies. The last generation of AI infrastructure companies found the landscape completely changed in only a few years. Feature engineering isn't as important in a world of foundational models, which removes the need for a feature store. Training and inference are largely undifferentiated, which suggests MLOps is probably more convergent than divergent.
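On the "train from scratch" point, a rough back-of-envelope sketch using the common FLOPs ≈ 6 × parameters × tokens approximation (every number below is an illustrative assumption, not a measured figure):

```python
# Rough training-compute comparison using the common approximation
# FLOPs ≈ 6 * parameters * training tokens. All counts are illustrative.
def train_flops(params, tokens):
    return 6 * params * tokens

large_model = 175e9        # GPT-3-scale parameter count
finetune_tokens = 1e9      # small domain-specific corpus

small_model = 1e9          # purpose-built, narrower model
scratch_tokens = 20e9      # ~20 tokens per parameter, Chinchilla-style

print(f"fine-tune the large model: {train_flops(large_model, finetune_tokens):.2e} FLOPs")
print(f"train the small one fresh: {train_flops(small_model, scratch_tokens):.2e} FLOPs")
# ~1.05e21 vs ~1.2e20: roughly an order of magnitude cheaper to train,
# and far cheaper to serve at inference time.
```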
LLMs fine-tuned on code enable developer productivity in various places – I'm very excited about the application of LLMs to developer workflows. First, chain-of-thought reasoning in LLMs improved dramatically when code was introduced into the training data (OpenAI's Codex). See How Does GPT Obtain its Ability? Tracing Emergent Abilities of Language Models to their Sources for a good explanation. I imagine GitHub Copilot is already bringing in significant revenue (even for GitHub). A few thoughts on how this could play out this year.
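For a sense of what these workflows look like at the API level, a minimal sketch of a Codex-style completion call with the openai Python library as it existed at the time (the model name, parameters, and API surface all change often, so treat this as illustrative):

```python
# Sketch of a code completion request against OpenAI's Codex model
# (code-davinci-002 at the time of writing).
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

prompt = "# Return the nth Fibonacci number\ndef fib(n):"
response = openai.Completion.create(
    model="code-davinci-002",
    prompt=prompt,
    max_tokens=64,
    temperature=0,     # keep completions deterministic for code
    stop=["\n\n"],     # stop at the end of the function body
)
print(prompt + response["choices"][0]["text"])
```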
Other predictions and questions
Will AWS launch a native WebAssembly product? There's already Lambda and Lambda@Edge, which can theoretically run WASM or WASM-like programs. An AWS product is the only thing I believe can push WebAssembly forward server-side; otherwise, I don't think the market is big enough.
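For context on what running WASM server-side involves, a minimal sketch with the wasmtime Python bindings (pip install wasmtime); the module and exported function are toy examples, and a native AWS product would presumably wrap a runtime like this:

```python
# Run a WebAssembly module server-side with the wasmtime bindings.
from wasmtime import Store, Module, Instance

# A tiny module in WAT text format exporting an `add` function.
wat = """
(module
  (func (export "add") (param i32 i32) (result i32)
    local.get 0
    local.get 1
    i32.add))
"""

store = Store()
module = Module(store.engine, wat)        # compile from WAT text
instance = Instance(store, module, [])    # no imports needed
add = instance.exports(store)["add"]
print(add(store, 2, 3))  # 5
```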
What regulation comes out of the FTX collapse and the SBF trial? Whatever the regulation is, it will most likely be big enough to change the crypto landscape.
What happens to Twitter? As I wrote in Improving Twitter at the time of the acquisition, I don't believe that Musk can switch to subscription revenue quickly enough. Even with more eyeballs, advertising is unlikely to move the needle. So what happens to Twitter? Fire sale? I imagine Musk still has a few tricks up his sleeve to create controversy and drive engagement, but time is running out.
Which is more resilient in a downturn: usage-based or seat-based SaaS? Lots of counter-intuitive results so far, but it's too early to tell. The answer will have interesting implications for selling SaaS going forward.
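A toy model of why the answer is assumption-dependent: seat revenue tracks headcount cuts, usage revenue tracks workload cuts, and the elasticities below are entirely made up for illustration:

```python
# Toy downturn model: seat-based revenue falls with headcount cuts,
# usage-based revenue falls with workload cuts. Both cut percentages
# are made-up assumptions purely for illustration.
seats, seat_price = 100, 50.0          # $/seat/month
usage, unit_price = 10_000, 0.50       # $/unit/month

headcount_cut = 0.20   # assume 20% layoffs
workload_cut = 0.35    # assume workloads are cut harder than people

seat_revenue = seats * (1 - headcount_cut) * seat_price
usage_revenue = usage * (1 - workload_cut) * unit_price
print(f"seat-based:  ${seat_revenue:,.0f}/mo")
print(f"usage-based: ${usage_revenue:,.0f}/mo")
# Flip the two cut assumptions and the ranking flips with them, which
# is why the early results look counter-intuitive.
```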
Do incumbents successfully integrate LLMs into their products (e.g., Excel, Word), do growth companies with somewhat mature products integrate them and move the needle (e.g., Notion, Airtable), or does it require a completely new product? My prediction: integrations will be easily copied by whoever has the best distribution, and breakthroughs will only happen when generative AI completely changes the product (or, better yet, counter-positions it).