State of AI Report 2023

  • We have covered this series of reports before.
  • The latest 2023 report is worth a flick (all 160 slides).
  • This graph, for example, shows the largest Nvidia H100 chip clusters – interesting to see TSLA there, which also runs the 4th-largest A100 cluster in the world.
  • Or see Slide 76, which suggests that Nvidia’s advantage (the use of its chips in academic papers) continues to increase.

History of Powerpoint

  • Fascinating history of how we all ended up doing PowerPoint slides all the time.
  • The creator of PowerPoint also has a colourful story – “It’s hard now to imagine deafening applause for a PowerPoint—almost as hard as it is to imagine anyone but Bob Gaskins standing at this particular lectern, ushering in the PowerPoint age. Presentations are in his blood. His father ran an A/V company, and family vacations usually included a trip to the Eastman Kodak factory. During his graduate studies at Berkeley, he tinkered with machine translation and coded computer-generated haiku. He ran away to Silicon Valley to find his fortune before he could finalize his triple PhDs in English, linguistics, and computer science, but he brought with him a deep appreciation for the humanities, staffing his team with like-minded polyglots, including a disproportionately large number of women in technical roles. Because Gaskins ensured that his offices—the only Microsoft division, at the time, in Silicon Valley—housed a museum-worthy art collection, PowerPoint’s architects spent their days among works by Frank Stella, Richard Diebenkorn, and Robert Motherwell.” 

Training vs. Inference and GPU Demand

  • Everyone is trying to figure out what AI means for GPU demand.
  • It’s hard because the true picture is muddied by providers investing in their own customers, demand pull-forward, and strategic buying ahead of having a real use case (see Saudi Arabia, the UAE, and the U.K.).
  • Confounding all this is Meta releasing Llama 2 for almost free, followed most recently by its coding version (by far the most useful application of AI so far).
  • This matters because training is far more GPU-intensive than inference, and free models mean less training is needed. That specifically matters for Nvidia’s H100 chip (which, by the way, weighs over 30 kg!).
  • Qualcomm actually thinks processing might happen right in our phones (they of course would benefit most from this).
  • “Eventually, a lot of the AI processing will move over to the device for several use cases. The advantages of doing it on the device are very straightforward. Cost, of course, is a massive advantage. It’s — in some ways, it’s sunk cost. You bought the device. It’s sitting there in your pocket. It could be processing at the same time when it’s sitting there. So that’s the first one. Second is latency. You don’t have to go back to the cloud, privacy and security, there’s data that’s user-specific that doesn’t need to go to the cloud when you’re running it on the device. But beyond all of these, we see a different set of use cases playing out on the device.” Qualcomm CFO Akash Palkhiwala (via The Transcript).

Podcast Business and Spotify

  • Podcast advertising leads to pretty good returns for brands.
  • A study of 250 advertisers and marketers found that two-thirds (67%) of podcast ad buyers say every $1 spent on podcasts returns between $4 and $6 for their brands.
  • Yet SPOT is struggling to capture this – why?
  • This blog post covers a lot of reasons. For example:
  • “[What’s] most misunderstood about Spotify is Spotify doesn’t get to monetize all the podcast content that they have. So in the most recent quarterly earnings report, they say that they had 5 million podcasts on their platform, but 99.9% of those podcasts, Spotify does not get to monetize.”

Understanding AI

  • Economics says change happens through the adjustment of prices and relative prices.
  • “Thanks to GPT, every programmer has the potential to be 10x more productive than the baseline from just 2 years ago.”
  • This means:
  • (1) The combined price of data and software is collapsing, opening up enormous “volume growth”.
  • (2) Within it, the relative value of software vs. data, especially unique data sets, is shifting in favour of the latter (media companies?).
  • (3) New scarcity is arising – likely in hardware and energy.
  • Full article here.

AI Moat

  • This leaked memo from Google has been doing the rounds over the last two weeks.
  • The gist is that no one has a moat in AI.
  • The arguments boil down to the idea that there has been so much innovation that open source will win.
  • Ben Thompson lists a few counterarguments about why this might not be true.
  • The other point, mentioned by a friend, is that experience in dealing with intellectual property infringement could be a key competitive advantage.

Getting to Grips with AI

  • Since ChatGPT burst onto the scene, a lot of investors have been scrambling to work out the implications.
  • One interesting argument we already covered here.
  • The other way to approach it is to see what industry experts are saying – this post has a good collection.
  • If you want to read the actual transcripts, grab a free two-week trial here.