GPU Semiconductor Content

  • Nice slide from KLA.
  • “So I tried just to put together this chart to show how different the GPU package is between 2015 & 2024. So of course, the B100 chip, the GPU introduced a few months ago, and this is not enough because Jensen has already introduced the next generation of GPU last week” (h/t The Transcript).

95 Theses on AI

  • Interesting list.
  • Decisions made in the next decade are more highly levered to shape the future of humanity than at any point in human history.
  • Technological transitions are packaged deals, e.g. free markets and the industrial revolution went hand-in-hand with the rise of “big government” (see Tyler Cowen on The Paradox of Libertarianism).
  • Natural constraints are often better than man-made ones because there’s no one to hold responsible.

AI = Insane Energy Demand

  • “Boston Consulting Group believes that AI and regular data center demand will grow to 7% of total electricity demand by 2030. To put this in context, this is the equivalent of the electricity used for lighting in every home, business, and factory across the United States. It’s a huge amount of energy. Most traditional data centers that were built 10 years ago were 10 megawatts or less. Today, it’s not uncommon to see 100-megawatt data centers. And with our clients, we’re talking about data centers that approach 1,000 megawatts. And they require 24/7 power. This is something that doesn’t get talked about enough in my opinion.”
  • That is from Constellation Energy’s CEO Joseph Dominguez (Source: The Transcript) – who of course is talking his own book, but still. (A rough sanity check of these numbers is sketched after this list.)
  • Others confirm this, like the IEA – which projects that electricity demand from data centers and AI could double by 2026 – “that’s equivalent to adding a new heavily industrialized country like Germany to the planet”.
  • There are other huge environmental impacts – e.g. water.
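
As a rough sanity check on those numbers, here is a minimal sketch. The ~4,000 TWh/year figure for total US electricity consumption is my own assumption, not from the quote; everything else follows from the quoted claims:

```python
# Rough sanity check on the data center energy claims above.
# Assumption: total US electricity consumption is ~4,000 TWh/year.
US_TOTAL_TWH = 4_000

# BCG's claim: AI plus data centers reach ~7% of total demand by 2030.
ai_share_twh = 0.07 * US_TOTAL_TWH       # ~280 TWh/year

# One 1,000 MW campus running 24/7 for a full year:
campus_twh = 1_000e6 * 24 * 365 / 1e12   # watts x hours -> TWh, ~8.76

print(f"7% of US demand:     {ai_share_twh:.0f} TWh/year")
print(f"One 1,000 MW campus: {campus_twh:.2f} TWh/year")
print(f"Equivalent campuses: ~{ai_share_twh / campus_twh:.0f}")
```

On those assumptions, 7% of US demand works out to roughly 280 TWh/year, or on the order of 30 of the 1,000-megawatt campuses Dominguez describes.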

Altman’s $7 Trillion

  • Back-of-the-envelope analysis of why Altman wants this sum to build semiconductor capacity and why it isn’t such a crazy number (a rough version of that arithmetic is sketched after this list).
  • It’s a useful reminder of what it will take for AI to scale in the coming years.
  • The article also links to some more serious analyses of the trend in AI training costs (like this).
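
For a feel of the orders of magnitude, here is one hedged back-of-envelope sketch. The $20B fab cost and the $100M training-run baseline with annual doubling are illustrative assumptions of mine, not figures from the article:

```python
# Illustrative back-of-envelope on the scale of $7 trillion.
# Assumption: a leading-edge fab costs on the order of $20B to build.
TOTAL_USD = 7e12
FAB_COST_USD = 20e9
print(f"Leading-edge fabs: ~{TOTAL_USD / FAB_COST_USD:.0f}")  # ~350

# Assumption: frontier training runs start near $100M today and the
# cost roughly doubles each year (the linked analyses estimate a
# steep exponential; the exact rate varies by study).
cost = 100e6
for year in range(2024, 2031):
    print(f"{year}: ~${cost / 1e9:.1f}B per frontier training run")
    cost *= 2
```

Even these crude numbers suggest the sum, while eye-watering, is at least the right order of magnitude for hundreds of fabs plus exponentially growing training runs.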

AI and Healthcare

  • Healthcare is one area where the application of AI, in its LLM and other forms, could be enormous.
  • This nice article from AlphaSense Expert Insights explores the topic, mirroring the huge rise in expert calls in the sector mentioning the term.
  • Not all areas can be bent to the will of ML, though. As this piece argues, the academic literature and its corresponding knowledge graph are both difficult to program against and not that useful.
  • If you want to read some of these transcripts, you can grab a two-week free trial.

State of AI Report 2023

  • We have covered this series of reports before.
  • The latest 2023 report is worth a flick (all 160 slides).
  • This graph, for example, shows the largest Nvidia H100 chip clusters – interesting to see TSLA there, who also run the 4th largest A100 cluster in the world.
  • Or see Slide 76 which suggests that Nvidia’s advantage (the use of its chips in academic papers) continues to increase.

Eric Schmidt on AI

  • This interview between Goldman Sachs Chairman/CEO David Solomon and former Google CEO/Chairman Eric Schmidt on the future of generative AI is worth a read.
  • In general, the disruption occurs first in the industries that have the most amount of money and the least amount of regulation.
  • Pairs nicely with this analysis of the latest batch of Y Combinator AI/ML startups (139 in total!) and the areas they are working on.

Chief AI Scientist at Meta Talk

  • Yann LeCun is a Turing Award-winning professor at NYU and Chief AI Scientist at Meta.
  • His latest talk at MIT is worth a listen.
  • It is a bit technical but left me with a feeling that though LLMs are a big breakthrough, they have big limitations.
  • Models beyond the autoregressive LLM, ones that start to mimic the planning and reasoning required to rival human intelligence, are a lot more complicated, and the solutions so far are not so neat.
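
For context, “autoregressive” means the model commits to one token at a time, conditioned only on what it has already emitted. A minimal sketch of that loop, with a random stand-in for the actual network (the model and vocabulary here are toys, not anything LeCun proposes):

```python
import numpy as np

# Toy autoregressive generation: each token is sampled conditioned
# only on the tokens generated so far, one commitment at a time.
VOCAB = 50                      # toy vocabulary size
rng = np.random.default_rng(0)  # sampling randomness

def next_token_logits(context: list[int]) -> np.ndarray:
    # A real LLM would run a transformer over `context`; this
    # stand-in just derives deterministic pseudo-logits from it.
    local = np.random.default_rng(sum(context) + len(context))
    return local.normal(size=VOCAB)

def generate(prompt: list[int], steps: int) -> list[int]:
    tokens = list(prompt)
    for _ in range(steps):
        logits = next_token_logits(tokens)
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()    # softmax over the vocabulary
        tokens.append(int(rng.choice(VOCAB, p=probs)))
    return tokens

# Errors compound: each step commits to a single token with no
# explicit plan for the sequence as a whole.
print(generate([1, 2, 3], steps=5))
```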

Understanding AI

  • Economics says change happens through the adjustment of prices, and especially relative prices.
  • “Thanks to GPT, every programmer has the potential to be 10x more productive than the baseline from just 2 years ago.”
  • This means:
  • (1) The combined data-and-software price is collapsing, opening enormous “volume growth” (a toy version of this arithmetic follows the list).
  • (2) Within it, the relative value of software vs. data, especially unique data sets, is shifting in favor of the latter (Media companies?).
  • (3) New scarcity is arising – likely in hardware and energy.
  • Full article here.
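
A toy version of the arithmetic behind point (1). The 10x productivity figure comes from the quote above; the demand elasticity is a hypothetical assumption of mine, purely for illustration:

```python
# Toy relative-price arithmetic for the argument above.
# If a programmer becomes 10x more productive, the effective price
# of a unit of software output falls to ~1/10 of what it was.
old_price = 1.0
productivity_gain = 10
new_price = old_price / productivity_gain        # 0.1

# Hypothetical assumption: constant demand elasticity of -1.2.
elasticity = -1.2
volume_growth = (new_price / old_price) ** elasticity

print(f"Effective price:  {new_price:.2f}x the old price")
print(f"Implied quantity: ~{volume_growth:.0f}x the old quantity")
```

With those numbers, a 10x productivity gain implies roughly a 16x jump in quantity demanded, which is the “volume growth” the article gestures at.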

AI Moat

  • This leaked memo from Google has been doing the rounds for the last two weeks.
  • The gist is that no one has a moat in AI.
  • The arguments boil down to the idea that there has been so much innovation that open source will win.
  • Ben Thompson lists a few counterarguments about why this might not be true.
  • The other point, mentioned by a friend, is that experience handling intellectual property infringement may itself be a key competitive advantage.

How does ChatGPT work?

  • In the spirit of Feynman, this superb blog post, by none other than Stephen Wolfram, gives a lucid explanation of what is going on under the hood of the latest tech phenomenon.
  • The short answer is “it’s maths”.
  • “But in the end, the remarkable thing is that all these operations—individually as simple as they are—can somehow together manage to do such a good “human-like” job of generating text. It has to be emphasized again that (at least so far as we know) there’s no “ultimate theoretical reason” why anything like this should work. And in fact, as we’ll discuss, I think we have to view this as a—potentially surprising—scientific discovery: that somehow in a neural net like ChatGPT’s it’s possible to capture the essence of what human brains manage to do in generating language.”
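
For a glimpse of the “maths” in question, here is the final step of next-token generation stripped to its barest form: one matrix multiply and a softmax (toy sizes and random weights; a real model stacks many transformer layers before this step):

```python
import numpy as np

# The final step of next-token generation, stripped to essentials:
# a hidden state is projected onto the vocabulary and a softmax
# turns the scores into a probability distribution over tokens.
rng = np.random.default_rng(42)
D, VOCAB = 8, 20                      # toy embedding size and vocabulary

hidden = rng.normal(size=D)           # state after the transformer layers
W_out = rng.normal(size=(D, VOCAB))   # learned output projection

logits = hidden @ W_out
probs = np.exp(logits - logits.max())
probs /= probs.sum()                  # softmax: scores -> probabilities

next_token = int(np.argmax(probs))    # greedy choice of next token
print(f"Next token id: {next_token}, p = {probs[next_token]:.3f}")
```

Wolfram’s post spends most of its length on how that hidden state gets produced; the final sampling step really is just arithmetic like this.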