
Posted by Nicholas Mersch on Apr 8th, 2024

The Year of the Semiconductor

When a man in a leather jacket who works in semiconductors can pull crowds like it's a T-Swift concert, you know The Times They Are a-Changin'. Of course, I'm referring to Nvidia CEO Jensen Huang and his keynote address at GTC a couple of weeks ago. He packed the San Jose Sharks arena but talked about a different type of hockey stick in his Steve-Jobs-like address. He spoke not about Bauer, Warrior, or Sherwood but about the exponential hockey-stick growth curve in compute efficiency when it comes to training and using (inferencing) AI models.

Key takeaways:

  • The tech market is pivoting towards semiconductors, driven by their central role in AI advancements. This reverses the previous trend that favoured software models and puts hardware at the forefront of technological value and investment.
  • Despite substantial investments in AI infrastructure, the AI industry's revenue is lagging far behind, presenting a pressing monetization challenge that echoes the early, uncertain days of mobile app development.
  • Nvidia's new Blackwell (B100) series marks a monumental leap in AI efficiency, signalling a speed of improvement that goes far beyond Moore's law.

Source: Nvidia GTC Conference

The big unveil was Nvidia's new Blackwell (B100) series that, in tandem with its full system server, is touted to be 30x better at inference and 4x better at training than its predecessor, the H100. As a reminder, when the H100 came out, it was also 30x better at inference and 9x better at training than its predecessor, the A100. This is not Moore's law, where we see a 2x improvement every year or so. This is sea-change innovation happening at breakneck speed.
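To put those generational leaps in context, here is a minimal sketch that annualizes them against the Moore's-law baseline the post cites. The 30x-per-generation and ~2x-per-year figures come from the post; the two-year release cadence between GPU generations is an assumption for illustration.

```python
# Compare Nvidia's generational inference gains to a Moore's-law baseline.
# 30x per generation and ~2x/year are figures from the post; the ~2-year
# cadence between GPU generations is an assumption for illustration.

def annualized_gain(total_gain: float, years: float) -> float:
    """Convert a total multiplicative gain into an annualized rate."""
    return total_gain ** (1 / years)

years_per_generation = 2      # assumed A100 -> H100 -> B100 cadence
generational_gain = 30        # 30x inference gain per generation (per the post)
moores_law_annual = 2         # ~2x per year baseline cited in the post

implied_annual = annualized_gain(generational_gain, years_per_generation)
print(f"Implied annualized inference gain: {implied_annual:.2f}x/year")
print(f"Moore's-law baseline:              {moores_law_annual:.2f}x/year")
```

Even under a conservative cadence assumption, the implied annual rate comes out well above the Moore's-law doubling the post references.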

Think about it like this. Theories are nice in a vacuum. You can crank out as much Good Will Hunting-style math on chalkboards as you’d like, but if you are not able to test theories around the movement of electrons that push boundaries of what we previously thought possible, theories are about as useful as a farting app on your iPhone. We have now reached a level of physical infrastructure computing capability where we can stress-test new and transformational ideas.

All of this, of course, still needs to get played out. We do not have a clear picture as to how use cases and business models will unfold around artificial intelligence. As "software ate the world," will "AI eat software"? Right now, the math is not yet mathing. We spent about $50B on GPUs from Nvidia last year to build out the infrastructure compute layer of AI, yet we only have $3B of AI revenue across the entire industry. We still haven't figured out the monetization layer.

These early stages of AI end-use cases remind me of the early novel apps on your iPhone. We had this incredible new technology plus an app store, but we had no idea what to do with it. Gimmicky apps like calculators and flashlights eventually became widgets as we ushered in the mobile era and figured out the app layer through social media/entertainment (Facebook, Snapchat, Instagram, YouTube, Spotify, etc.) and productivity apps (Gmail, Outlook, Calendars, etc.). We're not quite there yet with AI, and we're just getting started on the app layer.

While semiconductors are witnessing a step-function change in revenue, other areas of the tech industry have been very slow off the starting blocks. "Hardware" used to be a bad word in technology. It was associated with high manufacturing costs, low gross margins on unit sales, cyclicality, and the perception that it would eventually be commoditized. These companies were thrown aside for SaaS models, which were infinitely scalable, had high gross margins, and were incredibly sticky. How the turntables have turned – software is struggling, and semiconductors seem unstoppable (for now).

Hence, the market has hereby declared its holy decree that 2023, 2024, and maybe the next three years shall be dubbed the “Years of the Semiconductor.” Don't get left behind.

Semiconductor Mania

Any time a breakthrough technology is ushered in, value and revenue recognition happen in waves of deployment. We are great proponents of studying the past, as history doesn't repeat itself, but it often rhymes. The corollary we are focusing on is the CAPEX buildout cycle that created the necessary preconditions for cloud and mobile technology to flourish.

Source: Morgan Stanley

We believe we will follow a similar path, except the timeline will be compressed due to the exponential growth we are seeing in AI. Nvidia’s data centre quarterly numbers speak for themselves, and they have already single-handedly started the semiconductor CAPEX cycle.

Source: Nvidia, Statista

Additionally, we’re starting to see more of the tertiary infrastructure buildout come online. For example, Vertiv is one of the best pure-play ways to gain exposure to the power and thermal management of data centres. Liquid cooling is a small percentage of total revenue right now, but the company is expecting a 50%+ CAGR in the space. We must also remember to invest along the entire value chain when it comes to semis, not just Nvidia or AMD. While Nvidia may be the arms dealer in the war for AI, semicap equipment providers (like LRCX, KLAC, AMAT, and ASML) are the weapons facilities. The machines that make the machines.

Source: @IvanaSPEAR

While there is no shortage of hype around semiconductors, we believe there is still significant room to run. However, we are not blindly all-in and will change our minds when the facts and fundamentals change. What will change our minds? We see three potential threats to fundamentals:

1) Margin compression: While Nvidia’s stock price skyrockets, it continues to become cheaper on a P/E basis every time it prints a quarter and pushes out its goalposts. With 75% gross margins and 45% FCF margins, it is evident how much pricing power Nvidia has. If supply constraints ease, so will Nvidia’s margins. Margins, paired with revenue growth, are the line items to watch for forward sustainability. No CAPEX buildout cycle lasts forever at this level of profitability.

2) Competition: We are starting to see more competition from Hyperscalers (Amazon, Google, Microsoft) building their own silicon and startups like Groq building more application-specific semis. Market players are salivating over Nvidia’s profitability profile. As Bezos famously said, “Your margin is my opportunity.”

3) Technology risk: We’re still in the very early days of frontier large language models (LLMs) and even earlier days of AI that isn’t an LLM. Right now, there is an insatiable demand for compute. This could change if there is an evolution in the process behind how AI works (e.g., a successor to the transformer architecture). While this is unlikely over the short term, a new and more efficient process for approaching AI could lower the overall demand for compute.
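The P/E dynamic in the first point above can be made concrete with a little arithmetic: a stock gets cheaper on a price-to-earnings basis, even while its price rises, whenever earnings grow faster than the price. The numbers in this sketch are hypothetical, chosen only to illustrate the mechanic.

```python
# Sketch of multiple compression: the P/E ratio falls even as the price
# rises, so long as earnings grow faster. All numbers are hypothetical.

def pe_ratio(price: float, eps: float) -> float:
    """Price-to-earnings multiple."""
    return price / eps

# Hypothetical: the share price doubles while earnings per share triple
pe_before = pe_ratio(price=500.0, eps=10.0)   # 50x
pe_after = pe_ratio(price=1000.0, eps=30.0)   # ~33x

print(f"P/E before: {pe_before:.1f}x")
print(f"P/E after:  {pe_after:.1f}x (multiple compressed despite a higher price)")
```

This is the sense in which a skyrocketing stock can still "become cheaper" each quarter it beats and raises.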

Software’s Messy Quarter

While on the surface it looks like all of technology is off to a torrid start to the year, software companies have struggled. We saw numerous gap-downs after earnings where companies largely “beat and lowered,” meaning guidance was very soft. Mr. Market was not kind to these companies. The market pays premium multiples for companies that are growing. When companies stop growing, the market stops paying up for them. Simple as that. What we have experienced over the last six quarters is a “growing up” of software companies. Growth has moderated significantly while margins have improved. This has created a short-term holding pattern where we believe we are now able to buy software at reasonable multiples.

Source: Altimeter, @jaminball, Bloomberg, Pitchbook

Coming out of 2023, many were looking for a period of IT cost optimization to stabilize. Instead, we have not yet seen the trough. A lot of this is due to the explosion of interest in implementing generative AI into workflows, where organizations are taking a “hurry up and wait” approach when it comes to allocating their IT spending. The incremental spending that is getting added is all around generative AI applications instead of traditional analytics.

On an enterprise level, companies often start by testing a GenAI use case using a frontier LLM, which is typically more expensive. Once they master the integration into their workflow, they engineer a similar solution on AWS (or their preferred data platform). This solution uses an open-source LLM, a key element in optimizing cost while maintaining the benefits of GenAI integration.

An important concept to understand here is data gravity. Data gravity is the idea that companies want to implement GenAI on top of where their data already sits. This is a “switching cost,” which acts as a moat that AWS, GCP, and Azure have already established. Software companies will need to carve out their niche within this space to prevent churn and reaccelerate growth.

Our investment thesis in the fund is to capture a full-stack layered approach to the AI buildout cycle. However, our thesis that GenAI would lead to early revenue recognition in the infrastructure software stack was premature. We believe this will materialize in the latter half of this year and into the first half of 2025. With multiples compressed back to reasonable historical levels, we are comfortable owning these companies down here at this time. What would change our thesis is a failure of software revenue to establish a trough, which could happen if Hyperscalers move in to capture an outsized portion of GenAI revenue.

Final Thoughts

When it comes to a baseball analogy of what inning we are in for AI, we haven’t even pulled into the parking lot yet. Companies are still trying to figure out:

  1. How to implement this technology into their data strategy,
  2. How the monetization schedule will work as model costs are optimized, and
  3. How to de-risk their AI strategy so that one supplier does not have outsized pricing power.

We are now at an inflection point in AI where our computational capacity has evolved to the point where we are able to test new theories that were previously inconceivable. In an area that is moving rapidly, investors must critically analyze where value will accrue in the system. As the industry changes, so will our holdings.

It is time to get smart on AI or get left behind.

–Nick Mersch, CFA, Portfolio Manager

Commissions, trailing commissions, management fees and expenses all may be associated with investment fund investments. The prospectus contains important detailed information about the investment fund. Please read the prospectus before investing. There is no assurance that any fund will achieve its investment objective, and its net asset value, yield, and investment return will fluctuate from time to time with market conditions. Investment funds are not guaranteed, their values change frequently, and past performance may not be repeated.

This information is provided for illustrative and discussion purposes only. This material is not intended as a formal research report and should not be relied upon as a basis for making an investment decision. Historical trends do not imply, forecast or guarantee future results. Information is as of the date indicated and subject to change without notice. Nothing herein constitutes a prediction or projection of future events or future market behavior.

The information is not investment advice, nor is it tailored to the needs or circumstances of any investor. Information contained in this document is believed to be accurate and reliable, however, we cannot guarantee that it is complete or current at all times. The information provided is subject to change without notice.


Nicholas Mersch, CFA

Nicholas Mersch has worked in the capital markets industry in several capacities over the past 10 years, including private equity, infrastructure finance, venture capital, and technology-focused equity research. In his current capacity, he is an Associate Portfolio Manager at Purpose Investments focused on long/short equities.

Mr. Mersch graduated with a Bachelor of Management and Organizational Studies from Western University and is a CFA charterholder.