The catch is that there are >10 startups and big-tech teams trying to do what OpenAI is doing, and at least a few of them appear to be competitive. One or two of them may emerge as a winner. But there's only one Nvidia.
With that said, I do believe Nvidia is overvalued - even if it tripled its profit, its PE ratio would still be 30 (i.e. 30 years to return on investment), while there's a fairly good chance someone would catch up to them in 10-20 years.
Thanks, I stand corrected. I guess at 35 times latest earnings it's more reasonably valued. Nvidia's growth in just the past year was stronger than I expected. At this level we need more nuanced arguments.
The next thing to explain the discrepancy between Nvidia's valuation and OpenAI's would be that Nvidia's monopoly position effectively eats into the profits of the AI startups for the foreseeable future. Had OpenAI already been profitable, its valuation would have exceeded $86B.
> I guess at 35 times latest earnings it's more reasonably valued
If you are willing to just extrapolate from the latest quarter, which I think is reasonable for NVidia, it is valued at around 40 times the current estimated earnings for this quarter (annualized), which isn't too overvalued.
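As a rough sketch of that annualization: divide market cap by four times the quarterly earnings. The figures below are illustrative placeholders, not actual Nvidia numbers (the ~$1.8T market cap is the figure mentioned downthread).

```python
# Back-of-the-envelope P/E from a single quarter, annualized.
# Both inputs are assumed placeholders, not real Nvidia figures.

market_cap = 1800e9        # assume ~$1.8T market cap (figure mentioned downthread)
quarterly_earnings = 11e9  # assume ~$11B estimated net income for the quarter

# Annualize the quarter (x4), then take price over earnings.
annualized_pe = market_cap / (quarterly_earnings * 4)
print(round(annualized_pe, 1))  # ~40.9 with these placeholder inputs
```

With placeholder inputs in that ballpark, the multiple lands near the "around 40 times" figure above.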
The bigger thing I worry about with NVidia is not current earnings but the possibility that those earnings won't last, whether because the AI wave fades or because competitors enter the market and erode margins.
I disagree. Google have their tensor chips – whether they work well enough for those outside of Google is somewhat irrelevant, they clearly work for Google, who are going to be one of the major players in serving AI for the foreseeable future (I'm biased, but it seems clear to me). Microsoft have their own chips on the way, rumoured to arrive this year. Amazon have their own chips on the way, rumoured to have been in development for a number of years.
All 3 of these companies sell Nvidia hardware on their cloud offerings because it's what buyers want, but with Nvidia's pricing and resource constraints there is huge pressure on the cloud providers to push customers onto their own offerings. I don't expect any to stop offering Nvidia chips directly, but I'd bet that in the next few years all of their hosted value-add services will run on alternatives (i.e. hosted AI, inference, training, etc., where customers don't need to know the hardware).
Nvidia have more of a moat than OpenAI for sure, but I think Nvidia's best days are 2023/4, and that things will look very different soon.
With the pace of innovation at Google this is absolutely not guaranteed. Speaking from a historical perspective, it is much more likely that Alphabet acquires another startup or small ASIC-based company which got the interconnect part right. And still - it is not guaranteed. This Gemini thing has tried to launch like three times already, but is barely a blip in comparison with OAI. Besides, OAI is more or less MS.
I'm talking specifically about chips here. Google has been developing their tensor stuff for a while now, and it's been publicly documented as running much of their training and inference stacks. The fact that they are getting value out of it in models competitive with others suggests that the chips are basically a success, does it not?
That's fair, we don't know. I guess my point is that with all 3 of the main cloud providers having their own chip programs, and with Google having a proven track record of training/serving competitive LLMs on theirs, I'm not that bullish about Nvidia at anywhere near the current prices.
I think this is most likely a temporary blip. GPUs were a bit of a commodity 5 years ago, with Nvidia, AMD, and Intel all producing reasonable stuff. Large AI accelerator chips weren't much of a market ~5 years ago; Nvidia were first to take it, but in a few years' time they'll also be back to commodity status.
Nvidia have a small moat with CUDA, but their eye-watering prices are a huge incentive for users to try alternatives, and ultimately the current price is built on them being the only provider of GPUs with 40/80GB of memory. That's the fundamental enabling technology, and that's not particularly tricky for competitors to replicate.
Nvidia may be the "best" AI accelerator chips on the market for years to come, but being 20% better and 20% more expensive than AMD, and all the cloud providers using their own in-house chips where they can, is not a $1.8Tn company as far as I can tell, it's much more like what Nvidia were ~5 years ago.
well, you make a fair point. let me also raise the question - why were Intel, AMD and the like so slow to develop custom APUs for ML tasks? this sounds incredibly short-sighted. i mean - the ML area has been developing steadily for at least 20 years, with vast amounts of new stuff coming since 2013 perhaps. that makes 10 years, and I believe the dev cycle for new chips is potentially around a decade.
so the question is: where are these guys' ML chips? sorry, but AVX512 is not something that provides enough juice, and apparently some smart-head at Intel decided to lock end-users out of it anyway?
because, honestly, it was not NVidia who pushed GPUs forward, but those brave CUDA devs who actually created valuable software to run on top of them - first for crypto mining, then for LLMs and NNs in general.
honestly - i'm starting to really despise this company, even though there's a 3090 Ti in my home box. and with the most recent talk given by the CEO - fingers crossed someone comes and eats their lunch, they deserve it so much.
Current valuations of companies don’t necessarily need to reflect future earnings in a smooth curve. For example, if I believe there’s a 50% chance that someone will catch up to Nvidia in 20 years, it should still be priced way above 20x, because they might not. A lot of investors might feel that way, and it might be another decade before the stock price drops significantly.
> fairly good chance someone would catch up to them in 10-20 years.
I think that's the bet. Maybe someone could catch up, but what are the chances incumbents buy new entrants or otherwise defeat them? What's the chance regulators will get involved? If you rate those favorably for NVIDIA, you'll price it higher.
There’s more hardware out there than Nvidia's, but it takes some time to unwrap the proverbial matmul from CUDA. Much more hardware is in the pipeline; it just (ahem) has to be much better than Nvidia's to make migration away from CUDA worth it. Groq is one such limited but extremely impressive example, IMHO.