Curious question. People in computer science and the software industry are super smart. They reverse engineer stuff all the time and implement their own versions. If an Nvidia H100 GPU is available for physical inspection and teardown, and the CUDA white papers are available, why don't the smart people at other super smart companies like FAANG reverse engineer it and come up with their own GPU? It's fine if it's 10x slower; they can try to make it faster in v2. What makes Nvidia GPUs so irreplaceable? I mean, if Meta can commit a bunch of money to removing the GIL in Python, can't it do something similar to create GPUs? Especially interested in hearing from HW folks at Apple, AMD, Broadcom, Qualcomm, etc.
This post was made by a leetcoder for sure. Teslas are sold to super smart people all the time; why don't they just take one apart, make their own EV, and compete with Tesla for its market share? Use your brain for more than 30 seconds, miracles might happen.
Nvidia just happened to be lucky and randomly have the right product at the right time, but only fools would believe they can milk this cash cow forever. At Google we run the vast majority of our services on TPUs, which are much cheaper to buy and to operate than GPUs. Amazon and Microsoft are also working on their own AI accelerators; they are just late to the game. It's just a matter of time before GPUs become a commodity. Until then, enjoy the ride while it lasts :-)
market is just irrational you say?
I would say that excited investors don't have a full picture, and tend to believe that certain companies can grow indefinitely. It was not long ago that Tesla was a 1B company, there was infinite demand for electric cars, Tesla's competitors were lagging 10 years behind, and Musk was on his way to dominate the world :-)
First crypto mining, then GenAI, and now both are driving this momentum. Hardware development, especially CPUs and GPUs, takes years of R&D, manufacturing, and supply chain expertise. All the other big tech companies are now chasing Nvidia, but Nvidia isn't sitting idle either; it has a head start and is leading the race. Eventually AMD and the like will level the playing field, or the demand will subside, but until then Nvidia will reign supreme.
You have to deal with the entire system as a whole. How do you connect thousands of GPUs? How do you make them work in unison? How do you deal with network delays? There are a lot of other pieces besides the GPU itself, and Nvidia has the entire stack.
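To give a flavor of what "making thousands of GPUs work in unison" actually involves, here is a toy, in-process simulation of a ring all-reduce, the bandwidth-optimal collective that libraries like NVIDIA's NCCL implement over NVLink/InfiniBand. This is purely illustrative (ranks are plain Python lists, "communication" is list copying); real implementations also overlap communication with compute and handle topology, latency, and failures.

```python
# Toy ring all-reduce: each "rank" (GPU) starts with its own vector; after the
# collective, every rank holds the elementwise sum of all vectors.
# Runs in n-1 reduce-scatter steps plus n-1 allgather steps, with each rank
# sending only one chunk per step -- the pattern NCCL uses at scale.

def ring_allreduce(buffers):
    """buffers: list of n equal-length lists, one per rank. Mutated in place."""
    n = len(buffers)
    chunk = len(buffers[0]) // n          # assume n divides the vector length
    def bounds(c):
        c %= n                            # chunk indices wrap around the ring
        return c * chunk, (c + 1) * chunk

    # Phase 1: reduce-scatter. At step s, rank r sends chunk (r - s) to rank
    # r+1, which adds it. Afterwards rank r holds the full sum of chunk r+1.
    for step in range(n - 1):
        sends = []                        # snapshot sends before applying them
        for r in range(n):
            lo, hi = bounds(r - step)
            sends.append(buffers[r][lo:hi])
        for r in range(n):
            lo, hi = bounds(r - step)
            dst = buffers[(r + 1) % n]
            for i, v in zip(range(lo, hi), sends[r]):
                dst[i] += v

    # Phase 2: allgather. At step s, rank r forwards its fully-reduced chunk
    # (r + 1 - s) to rank r+1, which overwrites its stale copy.
    for step in range(n - 1):
        sends = []
        for r in range(n):
            lo, hi = bounds(r + 1 - step)
            sends.append(buffers[r][lo:hi])
        for r in range(n):
            lo, hi = bounds(r + 1 - step)
            buffers[(r + 1) % n][lo:hi] = sends[r]
    return buffers
```

Even this toy version shows why the snapshot-then-apply discipline matters: a rank must not forward data it received in the same step, which is exactly the kind of ordering bug that makes distributed collectives hard to get right.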
It will take years and billions to design and manufacture a competitor, and NVDA has a big head start. Even if you can reverse engineer the H100, you still need to manufacture it, get good yields, etc. By the time you make some, the H200 is coming out.
The summary is that hardware development is much harder than software development. It's not easy to reverse engineer a chip and reproduce it instantly; there's a lot more involved, and fabrication is another factor entirely. Nvidia also has the full stack developed for its GPUs to function. Nvidia has gone through many cycles and perfected everything while others are still playing catch-up. It's going to take some time before they do.
But how does the full stack matter if it's all just frameworks to the end user, and many of said frameworks allow using any applicable accelerator, not being limited to NVIDIA per se? If anyone's actually using a third-party AI as a service, then they wouldn't even know whether the underlying hardware is made by NVIDIA, AMD, AWS, Google, Meta, or someone else.
@wawY45 Even though the frameworks are generic enough, they perform best on Nvidia hardware because of the tight integration they achieve there. This gives the illusion that anyone can compete on the same playground as long as we use the same framework, but the reality is far from it.
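A toy sketch of the dispatch layer this exchange is about: the user calls one generic op, and a registered backend supplies the implementation. All names here (`Backend`, `REGISTRY`, `matmul`) are made up for illustration; real frameworks like PyTorch and JAX do this with far more machinery, and the vendor backend slot is exactly where the performance gap between "any accelerator" and a tightly integrated one lives.

```python
# Illustrative backend dispatch: same user-facing op, swappable implementation.

class Backend:
    """Reference backend: correct but naive."""
    name = "cpu"
    def matmul(self, a, b):
        # Textbook triple loop via comprehensions; zip(*b) yields columns of b.
        return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
                for row in a]

class TunedBackend(Backend):
    """Stand-in for a vendor library (cuBLAS, rocBLAS, ...)."""
    name = "vendor_tuned"
    def matmul(self, a, b):
        # Same math, but in a real framework this call would hit hand-tuned
        # kernels -- identical results, wildly different performance.
        return super().matmul(a, b)

REGISTRY = {"cpu": Backend(), "vendor_tuned": TunedBackend()}

def matmul(a, b, device="cpu"):
    """Generic op the user sees; the device string picks the backend."""
    return REGISTRY[device].matmul(a, b)
```

The user-level code is identical for every `device` value, which is why the framework looks hardware-neutral; whether the selected backend is a naive loop or a decade of kernel tuning is invisible at this layer.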
Oh, they only have something like a 30-year head start on GPUs.
Everyone is working on their own chip. Meta has one going. Amazon has one going. It takes time, though, and doesn't happen overnight. I'm sure MSFT has one in the works. Google already developed TPUs. Everyone realizes GPUs are expensive and is working toward reducing or eliminating their reliance on Nvidia.
This is like everyone building an iPhone when the iPhone launched: the Amazon Fire Phone, the Microsoft phone, this and that. Look at history and apply it to the present scenario.
It's not like that at all. Those were consumer products; consumers had to buy them. Data center chips are not consumer products. Every company knows its specific use cases and will design ASICs (application-specific integrated circuits). They won't get them right on the first try, but over several iterations they'll get better. Everyone is building their own ARM-based CPUs already, so chip design experience and expertise exist within all of the big companies.
Nvidia has been working on GPUs for 30 years. They've mastered the full stack (HW, SW, and fabrication) for specific workloads like graphics, gaming, and AI. It's not easy for competitors to catch up the way it would be in software. As for the recent rise, it's because every company wants to develop LLMs, which currently use GPUs for training. But all the big techs have now started working on their own accelerators for AI workloads. It will take time for others to have good products in this space. Curious to see what Nvidia does to sustain its lead then.
They will push hard into another high-compute area... It's a supply thing, not a paradigm change. Video games, crypto, AI... same as ARM killing x86.
The short answer is the software in a hardware company has a much higher bar than the software in software companies.
Long answer pls
cuda good