It’s insanely costly, insanely hard, and an insanely hard market for a new competitor to enter. Even something as “simple” as dev support is a gigantic hurdle.
No Stupid Questions
No such thing. Ask away!
!nostupidquestions is a community dedicated to being helpful and answering each other's questions on various topics.
Matt Stoller had a nice writeup recently in his monopoly newsletter BIG about how we got into the current mess. TL;DR: basically financialization (prioritizing stock price over innovation, like at Boeing) and a lack of antitrust enforcement as a previously competitive market got monopolized (see chart below)
Interesting to note that most of those are not chip makers but fabless semiconductor companies that outsource all of their production to foundries like GlobalFoundries and TSMC.
Capitalism stifles innovation and suppresses competition.
Duopolies are very prevalent in tech: think AMD/Intel, AMD/Nvidia, Windows/macOS, iOS/Android, etc.
As to why? idk. Big companies buy up small ones until only one rival is left, so they don't get sued for being a monopoly? Maybe, but I don't think that applies to all those cases.
In tech it’s often a bad thing to have 37 of something. How many phone operating systems can app developers reasonably serve? Does it benefit consumers to have 19 different graphics chip standards?
It's not even really "two companies". Nvidia has 92% of the entire market. And the reason for that is mostly CUDA and its ecosystem which has become widespread among developers.
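One way to see why the CUDA ecosystem matters so much: a lot of GPU-compute software probes for CUDA first and treats everything else as a fallback. A minimal sketch of that pattern (the function and backend names here are hypothetical, not any real library's API):

```python
def pick_backend(available):
    """Pick a compute backend in the priority order a lot of GPU software
    effectively uses: CUDA first, vendor-neutral options next, CPU last.
    'available' is a set of backend names detected on the machine."""
    for backend in ("cuda", "rocm", "vulkan", "cpu"):
        if backend in available:
            return backend
    return "cpu"  # nothing detected: fall back to CPU

print(pick_backend({"cuda", "rocm"}))  # cuda — Nvidia wins any tie
print(pick_backend({"rocm"}))          # rocm
```

As long as developers write and test against CUDA first, owning the CUDA ecosystem means owning the default code path.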
I think 90+% market share is technically considered a monopoly in many places.
But the existence of AMD still makes a huge difference IMO, you do have an alternative option, and Nvidia doesn't control the market completely.
Also personally I use AMD because I'm on Linux, and I don't want the proprietary Nvidia driver to fuck up my system.
So AFAIK on Linux, the majority actually run AMD.
All-AMD Linux desktop build here plus all-AMD Linux laptop.
Shit just works.
Sh 🤫 IT just works
It's even better than that:
They all come from Taiwan Semiconductor (TSMC).
There used to be more of a split between many fabs. Then it was TSMC/Global Foundries/Samsung Foundry. Then it was TSMC/Samsung Foundry. Now AFAIK all GPUs are TSMC, with Nvidia's RTX 3000 series (excluding the A100) being the last Samsung chip. Even Intel fabs Arc there, as far as I know.
Hopefully Intel won't kill Arc, as they are planning to move it back to their fabs.
The short concise answer is mostly cost. Nvidia, AMD, and Intel are all spending multiple billions of dollars per year in R&D alone. It's just not a space where someone can invent something in their garage and disrupt the whole industry (like, even if someone were to come out of left field with a revolutionary chip design, they'd need to convince investors that they'd be a better bet than literal trillion dollar companies).
The question isn't just about upstarts, it's asking how we got here. We can't start Ovidia in a garage, but Nvidia did at one point. So where'd everyone else go? What partnerships and preferences put Nvidia on top?
In general, tech is an industry with high fixed costs and low costs per unit sold. That kind of pricing structure tends to limit competition.
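The high-fixed-cost point can be made concrete with a toy calculation (all numbers below are made up purely for illustration):

```python
def unit_cost(fixed_cost, marginal_cost, units_sold):
    """Average cost per unit: fixed (R&D) cost amortized over sales volume,
    plus the per-unit manufacturing cost."""
    return fixed_cost / units_sold + marginal_cost

# Hypothetical numbers: $2B of R&D per chip generation, $200 to make each chip.
incumbent = unit_cost(2_000_000_000, 200, 10_000_000)  # incumbent sells 10M units
entrant = unit_cost(2_000_000_000, 200, 500_000)       # entrant sells 0.5M units

print(round(incumbent))  # 400
print(round(entrant))    # 4200
```

With the same R&D bill but a tenth of a percent of the volume, the entrant's per-unit cost is an order of magnitude worse, which is exactly why high fixed costs entrench incumbents.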
Nvidia was founded at a time when outsourcing chip fabrication was common and viable, so all Nvidia had to do was focus on design. After a series of failures and near bankruptcy, Nvidia was finally able to coin the idea of a "GPU" and sell it to the marketplace.
After that Nvidia bought several companies to round out its patent portfolio and capabilities, remaining a dominant company in an industry it created. The only real competition was with other companies that had previous chip design experience.
Patents, plus the fact that these chips are massively complex designs. We are talking architecture on the complexity level of the Empire State Building, most of which is a blend of proprietary designs developed over decades.
Nobody is saying you can't do it in your garage; in fact it's easier than ever to start. Let me know how it goes. Look into some of the recent tapeout challenges to get an idea of what you are proposing people just make in a garage.
I was content to let the other comments address the history, since I'm not particularly well versed there (and there's already enough confidently incorrect bullshit in the world). I mostly wanted to interject on why there aren't more chip companies. Hand-waving it away as "market consolidation" is true, but it ignores that the barrier to entry in this space is less on the scale of opening a sandwich restaurant or boutique clothing store and more on the order of waking up tomorrow and deciding to compete with your local power/water utility.
The answer also gets kind of fuzzy outside the conventional computer space, where single-board/System-on-a-Chip designs are common (stuff like Raspberry Pis or smartphones), since those technically have graphics modules designed by companies like Qualcomm (Snapdragon) or MediaTek. It's also worth noting that computers have gotten orders of magnitude more complicated compared to the era of starting a tech company in your garage.
If it helps answer your question, according to Wikipedia, most of the other GPU companies have either been acquired, gone bankrupt, or aren't competing in the Desktop PC market segment.
If you are really curious, read Chip War by Chris Miller.
Market consolidation. It's easier to compete when you have 1 or 2 competitors rather than 5.
Wait till I tell you about the handful of companies that own EVERYTHING else you buy.
You've watched Google, Facebook, and Apple do it for the last 20 years. If a good idea is spotted early enough, they buy the whole company before it can make it to market and grow into a threat. It happens in any emerging tech, and you're watching it happen now in the LLM space. Companies burn cash waiting for their competitors to make a mistake or run out of money. Then they buy out the struggling company, absorbing any tech it might have, maybe some branding, but more importantly its customers. Now they can jack up prices once market forces are eliminated.
If not for the threat of anti-trust laws, you would see single company rule in every single sector. That is the end goal of a company- a monopoly that crushes potential competition and squeezes consumers.
Railroads, telephone, petroleum, internet, airlines, all ended up as regional monopolies.
Intel never really tried to be a real competitor until a few years ago. 3dfx had market dominance in the 90s but then basically committed suicide. There were a few other smaller manufacturers in the late 90s and early 2000s, but they never had significant market share and couldn't keep up with the investment required to be competitive.
3dfx had market dominance in 90s but then basically committed suicide.
As I remember it, it was Nvidia that killed 3dfx. Nvidia had an absolutely cutthroat development pace, 3dfx simply couldn't keep up, and they ended up being bought by Nvidia.
But oh boy Voodoo graphics were cool when they came out! An absolute revolution to PC gaming.
We cannot forget that 3dfx went under when they bought STB to manufacture their own video cards instead of letting their board partners do it.
3dfx had market dominance in 90s but then basically committed suicide.
Very true. They committed suicide when they bought STB so that they could manufacture their own video cards. Instead of just focusing on chip R&D, they took on manufacturing and marketing their own cards rather than letting board partners do it.
Huawei has just started selling to the public a GPU they made a few years back. It has a lot of VRAM, but it's old, slow RAM, and it doesn't have the software infrastructure (drivers etc.) that Nvidia has. So currently it isn't a great option, but if you look at phones or electric cars, there is every chance they become competitive in a relatively short time period. Time will tell.
I'm happy we still have two.
Short answer: hard to start a new company that can compete. Over the decades all the other companies have done poorly and gone bankrupt or been bought out.
Still very distant in the future, but I believe in the power of ARM. Dedicated GPUs are beasts now, but I am rooting for ARM to win this race.
Never root for a monopoly over at least some semblance of competition
This is my only gripe with ARM, but I think industry standards are changing and everyone is taking RISC seriously, so if the giants are noticing it, the monopoly can easily be avoided.
One could say it's an arms race perhaps
You are not mistaken.
In the early days of the PC, there were lots of GPU options, as in literally dozens. So the first part of the question is: why did almost all of them disappear? The answer is that Windows made it a much more complicated market, with way higher demands on the software side, and many hardware vendors suck at making software. Over time, the best combo of hardware and driver beat out the other high-end manufacturers, so we ended up with just 2, and the on-board/on-chip GPU made every low-end 3rd-party GPU next to irrelevant, with very little possibility of making a profit.
The low-end chips were no longer needed, as they can now be had cheaper and more efficiently as part of the CPU for both AMD and Intel. And since these are the only 2 CPU options for x86, now that VIA has discontinued the x86 line it acquired from Cyrix, there is no low-end entry point in the PC market for a new GPU maker.
The natural evolution is to start at the lower end and, if successful, work up. This is not possible in the PC market, which makes entry near impossible, except with enormous investments that may never pay off, especially since the PC is a dwindling market.
As you mention, Intel is dipping their toes in. But despite a pretty big and serious effort, investing a lot to develop a better GPU, and actually delivering a good product at a reasonable price that should be absolutely competitive on paper, their market share is minuscule, because Nvidia and AMD together dominate, already fill the needs of the mid-to-high-end market, and have brand recognition for graphics.
It's not that there aren't technologies that possibly could compete if scaled for PC, because those are actually pretty numerous on phones and tablets. But you can't port these cheaply to PC, because there is no market segment for them to slide in to easily.
It would require major investments to make them actually performance-competitive at higher scales, plus investments in making good drivers. Intel had a big head start in these aspects, already making on-chip graphics with existing drivers. And still they are struggling, despite delivering a good product and despite people screaming for a third option because of high GPU prices.
This may not be the entire explanation, but I think it's a very significant part of it.
The better question IMO is why Intel never became more popular, considering how much people have raved that more competition in the GPU market is required.
And the explanation for that is:
but let’s be real, AMD and nVidia are the only options
Except Intel actually presented a good alternative, but was never seriously considered by most people for whatever reason.
Personally I didn't consider Intel, because I remember earlier attempts where Intel quickly left the market again. And I didn't want a GPU where I'm left without driver support a year after I purchased it.
So in my case it was lack of trust in Intel to stay the course. But every other maker would have that exact same issue.
There have been a few attempts in the past from other makers, but they all had performance issues, driver issues, or both, and left the market quickly. Intel delivered a stellar product by comparison. And if Intel drops out of GPUs again, I think there's a pretty big risk it may have been our last chance for a third significant mid-to-high-end GPU maker on PC.
TLDR:
- All the old competitors couldn't cut it on either the hardware or software side, and so they died out.
- It's an insanely expensive market to enter and to stay in, with high risk of never making a profit.
The other thing you didn't talk about was the size of the market in general.
As on-board graphics became popular, the biggest reason to buy a discrete GPU was games or video processing. Which, while a significant market, isn't huge.
Over the past couple of decades, GPUs have made headway as the way to do machine learning/AI. Nvidia spent a lot of time and money making this process easier on their GPUs, which led to them owning not only the graphics market but the much bigger ML/AI market. And I say the AI/ML market is bigger simply because GPUs are being installed in huge quantities in data centers.
Edit: My point being that the market shrunk before GPUs became so critical. To counteract Nvidia's stranglehold, a lot of big tech companies are creating custom TPUs (tensor processing units), which are just ML/AI-specific chips.
Oh, and there are other graphics makers that could theoretically work on Linux, like Imagination's PowerVR, and some Chinese startups. Qualcomm is already trying to push into laptops with Adreno (which has roots in AMD/ATI; hence 'Adreno' is an anagram of 'Radeon').
The problem is that making a desktop-sized GPU has a massive capital cost (over $1,000,000,000, maybe even tens of billions these days) just to 'tape out' a single chip, much less a product line, and AMD/Nvidia are just so far ahead in terms of architecture. It's basically uneconomical to catch up without a massive geopolitical motivation like there is in China.
I've noticed how poorly GPUs are classified, and how it seems every interesting piece of AI software just has a list of GPUs it can work on. So I can see customers just locking into one brand so they have less to memorize.
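That "list of supported GPUs" pattern is easy to sketch. The support table below is entirely made up (these model names and backend labels are illustrative, not any real tool's compatibility matrix), but it shows how publishing a flat list of tested cards, instead of targeting a standard API, pushes buyers toward whichever brand dominates the list:

```python
# Hypothetical support matrix, mirroring how AI tools often publish a flat
# list of tested GPUs rather than supporting a vendor-neutral standard.
SUPPORTED_GPUS = {
    "RTX 4090": "cuda",
    "RTX 3080": "cuda",
    "RX 7900 XTX": "rocm",
}

def check_support(gpu_name):
    """Return the backend a tool would use for this card, or raise if the
    card simply isn't on the tested list (regardless of its actual specs)."""
    backend = SUPPORTED_GPUS.get(gpu_name)
    if backend is None:
        raise RuntimeError(f"{gpu_name}: not on the tested-GPU list")
    return backend

print(check_support("RTX 4090"))  # cuda
```

When support is enumerated card by card like this, an otherwise capable GPU that isn't on the list just fails, which is exactly the lock-in effect described above.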