Every GPU That Mattered (sheets.works)

by jonbaer 208 comments 330 points

[−] mrweasel 38d ago
It's probably just me being out of touch, but I don't think the GeForce RTX 4000 or 5000 series really mattered/matters that much.

At the same time I'd add the S3 ViRGE and the Matrox G200. Both mattered a lot at the time, but not long term.

[−] mizzack 38d ago
Or the S3 Savage3D, which, while being inferior to the TNT2, pioneered texture compression.

https://en.wikipedia.org/wiki/S3_Texture_Compression

[−] jdewerd 38d ago
Loads of games from the era roundtripped their textures through lossy S3/DXT compression and then stored them as uncompressed RGB or RGBA.

I know this because I wrote an Unreal Engine texture repacking tool with a "DXT detection" feature so that I wouldn't be responsible for losing DXT compression on a texture which had already paid the price, only to find that this situation was already hyperabundant in the ecosystem.

Many Unreal Engine games of the day could have their size robotically halved just by re-enabling DXT compression in any case where this would cause zero pixel difference. This was at a time before Steam, when game downloads routinely took a day, so I was very excited about this discovery. Unfortunately, the first few developers I emailed all reacted with hostility to an unsolicited tip from what I'm sure they saw as a hacker, so I lost interest in pushing and it went nowhere. Ah well.
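(A rough sketch of how a "DXT detection" heuristic like the one described could work; this is my guess at an approach, not the commenter's actual tool. DXT1 encodes each 4x4 texel block as two endpoint colors plus two interpolated colors, so a texture that has been through a DXT1 round trip can never have more than four distinct colors in any aligned 4x4 block:)

```python
def looks_dxt1_roundtripped(pixels, width, height, block=4):
    """Heuristic DXT1-roundtrip detector.

    pixels: flat row-major list of (r, g, b) tuples.
    Returns True if every aligned 4x4 block contains at most 4 distinct
    colors -- the fingerprint left behind when a texture was DXT1
    compressed and then stored back as uncompressed RGB, since DXT1
    represents each block with 2 endpoints plus 2 interpolants.
    """
    if width % block or height % block:
        return False  # real DXT data is always block-aligned
    for by in range(0, height, block):
        for bx in range(0, width, block):
            colors = {pixels[(by + y) * width + (bx + x)]
                      for y in range(block) for x in range(block)}
            if len(colors) > 4:
                return False
    return True
```

A repacker could then re-enable DXT1 on any texture this flags, expecting zero pixel difference. Note the heuristic can false-positive on genuinely flat art (solid fills also have few colors per block), which is harmless here since recompressing such textures is lossless anyway.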

[−] ryandrake 38d ago
The article blew a huge opportunity to showcase the great diversity of “Pioneering Era” 3D accelerators (they weren’t called GPUs until later). But instead they just pretended it was always NVIDIA vs ATI, and threw in a few Voodoos.
[−] flohofwoe 38d ago
It was only 3dfx and NVIDIA (since the TNT) that mattered in the 1990s though. All the other 3D accelerators were only barely better than software rasterization, if at all.

Seeing Quake II run butter smooth on a Riva TNT at 1024x768 for the first time was like witnessing the second coming of Christ ;)

[−] djmips 38d ago
Rendition's VQuake was actually pretty good, more than barely better than software rasterization.
[−] ChrisClark 38d ago
Edge anti-aliased polygons!
[−] ChrisClark 38d ago
Before that, you could even run Quake with anti-aliasing on one of those "barely better than software rasterization" cards, which couldn't even be done on the first Voodoo cards.
[−] djmips 38d ago
And they say that Nvidia coined the phrase GPU - but I recall that Sony did it earlier... not that it really matters.
[−] aruametello 38d ago
+1 to that. When I first saw Unreal Tournament with the add-on compressed texture pack, it was a real WOW moment.
[−] holoduke 38d ago
Yeah it also lacked driver support. But it was for a very brief moment the king of the hill.
[−] formerly_proven 38d ago
The G200 mattered to some degree for a long time, because most x86 servers up until a few years ago would ship a G200 implementation or at least something pretending to be a G200 card as part of their BMC for network KVM.
[−] mrweasel 38d ago
Like virtualized NICs pretending to be an NE2000? That's interesting, do you know why they'd use a G200 and not something like an older ATI chip?
[−] formerly_proven 38d ago
Probably started out as a real G200 chip, which might've been the cheapest and easiest to integrate in the 2000s? Or it had the needed I/O features to support KVM (since this would've involved reading the framebuffer from the BMC side), or Matrox was amenable to adding that.
[−] hypercube33 38d ago
The ATi Rage 128 was used in everything short of toasters for a long time too. I assume that the drivers are part of what made it obsolete.
[−] jandrese 38d ago
I remember having a ton of servers with cut down Mach64 chips. They were so bad that you would get horizontal lines flickering across the screen while text was scrolling in an 80x25 text console. I don't know why server manufacturers go to so much effort to make the console as terrible as possible. Are they nostalgic for the 8 bit ISA graphics from the original 5150? They seem offended at the idea that someone might hook a crash cart directly up to their precious hardware.
[−] jandrese 38d ago
They were probably forced to update when they dropped older buses. Without a PCI or AGP bus on there they have to find something that can hang off of a PCIe lane.
[−] bluedino 38d ago
Drivers, probably.
[−] jeffbee 38d ago
Even current Dell servers less than a year old ship with G200 graphics. If it works, why change it? A 1998 ASIC can be put in the corner of a modern chipset for pennies or less.
[−] xattt 38d ago
My contributions: Matrox Parhelia for the first card supporting triple-monitors, and ATI All-in-Wonder which did TV out when media centre TVs weren’t really a thing.
[−] MBCook 38d ago
The big feature of the All-in-Wonder was TV in. You could record, in glorious analog detail that could quickly use up your entire hard drive.
[−] doubled112 38d ago
I can remember using an AiW card to play PS2 on my computer screen when my TV died. The latency wasn’t great but we still had fun.
[−] gen2brain 38d ago
I remember there was a kernel module for the Matrox/MPlayer combination. You got a new device that MPlayer could use: -vo mga for the console and -vo xmga for X11. You couldn't tell the difference, and both produced high-quality hardware YUV output.
[−] tbyehl 38d ago
For a moment, a Matrox G400 DualHead was THE card to have for a multi-monitor setup.
[−] rangerelf 38d ago
This was a very sweet video card.
[−] whizzter 38d ago
Recency bias, probably. IIRC the 3000 and 4000 series did make significant improvements in RTX performance, so compared to the 2000 series they're far more useful today.
[−] aruametello 38d ago
The 4000 series certainly did; "shader execution reordering" gave a meaningful uplift to tasks that underutilized warp units due to scattered useful pixels.

It seems to have helped path tracing by a lot.

[−] LoganDark 38d ago
I think their point is RTX is not useful.
[−] flohofwoe 38d ago

> S3 ViRGE and the Matrox G200

Both were only really famous for how terrible they were though. I think the S3 ViRGE might even qualify as a 3D decelerator ;)

[−] pak9rabid 38d ago
The only thing the ViRGE was good for was passing through to a Voodoo2
[−] MBCook 38d ago
But it WAS ultra popular with OEMs. If you had embedded video there was a huge chance that was it.
[−] jandrese 38d ago
Matrox was really halfhearted with game support. They seemed far more interested in corporate customers, advertising heavily stuff like "VR" conference calls that nobody wants. They were early with multi-monitor support back when monitors were big, heavy, and expensive. I had a G200 that was the last video card I've ever seen where you could expand the VRAM by slotting in a SODIMM. It also had composite out so you could hook it to a TV. I played a lot of games on it up until Return to Castle Wolfenstein, which was almost playable but the low res textures looked real bad and the framerate would precipitously drop at critical times like when a bunch of Nazis rushed into the room and started shooting.

Last time I saw a Matrox chip it was on a server, and somehow they had cut it down even more than the one I had used over a decade earlier. As I recall it couldn't handle a framebuffer larger than 800x600, which was sometimes a problem when people wanted to install and configure Windows Server.

[−] rasz 38d ago

> S3 ViRGE

decelerator?

> Matrox G200

Because it never got an OpenGL driver? Because it was 2x slower than even the Savage3D? The Nvidia TNT released a month later, offering 2x the speed at a lower price:

https://www.tomshardware.com/reviews/3d-chips,83-7.html

Truly a graphics card that mattered! :)

[−] cubefox 38d ago
This is an ad from viral marketing company and everyone here is falling for it.
[−] john_strinlai 38d ago

> This is an ad from viral marketing company

They aren't a marketing company:

"Dashboards, CRMs, automations. We're a small consulting team that turns your messy spreadsheets into systems that run your business."

[−] cubefox 38d ago
[−] john_strinlai 38d ago
Huh, my mistake, you are correct. They apparently do offer marketing visualizations; they just don't mention it anywhere on their home page.
[−] izzydata 38d ago
What are they advertising? Nvidia graphics cards?
[−] ActorNightly 38d ago
From memory, the cards that stood out were:

Nvidia 6xxx series, the first to support SLI. I remember my gaming PC in college with a 6xxx series card, and being able to get another card and use an SLI bridge that increased performance in some games.

Nvidia GeForce 900 series, which had the Titan with 12GB, the first card IIRC able to support larger-resolution gaming.

Nvidia RTX series, which started with the 20xx I think, the first to come with 24GB of RAM.

And then the modern 4xxx series, which used to fry power cables.

[−] fooker 37d ago

> RTX 4000 and 5000

These GPUs have made DLSS and frame generation usable technologies, getting you reasonable 4k gaming on a budget.

It’s not perfect yet, but almost all new games support it and despite the widespread complaints, very few people actually disable these features.

[−] PunchyHamster 38d ago
G200 Matrox GPUs came integrated with servers for absolute ages, like past the 2010s.
[−] dantillberg 38d ago
I don't believe this list was curated as the title suggests. It's just a semi-random list of popular-ish GPUs with LLM-generated descriptions.

The site looks nice, which fools us into thinking thought and effort were put into this.

[−] __alexs 38d ago
A lot of GPUs in this list are basically just previous GPU but faster or more RAM. I kind of thought it was going to focus on interesting new architecture innovations.
[−] vman81 38d ago
Honorable mention, the Rendition Vérité 1000 https://fabiensanglard.net/vquake/index.html

Released before the Voodoo 1 with glquake and gl support for Tomb Raider.

[−] paavohtl 38d ago
I think pairing the RX 5700 XT with Control as the "defining game" is an interesting choice, considering that 1) AMD cards were incapable of RT at the time, and 2) Control was basically the first game with a good, comprehensive RT implementation that had a massive positive impact on the graphics.
[−] arjie 38d ago
Absolute nostalgia fever. About a month ago, I dug up an old desktop in the corner, took the drives out and gave away the machine. It felt like putting a racehorse to pasture: i7-4790k, 1080 Ti. It was my dream machine when I got it. Dual-boot (as we did back in the old days when Proton wasn't here) to Ubuntu, then Elementary, then Arch. By the time I gave it away it wasn't worth the power cost.

And that brought to mind my older dream machine, an 8800 GT from generations past. Before that we made do with a Via Unichrome, which worked well enough on the OpenChrome driver that I could edit open software (Freespace only needed a few constants changed) so it would render. Some of the image was smeared and so on, but I could play!

[−] Shalomboy 38d ago
This is a wonderful-looking infographic, but I truly don't think there are 49 GPUs that mattered in the PC gaming hardware space - let alone all of computer graphics. Call it recency bias, but after the Pascal cards it feels like maybe one or two more entrants actually mattered?
[−] bob1029 38d ago
The 8800 GT is easily the most impactful GPU in my mind. The combination of that video card with Valve's Orange Box was an insane value proposition at the time.

I'd put the 5700xt at #2 for being the longest lived GPU I've owned by a very wide margin. It's still in use today.

[−] andai 38d ago
There's no horizontal scroll bar, apparently I need to click and drag the GPU section leftwards with the mouse. (Am I old now?)
[−] pjmlp 38d ago
It covers GPUs that mattered for PC evolution, but it misses many others, e.g. the TMS34010.

https://en.wikipedia.org/wiki/TMS34010

[−] snarfy 38d ago
Matrox needs a mention somewhere. GPUs do raster too, and theirs were optimized for an entirely different market.
[−] xcodevn 38d ago
I have a strong feeling that this website was designed by Claude Code using the /frontend-design skill.
[−] tetris11 38d ago
I really want to see TDP over time.

If I can at least tell myself that our technological achievements come with efficiency gains instead of just aping power throughput, I can rest a little better.

[−] Tepix 38d ago
Missing the Radeon RX Vega 64!
[−] CamouflagedKiwi 38d ago
I don't think much of the "defining game" thing. Many of them feel like they're just thrown in as a big game at the time. Diablo 2 is an amazing game and was very popular, but it wasn't fully 3D, and the resolution was so limited that I don't think there was usually a need to buy a new video card to play it (in fact I think it might have been just fine in software most of the time).
[−] Lammy 38d ago

> Apple chose the Rage 128 [Pro] for the original iMac G3, making it the most popular Mac GPU of its era.

This is misleadingly worded, because the original iMac had a 3D RAGE ⅡC, the five-color models had the 3D RAGE Pro, and the slot-loading models had the earlier RAGE 128 VR.

Yes those are all confusingly named by ATi :p

But based on the timeline and features mentioned, they're specifically talking about this one and not any of the earlier chips in the RAGE family: https://en.wikipedia.org/wiki/ATI_Rage#Rage_128_Pro_/_Rage_F...

iMacs didn't ship with RAGE 128 Pro until the year-2000 Indigo & DV models, by which time the RAGE 128 Pro was already 11 months old: https://everymac.com/systems/apple/imac//faq/imac-g3-video-p...

[−] mikepurvis 38d ago
Well, my 9070 XT made the list; I've been quite happy with it, great performance without paying the Nvidia tax.

RIP my Radeon 7500 from high school though; that was always a budget card, and we all had them but wanted the 9700. Couldn't beat the box art from that era though: https://www.ebay.com/itm/206159283550

[−] cestith 38d ago
The RX 580 is on there, but not the R9 290. I'm not sure where the RX 500 series actually pushed technology forward; they always seemed like the AMD budget line. And if the 580 is important, why not the 590 or the 570?

Few of the “pre-GPU” graphics accelerators that seem to have mattered are here. The ViRGE. The Mach32 and Mach64. The Trident cards, like the TGUI9440. Yet the Voodoo often isn’t considered a GPU and is on the list.

[−] kawsper 38d ago
We had the Riva TNT2 in our family computer, so that was fun to see that again, I think it was paired with an AMD K6-2 chip.

One day one of my friends from school wanted to optimize airflow in our computer, and re-did the cabling, but he managed to block the CPU-fan from spinning. I am not sure how, but we didn't realise it for a couple of months.

When I got my own PC, it had an AMD Barton chip, and it allowed me to play Half-Life 2.

[−] blackhaz 38d ago
I don't understand this - where's Trident VGA?
[−] alentred 38d ago
Awwww..., this brings so many memories. I had almost all of the early ones: Voodoo 2, Riva TNT2, then GeForce 3 (I think...). Then I switched to laptops and didn't have a discrete graphics till last year when I started playing with LLMs locally. So basically I jumped from GeForce 3 to RTX 3090 :) Thank you for bringing those memories back!
[−] slabtickler 38d ago
Fails to mention any TBDR-based GPU at all. Do PowerVR and Qualcomm not exist? Or hell, Apple?
[−] Neil44 38d ago
I had the Voodoo 1 with VGA passthrough from the 2D card. When you loaded a game you'd hear a little clunk from a relay on the Voodoo taking over the VGA signal, and you knew you were about to have a good time. Doesn't seem that long ago!
[−] paddy_m 38d ago
I'd be really interested to see SGI on this chart. When did consumer hardware exceed what you could do on an SGI box?

I think Sun and HP had some 3D capabilities, but they were mostly aimed at engineering/CAD.

[−] 0x70dd 38d ago
This brings back so many memories. I remember how badly I wanted a GeForce 6800. Sadly, I was never able to justify spending that much money on a GPU. Still holds true, even today.
[−] BoredPositron 38d ago
Missed the Voodoo 5 5000, which laid the groundwork for NVLink.
[−] dist-epoch 38d ago
I think it's a terrible UI. It requires three different things to see the GPUs: scrolling vertically down to the Era buttons (which then scroll up out of view even if you have enough vertical screen space), clicking an Era button, and clicking the < > buttons to see the GPUs of that Era.

I can't remember last time I've seen such a confused design.

[−] latentframe 38d ago
Compute stopped behaving like a consumer good and started behaving like infrastructure. Prices went from competitive cycles to ever higher while performance kept compounding. That's usually what happens when something becomes a bottleneck for entire industries and not just for end users; the gap between what people use and what's at the frontier says it all.
[−] RantyDave 38d ago
The Nvidia NV1 mattered even if it was a misstep.

I'd say Voodoo 3 mattered because it killed 3dfx.

And the Matrox Parhelia mattered for much the same reason.

[−] finaard 38d ago
I have fond memories of borrowing a Voodoo 2 from a friend when I was moving from a 486 to a K6-based system component by component. At that time I was still using my old ISA VGA card, which meant 2D performance was horrible, and I couldn't really watch videos on that thing - but thanks to the Voodoo I could play Unreal Tournament without problems.
[−] Lwrless 38d ago
I don't see my first GPU on there, it was the humble GeForce4 MX440. It could run almost any game I cared about for a surprisingly long time, even if it's not a true modern card. These days almost all my machines are on iGPUs baked into the CPU. There's way less fun for me, but they are a lot more compact at least.
[−] cubefox 38d ago

> We build visual stories like this for companies

Combined with the color scheme of this site, this might be a cleverly disguised Nvidia ad.

Edit: Clicking through to their main page [1]: yeah, that's definitely an Nvidia ad.

1: https://sheets.works/data-viz/hire

[−] Zealotux 38d ago
Ah, I was just trying to remember the model names last week and this website pops up like magic; weird how the internet works sometimes. The 560 Ti was a dream for teenage me and most of my friends back then, but I must say my Radeon HD 4870 powered most of my favourite Team Fortress 2 years.
[−] silversmith 38d ago
Missing the Rage Fury Maxx, finest welding job by the boffins at ATI, severely hampered by software support.
[−] Night_Thastus 38d ago
I wouldn't call a card like the 5080 important. It was incremental compared to the previous generation, a poor value for money, and awkwardly placed: very cut down compared to the 90-class card of that generation, significantly more so than in earlier generations.
[−] bobsmooth 38d ago
I was so sad when I retired my 1060 6GB. That thing served me well for almost a decade.
[−] abhikul0 38d ago
The 9400 GT mattered to me as it was my first GPU. Had bought NFS Carbon only to find that the home PC only had a CD drive, not a DVD drive, lol. So finally with that drive upgrade also came the 9400 GT, and fun ensued.
[−] glitchc 38d ago
Not including the Diamond Monster Fusion, the first 2D/3D card, is a glaring omission.
[−] momocowcow 38d ago
Not a very good list; from a historical perspective it's missing many important cards, as mentioned by others.

Also, the term "GPU" did not exist until 1999.

Looks like this was created for engagement.

[−] deadcore 38d ago
Did anyone else notice the decline of the artwork on the GPUs' coolers? I miss that classic box artwork too!
[−] bnolsen 36d ago
The radeon 8500 deserves to be on that list. It still has active Linux support under mesa.
[−] charcircuit 38d ago
Why didn't datacenter GPUs make the list? AI trained with them is such a significant part of computing today.
[−] yasuocidal 38d ago
Can't seem to load the page, is it down? Can't establish a connection to the server at sheets.works.
[−] schnitzelstoat 38d ago
I remember having the Voodoo card to play Thief: The Dark Project. It felt incredible at the time.
[−] stared 38d ago
I remember Voodoo - precisely because I didn't have it back then, as it was a luxury option.
[−] Computer0 38d ago
Thanks for the website, Claude! By the way, the GTX 1080 and 1080 Ti use the same image.