
The Great NVIDIA Switcheroo | GPU Shrinkflation.

gamersnexus.net


We look at how NVIDIA has downsized essentially all of its gaming GPUs in terms of relative configuration compared to each generation’s flagship

  • This article expands upon our "RTX 4080 problem" by looking at the entirety of the RTX 50 series, including how the RTX 5070 looks an awful lot like a prior 50-class or 60-class GPU.
  • NVIDIA is giving you fewer CUDA cores for a given class of GPU than ever before.
  • GPU prices have crept higher across the board, but NVIDIA's in particular have fallen out of step with what we came to expect from generations of GPU launches.
9 comments
  • I miss the old Nvidia, the one making the 900 and 10 series cards. You got an amazing GPU at a reasonable price, with everything under $800.

    Nowadays, I feel like the market has deteriorated on Nvidia's end. Idgaf about the eight games with some anemic raytracing overlays. I want good raster performance at 4K 120, good physics performance, and a good price.

  • On one hand I agree with GamersNexus Steve that 'line go down = bad,' but on the other hand you could consider it as the flagship becoming more and more of an outlier. (In other words, if the graph were normalized such that the 80 series line were flat, then I think the lower-model lines would also be flat but the 90 series line would be going up.)

    If Nvidia "fixed" it by just not offering the 5090 in its current form at all and instead having the fastest non-pro/compute/AI card be one with a lower price and fewer cores, would that make Steve and gamers happy?

    • This is true. But it also ignores price dynamics.

      One of the first GPUs that I "bought" (I convinced my father to pay for the upgrade) was the GeForce 6600, for roughly $250 (maybe $275 max) in 2004. This is the true price, not an American-style list price: we bought it for that amount (in local currency) at a computer store. I believe US true prices were (much?) lower than $275 at the time, but I could be wrong.

      $275 in 2004 is around $470 in 2025. You are not getting a Nvidia 6600 class card for $470 (all in) from AMD or Nvidia. The closest would be the Intel B580 which goes for around $340 (true price) where I live. But I would argue the B580 is not comparable to what the 6600 was in 2004. And the 6600 was broadly available in 2004 (at relatively competitive prices) even though I did not live in the "western world".

      And keep in mind that I don't remember the exact price of the 6600 that we bought in 2004. My memory says it was around $250, which would be about $420 in current dollars (a solid difference from the $470 mentioned earlier).
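
      The inflation math above can be sketched quickly. This is a minimal illustration, assuming a cumulative US CPI multiplier of roughly 1.70 for 2004→2025 (an approximation consistent with the figures in the comment, not official data):

      ```python
      # Assumed cumulative US CPI multiplier for 2004 -> 2025 (~1.70).
      # This is an approximation; exact values should come from official CPI tables.
      CPI_FACTOR_2004_TO_2025 = 1.70

      def to_2025_dollars(price_2004: float) -> float:
          """Convert a 2004 US-dollar price into approximate 2025 dollars."""
          return price_2004 * CPI_FACTOR_2004_TO_2025

      # ~$275 in 2004 lands near the ~$470 figure cited above;
      # ~$250 lands near the ~$420 figure.
      print(f"${to_2025_dollars(275):.0f}")
      print(f"${to_2025_dollars(250):.0f}")
      ```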

  • I agree with the premise that NVIDIA is ripping people off more and more every generation, but this is such a weird metric to use.

    Knowing GamersNexus's fixation on negativity, they probably started with the goal of finding a metric where the line goes down for every SKU and worked backwards from there.

    • As someone that has bought a graphics card as part of a build in these scenarios:

      • after having a first job
      • after graduating from college
      • about 9 years later as a professional

      The first two felt like a nice upgrade given my larger budget ($200-300 more in total for the entire computer). The last one felt like the worst purchase of my life. If GamersNexus is sometimes negative, it is for good reason; we all feel it.

      I have a feeling they also had to buy their own components and have not forgotten what that is like. Despite NVIDIA and AMD fighting over stats, and streamers always having top-of-the-line builds, most gamers don't (as Steam survey results show over and over).

    • Why do you say that? Note that I did not downvote you; I almost never downvote in this community (I am a mod, after all).

      I don't watch YT videos about mainstream technology (niche topics are different), so I only read GN's articles. It's relatively balanced, with solid analysis considering the target audience.

      What sort of baseline posture do you have in mind? Let's say LTT is shill positivity and GN is a fixation on negativity. What sort of balanced approach do you think would work?

      Keep in mind I am not a GN fanboy and I only read their articles when they are posted here. But that being said I would rather choose "commercial negativity" that represents my interests than "shill positivity".

      And in defence of GN, they've actually done a lot of good for the global PC enthusiast community. GN taking up a story forces OEMs and semiconductor designers to react and they are big enough that they don't necessarily need review samples (so they have some leverage).

      How is this a bad thing?
