Nvidia Reportedly Ends GeForce RTX 5070 Ti Production, RTX 5060 Ti 16 GB Next

techpowerup.com

50 points

ndiddy

a day ago


73 comments

lemoncookiechip a day ago

No worries, gamers.

You can subscribe to our GeForce NOW service to rent a top of the line card through our cloud service for the low low price of 11€$£ or 22€$£ a month with *almost no restrictions.

*Except for all the restrictions.

  • merpkz a day ago

    I just bought a 5070 Ti a week ago and can attest I have used it for maybe 3-4 hours since then. It raises the question of whether I should have rented the compute instead of paying 900 EUR on the spot - that's about 3 years' worth of rent.

    • observationist a day ago

      If the compute is the unit of value under consideration, maybe. But there's more - you have access, freedom from supervision, the capability to modify, upgrade, tweak, adjust anything you want, resell compute under p2p cloud services when idle, etc. And then if the market for these gets hot, you can sell and recoup your initial costs and then some. The freedom and opportunity benefit - as opposed to the dependence and opportunity cost of renting - is where I personally think you come out on top.

    • throwaway2027 a day ago

      Because it'll set a precedent and eventually kill off the ability to own the hardware to run things locally at all.

    • Aurornis a day ago

      The GeForce Now service is actually a decent deal for casual gamers.

      The hardcore and frequent gamers won’t like it but it was never really for them.

      • close04 a day ago

        The problem is that they're always a great deal, the best even, while there are alternatives. The noose tightens only after everyone is onboard.

        And the competition in the GPU market is soft, to say the least.

      • theodric a day ago

        So is a Steam Deck, really

    • dymk 20 hours ago

      If you buy, you still own the 5070 at the end of three years and can sell it. If you rent, you have nothing.

    • wongarsu a day ago

      The correct calculation is not 900€/36 months but (900€-$resell_value)/36 months. If you sell your GPU for 450€ after three years you saved a good bit of money. If the AI bubble doesn't pop, your resale value might even be a good bit higher than that. I had a used 1080 Ti that I used for five years and then sold for nearly the same price, making it effectively free (minus electricity use and opportunity cost).

      • fluoridation a day ago

        Even if you don't resell it, at the end of the three years you still have a GPU that you can keep using, or gift, or whatever. After three years of renting, you have nothing.

    • threetonesun a day ago

      Right now, if you decide to sell the GPU after 3 years, the price might be a wash, much like it was after the crypto boom. Granted, you have to pay for electricity to run it, but you also have full control over what it runs.

    • RulerOf 21 hours ago

      I briefly considered doing the GPU rental thing, but the added latency and video encoding artifacts annoy me endlessly.

    • dymk a day ago

      It cost 900 EUR because Nvidia is shafting you

    • Fire-Dragon-DoL a day ago

      Yes! Then you want to play one of the FromSoftware games and you're doomed.

      Damn nvidia

    • dyauspitr a day ago

      Where the hell is 900 EUR 3 years of rent?

      • diab0lic a day ago

        Haha. I read that the same way the first time I read it. The commenter means 3 years of renting GPU from nvidia via cloud services.

      • close04 a day ago

        I think that's comparing to 3 years of GeForce Now at ~22 EUR/month for the Ultimate plan, for a total of ~800 EUR. For someone using it ~3h/week, you might as well go for the free plan and pay nothing. But while owning only has a financial cost, renting has a hidden cost on top of that. It leads to "atrophy" of the ownership right, and once you lose that option you'll never get it back. That will have incalculable costs.
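
        A minimal back-of-the-envelope sketch in Python, using only the figures quoted in this thread (the 900 EUR purchase price, a purely hypothetical 450 EUR resale value, and the ~22 EUR/month Ultimate tier):

            # Sketch only; all figures are assumptions taken from the comments above.
            purchase_price = 900      # EUR, spot price mentioned upthread
            assumed_resale = 450      # EUR, hypothetical resale value after 3 years
            months = 36
            rent_per_month = 22       # EUR/month, GeForce NOW Ultimate tier as quoted

            own_net_cost = purchase_price - assumed_resale   # 450 EUR
            rent_total = rent_per_month * months             # 792 EUR

            print(f"Own (net of resale): {own_net_cost} EUR over {months} months")
            print(f"Rent (Ultimate):     {rent_total} EUR over {months} months")

        Whether owning comes out ahead depends almost entirely on the assumed resale value and on how heavily the card actually gets used.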

  • PunchyHamster a day ago

    It's 33 eurodollar now. I'm sorry I meant 44

    • lemoncookiechip a day ago

      China will save us, except no, we'll just ban their hardware sales, sucks to suck.

      • fc417fc802 a day ago

        Does China have any cutting edge fabs yet? I thought it was still just TSMC, Samsung, and maybe Intel.

        Maybe consumer electronics will move backwards by a process node or two?

        • lemoncookiechip a day ago

          They've just recently been able to reverse-engineer ASML's EUV machines. They're years and years behind, although with the way hardware prices are skyrocketing (RAM, SSDs, GPUs...), regular consumers won't have much choice in anything for a while anyway.

          • fc417fc802 a day ago

            So at a rough guess would that be expected to put them on the equivalent of TSMC 7 nm a few years from now?

            I wonder if a bunch of consumer electronics will move back to something like 12 nm for a while? Seems like there's a lot of capacity in that range. Zen 2 wasn't so bad, right?

ecshafer a day ago

Crucial shut down, and Nvidia is pulling back on consumer cards. Even with AMD cards, if there's no memory available then we can't get them either.

RAM is 4-5x the price it was a year ago.

Is AI going to kill the consumer computer industry?

  • Aurornis a day ago

    > Nvidia not producing consumer cards.

    This is a false statement. They're still producing consumer cards. You can go buy a 5070 FE in stock on their web store at MSRP right now. You can buy a discounted 5060 from Best Buy below MSRP.

    They’re changing production priorities for a little while if the rumors are accurate.

    RAM prices have always been cyclical and prone to highs and lows. This is an especially high peak but it will pass like everything else.

    These predictions that the sky is falling are way too dramatic.

  • baal80spam a day ago

    > Is AI going to kill the consumer computer industry?

    Even if it does, the death of AAA gaming is nothing I will cry about. Most games don't require anything remotely as performant as a 5070.

    • ecshafer a day ago

      I don't play any AAA games really. The only "AAA" games I've played in the past few years are Baldur's Gate 3 and Kingdom Come: Deliverance II. Mostly I play RPGs and strategy games that don't require much GPU power at all.

    • t-writescode 21 hours ago

      VR games inhale RAM. This refusal to put any meaningful amount of VRAM on graphics cards is a serious and consistent problem for larger VR games.

    • piva00 a day ago

      There are niches like sim racing which require a high-powered GPU if you want to run ultrawide or triple screens, though.

      Just saying that your grudges with AAA games have a blast effect you might not be aware of.

    • emsign a day ago

      I don't care about AAA gaming either, it's stale. But one day the AI bubble will kill something you will cry about.

  • fc417fc802 a day ago

    That's one possibility. Another is that it's temporary until production can be ramped up (but I doubt it because fabs). Pessimistic take is that the suppliers expect the bubble to pop soon (and very violently) and want to maximize their take while they still can.

    Or, assuming the trend holds in the longer term, maybe consumers will move downstream of datacenters: anyone who wants a GPU will be rocking 3- to 5-year-old recycled enterprise gear.

throwaway2027 a day ago

Maybe game companies will be forced to optimize their games and focus on innovative gameplay elements rather than graphics.

  • wongarsu a day ago

    AAA game companies won't care, they'll just continue targeting the latest console. For most of them releasing on PC is a nice bonus, not a core part of their strategy

voidfunc a day ago

Death of PC gaming incoming.

Happy I just bought my 5080 before Christmas. They're all on borrowed time.

  • ndiddy a day ago

    Well, if you look at the SKUs they're discontinuing, they're taking out all the lowest-end models with more VRAM to save the allocation for the higher-end models with juicier margins. For example, the 5070 Ti costs $500 less than the 5080 but both have 16 GB of VRAM. I imagine that for the near future they'll have the 5060 8GB and 5070 12GB, and 16 GB will be limited to the 5080 for consumers willing to spend $1300 on a GPU.

  • legobmw99 a day ago

    I am a recent 5070 Ti purchaser so I'm also feeling lucky, though if they exit the gaming market entirely I suspect the drivers will all go to crap soon thereafter.

patapong a day ago

Very curious about the second order effects of the hundreds of billions poured into LLMs. Perhaps even if LLMs do not pan out, we will have a massive increase in green energy production, grid enhancements and a leap in capacity for general-purpose computing over the next few years? Or maybe that is my naive side talking...

  • JohnBooty 15 hours ago

    I agree that the potential is there for... something? I don't know what. The things you mentioned are possibilities for sure!

    Maybe another way to look at it is: with hundreds of billions being tossed around, could there possibly not be second-order effects?

    We'll see....

  • tencentshill 21 hours ago

    You can't plug an H200 into your home desktop. Compute would be owned and rented out by server hosts.

Anonyneko a day ago

Bought a slightly overpriced (even by its own standards) 5090 in May. I hope it lasts me through the next five years of madness, and that the madness will have some kind of temporary respite (or that I luck out on a higher-paying job, or figure out how to invest properly - it seems like a lottery these days).

My only small regret is that I decided to build an SFF PC, otherwise I would've gone for 128 GB of RAM instead of just 64. Oh well, ̶6̶4̶0̶ ̶K̶B̶ 64 GB should be enough for most purposes.

827a a day ago

IMO, sadly: the DIY PC world is on life support, and by 2028 building a top-of-the-line machine yourself likely won't even be possible.

I don't necessarily think that everything is headed toward the doomer "subscription-based cloud streaming" future; the economics of those services never made sense, especially for gaming, and there's little reason to believe that the same incentives that led to Nvidia, Crucial, etc. wanting out of the consumer hardware business wouldn't also impact that business.

Instead, the future is tightly integrated single-board computers (e.g. Framework Desktop, the new HP keyboard, Mac Mini, RPi, etc). They're easier for consumers to buy. Integrated memory, GPU, and cooling means we can drive higher performance. All of the components getting sourced by one supplier means the whole "X is leaving the consumer market" point is moot, and allows better bulk deals to be negotiated. They're smaller. And it allows one company (e.g. Framework) to capture more margin instead of sharing with ten GPU or memory middlemen who just slap a sports-car-looking cooler on whatever they bought from Micron and call themselves a real business.

My lingering hope is that we do see some company succeed who can direct-sell these high-end SBCs to consumers, so if you want to go the route of a custom case and such, you still can. And that we don't lose modular storage. But I've lost all hope that DIY PCs will survive this decade; to be frank, they haven't made economic sense for a while.

  • fc417fc802 a day ago

    > All of the components getting sourced by one supplier means the whole "X is leaving the consumer market" point is moot, and allows better bulk deals to be negotiated.

    I don't think that checks out. The fabs are booked out AFAIU. This is going to hit SoCs (and anything else you can come up with) sooner rather than later, because at the end of the day it all depends on the same fabs producing the same silicon. It's just packaged differently.

    They left the consumer market due to the price difference. It's not that there aren't middlemen willing to purchase in bulk right now. It's that the OEMs aren't willing to sell at any price because they've already sold their entire future inventory at absurd prices for the next however many months or years.

    I assume there will still be at least a few SoCs to choose from but the prices will likely be completely absurd because they will have to match the enterprise price for the components that go into them.

    • 827a a day ago

      They're booked out by the people making these bulk deals. Apple has significant weight to throw at TSMC, Micron, etc. when it comes to negotiating both capacity and price; far more than the DIY manufacturers. The same laws apply to Nvidia, Qualcomm, and AMD.

      The sub-argument to this is that graphics cards will drive up fab prices for other packaged silicon products; this has probably been true for the past two years, but it's very likely that we'll see this change in 2026. Even if theoretical demand stays high (which is debatable, but not for today): every major AI lab is sitting on warehouses of gigawatts of ready silicon with nowhere to power them, so real demand will drop until the bottlenecks of data center construction and power delivery are solved. Those problems will take another few years. Even if these companies have the money and want to spend it, it makes zero sense to buy cards today and have them sit in a warehouse for two years when you can sit on the cash and buy newer-generation cards in a year.

      I would bet very real money that, sometime in 2026, we will see Nvidia reduce or cancel a committed fab order from TSMC.

      • bigyabai 15 hours ago

        I disagree with this for the most part. Nvidia's SoC designs have been industry-leading for almost a decade now; dGPUs are not their lifeline or their one-trick pony. They have a diverse base of customers that will schedule meetings with Jensen to re-allocate that fab capacity long before it gets cancelled. Apple doesn't, but Nvidia does.

        That's a real difference in the weight that Nvidia brings to the table versus other customers. Not just the difference in capital, but the demand and diversity at play. There is Nvidia hardware in crop dusters and cruise missiles, datacenters and deep-sea SONAR. Apple kicked out their partners and doesn't even consider paying market price when TSMC asks them to. They have the money to buy iPhone silicon at cost, but the iPhone doesn't make enough money to compete with Raytheon's margins. Apple is smaller than you think, and all it takes is a team player to prove it.

        The performance side still won't add up in favor of SoCs anyway. Distributed machines with high-speed interconnects run circles around the fastest Macs an equivalent budget can buy. Benchmarks all show Nvidia being more power-efficient at raster and compute despite having the more complicated architecture. Nobody is ripping up their dGPU racks to install an SoC cluster right now, and datacenters aren't putting their Nvidia cards on eBay to buy more Ryzen blades. The opposite is happening, really; SoCs are being fast-tracked into obsolescence to better handle heterogeneous workloads that homogeneous SoCs can't do efficiently.

        Your initial claim ("they haven't made economic sense for a while") baffles me. SoCs would be cleaning up shop in the HPC niche if they were any good at it.

        • 827a 8 hours ago

          The scale Apple operates at is sometimes hard for people to grasp. To present a small window into it: Apple did $73B in hardware revenue in the most recent quarter; Nvidia did $57B in total revenue. Now, consider that most of Nvidia's revenue comes from selling $30,000+ GPUs to businesses, with only a small number of TSMC-fabricated chips, whereas Apple sells $1000 phones. The scale of Nvidia's contracts with TSMC, while increasing, doesn't even begin to approach Apple's. That's why when TSMC began to open shop in the US, they did it with Apple.

          It’s true that Nvidia has a more diverse set of end-customers than Apple’s silicon. But, the customers you’re describing were the ones they had in 2020 when their revenue was $3B, versus the $57B they do today. The vast, vast majority of that revenue growth has came from less-than ten customers; the usual suspects. Revenue growth is a proxy for their contracts with TSMC. If the hyperscaler revenue growth takes a hit, the scale of that hit would outpace their ability to just shuffle fulfillment around (though, I have no doubt they will try)

          You’re also misunderstanding my initial claim, which might explain why you are baffled. I did not claim that SBCs make better economic sense than modular computers. I claimed that DIY computers rarely make economic sense over non-DIY computers today. Modular computers can be non-DIY; a quarter of every Best Buy is full of them, and it’s what most businesses in the HPC, CAD, etc spaces would buy their employees: a pre-build from Dell. Dell’s/etc relationships with suppliers grant them the same power laws that I explain fuel the SBC segment’s growth; power laws that the middlemen in the DIY space, like Corsair/etc, aren’t benefited from as strongly, and easily overwhelm with their own markup for sports car cooling fins and RGB.

        • fc417fc802 13 hours ago

          I think you've misunderstood what's being discussed here. AI demand for silicon has exceeded supply capacity and thus driven costs sky high. What's being debated is the projected impact on consumer electronics other than RAM and dGPUs - such as SoCs.

          GP suggests that AI hardware acquisition has outstripped the capacity to rack and power said hardware for the foreseeable future. If that's true we would expect orders to start getting postponed or even cancelled soon.

          It's an interesting hypothesis but I'm not sure I believe it.

  • JohnBooty 15 hours ago

         Instead, the future is tightly integrated single-board computers
    
    Well, all of that is true, but all of that has always been true, right?

etempleton a day ago

I could see Nvidia completely stepping out of the low to mid range Desktop GPU space. The margins have to be peanuts compared to their other business lines.

nerdjon a day ago

It will be interesting to see what the long-term impact of this will be. The headline misses the biggest part: they (if the phrasing they use is correct) should be producing more of the lower-specced (and cheaper) 5060 8GB model.

So while the news is not great, I think it is far from any doom and gloom if we are in fact going to be getting more 5060 cards.

As it is, the value of the crazy higher-specced cards was questionable, with most developers targeting console specs anyway. But it does raise the question of how this might impact the next generation of consoles and whether those will be scaled back.

We will likely be seeing some stagnation of capability for a couple years. Maybe once the bubble pops all the work that went into AI chips can come back to gaming chips and we can have a big leap in capability.

emsign a day ago

Rumors have it they'll stop producing gaming GPUs altogether. :(

  • JohnBooty 15 hours ago

    It's hard for me to believe they'll put 100% of their eggs into the AI basket, even if it's insanely more profitable than consumer GPUs at the moment.

    AI is simultaneously a bubble and here to stay (a bit like the "Web 1.0" bubble IMO)

    Also, importantly, consumer GPUs are still an important on-ramp for developers getting into nVidia's ecosystem via CUDA. Software is their real moat.

    There are other ways to provide that on-ramp, and nVidia would rather rent you the hardware than sell it to you anyway, but.... I dunno. Part of me says the rumors are true, part of me says the rumors are not true...

infecto a day ago

I don’t subscribe to all of this doom and gloom. I would like to consider myself a gamer and to be frank I used the same computer setup for since 2018 until I recently upgraded it in the past few months. Even with increased costs we are seeing the dollar spent per hour of entertainment is ridiculously cost effective.

re-thc a day ago

Time for AMD to shine?

  • zvqcMMV6Zcr a day ago

    AMD's approach to pricing was "comparable Nvidia card minus $50". If the price of the remaining Nvidia cards goes up, AMD will follow.

  • roboror a day ago

    They already had worse margins so probably not unless they've been hoarding RAM. AMD also wants DC money.

    • PunchyHamster a day ago

      You could have fooled me, looking at how much of a PITA it is to make stuff work compared to Nvidia.

      • pixl97 a day ago

        Wanting something, and executing on it properly are two different things.

  • whatevaa a day ago

    Knowing AMD, shine by doing the same.

    • keyringlight a day ago

      My impression for the past decade or so is that Radeon for PC is AMD's way of keeping their GPU division's engine running between contracts for consoles or compute products. At the very least it's a welcome byproduct that provides a competent iGPU and a test bed for future 'main' products. It's been a long while since AMD has shown future vision for PC GPUs or they've led with a feature instead of following what others do.

      • re-thc a day ago

        > My impression for the past decade or so is that Radeon for PC is AMD's way of keeping their GPU division's engine running

        During this time AMD was focused on CPUs. They've already said that they'll focus more on GPUs now (since CPUs are way ahead and AI is a thing) so this should change things.

  • MrBuddyCasino a day ago

    AMD avoids a price war with Nvidia for the simple reason that Nvidia has much, much more cash and will win this war, easily.

zcw100 a day ago

Have some faith. If it really is an AI bubble and it pops, imagine the deals you're going to get, like when Ethereum went to PoS.

xnx a day ago

[flagged]

  • Anonyneko a day ago

    If it was easy enough to rent my desktop while I'm not using it (such that I can get it back whenever I need it myself, within a few minutes at most), I would happily do it.

  • nerdjon a day ago

    Do you mean when the computer is not in use or “unused” in the sense that even when gaming it is just being used for gaming and not something “productive”.

    2 very different arguments and not fully clear which you are trying to make.

    • xnx a day ago

      > Do you mean when the computer is not in use

      This. No judgement on any particular use. Just worth a reminder that the most advanced machines ever produced make these magic rocks that sit there idle most of the time.

      • fluoridation a day ago

        It'd probably be unwise to utilize every machine ever produced at 100%, not just in terms of waste heat, but even just simple wear and tear.

  • CivBase a day ago

    You could say that about literally anything. Food, housing, fuel, heat, water. There are always solutions for better optimizing global resource allocation - especially if you're willing to ignore the wants and rights of the people.

newsclues a day ago

Gamers are going to burn DC to the ground.