September 27th, 2024

Leak claims RTX 5090 has 600W TGP, RTX 5080 hits 400W

Preliminary specifications for Nvidia's RTX 5080 and RTX 5090 graphics cards suggest significant performance advancements, with the RTX 5090 featuring 600W TGP and 21,760 CUDA cores, potentially launching in early 2025.

Preliminary specifications for Nvidia's upcoming GeForce RTX 5080 and RTX 5090 graphics cards have been leaked, suggesting significant advancements in performance. The RTX 5090 is expected to feature a total graphics power (TGP) of 600W, utilizing the GB202 graphics processor with 21,760 CUDA cores and 32GB of GDDR7 memory on a 512-bit bus. This positions the RTX 5090 as a high-performance option, potentially doubling the performance of the RTX 4080. In contrast, the RTX 5080 is projected to have a TGP of 400W, powered by the GB203 GPU, with 10,752 CUDA cores and 16GB of GDDR7 memory on a 256-bit interface. The performance gap between the two models is notable, with the RTX 5080 offering roughly half the specifications of the RTX 5090. Speculation suggests that Nvidia may be adopting a multi-chiplet design for the RTX 5090, similar to its datacenter GPUs, although a monolithic design is also possible. The release date for these cards remains uncertain, with some sources indicating a potential launch in early 2025. Until Nvidia provides official confirmation, these details should be viewed with caution.
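As a rough sanity check on those numbers, peak memory bandwidth follows directly from bus width and per-pin data rate. The sketch below assumes 28 Gb/s GDDR7, a figure based on early GDDR7 parts rather than the leak itself:

```python
# Back-of-the-envelope memory bandwidth from the leaked bus widths.
# The 28 Gb/s GDDR7 data rate is an assumption based on early GDDR7
# parts, not a figure from the leak.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: (bus width in bits / 8) * per-pin Gb/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(f"RTX 5090 (512-bit): {bandwidth_gbs(512, 28):.0f} GB/s")  # 1792 GB/s
print(f"RTX 5080 (256-bit): {bandwidth_gbs(256, 28):.0f} GB/s")  # 896 GB/s
```

At that assumed rate, the 512-bit bus alone would give the RTX 5090 double the RTX 5080's bandwidth, consistent with the "roughly half the specifications" gap described above.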

- Nvidia's RTX 5090 may have a TGP of 600W and 21,760 CUDA cores.

- The RTX 5080 is expected to have a TGP of 400W with 10,752 CUDA cores.

- The performance gap between the RTX 5090 and RTX 5080 is significant.

- Speculation exists regarding a multi-chiplet design for the RTX 5090.

- The release date for the RTX 50-series is uncertain, possibly in early 2025.

14 comments
By @mikae1 - 7 months
I wish performance per watt[1] were valued more than pure performance in times of ecological crisis. Let's see how these two fare...

[1] https://wikipedia.org/wiki/Performance_per_watt
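For readers who want to compute the metric this comment refers to, here is a minimal sketch; the frame rates and power draws are placeholders, not benchmarks:

```python
# Performance per watt: any throughput metric divided by power draw.
# The FPS and wattage values below are placeholders, not measurements.
cards = {
    "RTX 5090 (placeholder)": {"fps": 200, "watts": 600},
    "RTX 5080 (placeholder)": {"fps": 130, "watts": 400},
}

for name, c in cards.items():
    print(f"{name}: {c['fps'] / c['watts']:.3f} FPS/W")
```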

By @magicalhippo - 7 months
My 2080Ti is still holding up reasonably well, helped by the fact I only run 1440p, but it's also doing OK for inference[1]. Though my "must have shiny new things" craving is getting hard to ignore.

Considered getting a new GPU earlier this year, but then realized the 5xxx series was "around the corner"; now it seems they're pushed back to next year. And with AI being what it is, I'm guessing prices won't drop significantly.

Would be nice if AMD could get their GPU act together so they were a more viable alternative; NVIDIA could do with some competition.

edit: just recalled I had a dual-chip GPU back in the day, the ATI 4870X2[2]. Though that was more like two GPUs glued to one PCB, so effectively "single card SLI".

Hopefully the 5090 will be a better experience, as my 4870X2 never quite lived up to what it could theoretically do.

[1]: https://www.pugetsystems.com/labs/articles/llm-inference-con...

[2]: https://www.techpowerup.com/gpu-specs/radeon-hd-4870-x2.c236

By @bryanlarsen - 7 months
Prediction: the cards will sell crazy well.

Gamers don't need high-end video cards, they want high-end video cards. In general, the high marginal price relative to the low marginal value of high-end video cards prevents most gamers from acting on their desires.

But this generation of video cards provides a couple of other justifications for the purchase:

- it will allow them to run uncensored ML stacks locally

- it will allow the buyers to train themselves on the hottest new career path

A large number of people who use these excuses to justify the purchases to themselves or their loved ones will only use it for gaming, but those excuses will fuel a lot of sales.
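For what "running an ML stack locally" looks like in practice, here is a minimal sketch using Hugging Face transformers; the model name is illustrative, not a recommendation, and anything that fits in VRAM works the same way:

```python
# Minimal local text-generation sketch with Hugging Face transformers.
# The model name is an example only; requires the `transformers`,
# `torch`, and `accelerate` packages and enough VRAM for the weights.
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="mistralai/Mistral-7B-Instruct-v0.2",  # example; ~14 GB in fp16
    device_map="auto",   # place layers on available GPU(s)
    torch_dtype="auto",  # use the checkpoint's native precision
)

out = pipe("Explain TGP in one sentence.", max_new_tokens=64)
print(out[0]["generated_text"])
```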

This seems like the wrong generation for AMD to skip the halo tier of gaming cards.

By @kiririn - 7 months
Even the 250W 2080Ti (+150W Intel) is oppressive to be in the same room with during warmer months. I know it probably won't be, but this should be a hard sell in countries where air conditioning isn't standard. Not to mention the noise needed to dissipate that much heat.
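This point is easy to quantify: essentially all of a PC's power draw ends up as heat in the room. A rough conversion, using the wattages from the comment and the leak:

```python
# All component power becomes room heat; 1 watt = 3.412 BTU/h.
def btu_per_hour(watts: float) -> float:
    return watts * 3.412

print(f"2080Ti system (250 + 150 W): ~{btu_per_hour(400):.0f} BTU/h")   # ~1365
print(f"Leaked RTX 5090 alone (600 W): ~{btu_per_hour(600):.0f} BTU/h")  # ~2047
# For scale, a small window AC unit is often rated around 5000 BTU/h.
```
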
By @ghastmaster - 7 months
AMD's decision to back out of high-end cards seems even more logical given this information. The high end may be trending toward too much power. I used to buy the high-end cards, but my next one will not be. The cost and the power consumption are large factors in that decision. I do not need, nor can I afford, a supercomputer for gaming.
By @gtirloni - 7 months
If the trend of NVIDIA deceiving customers continues, the 5090 will be the new xx70 and the 5080 the new xx60.
By @PedroBatista - 7 months
This tango NVIDIA is dancing will continue as long as (local) AI doesn't reach a plateau where even most enthusiasts are happy with the AI model they have. When it comes to games, other than a few streamers and rich kids, nobody will run to buy this; even AAA games are dying, and the appetite for running them with the latest and greatest graphics has mostly disappeared (given the cost).

From a gaming perspective, technically this is nothing like the ending days of the Voodoo cards, but somehow it reminds me of the same feeling.

By @Joker_vD - 7 months
If this doesn't stop, then very soon we'll just end up with external video cards because this is ridiculous. We already kinda have that [0], but it's quite a hack. I wonder how well PCIe can be delivered over the ribbon cables though...

[0] https://www.razer.com/mena-en/gaming-laptops/razer-core-x
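On the bandwidth question: a tunneled Thunderbolt link carries far less PCIe traffic than a direct slot, which is a big part of why eGPU enclosures feel like a hack. Rough numbers below; the ~22 Gb/s usable PCIe share for Thunderbolt 3/4 is a commonly cited estimate, not a spec guarantee:

```python
# Internal slot vs. Thunderbolt eGPU link, back-of-the-envelope.
# PCIe 4.0: 16 GT/s per lane with 128b/130b encoding.
pcie4_x16_gbs = 16 * 16 * (128 / 130) / 8  # lanes * GT/s * encoding / bits per byte
tb_pcie_gbs = 22 / 8                       # ~22 of the 40 Gb/s link usable for PCIe

print(f"PCIe 4.0 x16:     ~{pcie4_x16_gbs:.1f} GB/s")  # ~31.5 GB/s
print(f"Thunderbolt eGPU: ~{tb_pcie_gbs:.1f} GB/s")    # ~2.8 GB/s, ~11x less
```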

By @cabirum - 7 months
Can I boil water for my tea with it already, or should I wait next gen?
By @lvl155 - 7 months
Guess I will heat my basement with 5090 this winter. 3090 was one of the best purchases I have ever made. Mined ETH to pay for itself within two months. Then came all the AI fun. Can’t think of anything that was as useful in my life.
By @idiocrat - 7 months
Finally a valid reason for those 1000W platinum PSUs.

Their highest efficiency is at 80-90% utilization, but the efficiency drops off when underutilized.
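To make that concrete, here is the arithmetic for a hypothetical leaked-spec build; the efficiency values are illustrative round numbers, not measured 80 PLUS results:

```python
# Wall draw and PSU loading for a hypothetical 600 W GPU build.
def wall_draw_w(dc_load_w: float, efficiency: float) -> float:
    """Power pulled from the outlet, including conversion losses."""
    return dc_load_w / efficiency

dc_load = 600 + 250  # 600 W GPU plus ~250 W for CPU, board, and drives
print(f"Load on a 1000 W PSU: {dc_load / 1000:.0%}")  # 85%
for eff in (0.90, 0.92, 0.94):  # illustrative efficiency points
    print(f"At {eff:.0%} efficient: {wall_draw_w(dc_load, eff):.0f} W from the wall")
```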

By @ThinkBeat - 7 months
Instead of hunting globally for places to power more and more AI, perhaps electricity usage should be frozen at some level, and it would be up to the AI chip makers to produce solutions that give more calculations for the same amount of power.

That has been a huge factor in data center servers for a long time, from what I know. Somehow it seems to have been forgotten.

By @Ekaros - 7 months
Will the price be 3000 or 5000 for the 5090? With a nice top-of-the-line CPU you can have a small space heater here. Must be nice in summer.
By @atemerev - 7 months
We want cheap(ish) desktop GPUs with 80GB or more of VRAM, so that not only large corporations can meaningfully participate in AI research.

The price divide between "desktop" and "datacenter" GPUs is artificial, and no doubt there is collusion of some kind.
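The 80GB figure follows from simple weights-only arithmetic; the sketch below ignores KV-cache and activation overhead, which only add to the requirement:

```python
# Weights-only VRAM estimate: ~1 GB per billion parameters per byte of
# precision. Ignores KV cache and activations, which add more on top.
def weights_gb(params_billions: float, bytes_per_param: float) -> float:
    return params_billions * bytes_per_param

for params in (7, 70):
    for label, bpp in (("fp16", 2), ("4-bit", 0.5)):
        print(f"{params}B model @ {label}: ~{weights_gb(params, bpp):.0f} GB")
# A 70B model is ~140 GB at fp16 and ~35 GB even at 4-bit -- past the
# 24 GB ceiling of current desktop flagships, which is the point above.
```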