Nvidia GeForce RTX 40x0 cards - 2x the power consumption, 2x the performance


Utnayan

I Love Utnayan he’s awesome
<Gold Donor>
16,269
12,027
Same here. I love my case and I really don't want to get a bigger one for a new GPU. I would if I were two or three generations behind, but I'm running a Ryzen 5900 with a 3090, and every game I play runs flawlessly at the highest settings at 1440p.

I have a 3700X and 2070 Super and run everything flawlessly at max settings in 1440p. Still zero need to upgrade for the foreseeable future.
 

Daidraco

Golden Baronet of the Realm
9,071
9,097
All of you just openly walking into the 4090 sound like you buy a new iPhone every release. As soon as I heard the 4090 doesn't support PCIe 5 and will only have HDMI 1.4 display - I noped completely the fuck out. It was already a hard sell with the power draw and the case requirements.
 

Springbok

Karen
<Gold Donor>
8,970
12,472
All of you just openly walking into the 4090 sound like you buy a new iPhone every release. As soon as I heard the 4090 doesn't support PCIe 5 and will only have HDMI 1.4 display - I noped completely the fuck out. It was already a hard sell with the power draw and the case requirements.
Looks like someone isn't even reading the testing data or watching the reviews - I'd have agreed with you two days ago, but the thermals and power draw are downright impressive for the gen-to-gen performance leap we're seeing here. The cards are massive, but the thermals are quite good (and this is just the FE card), and power draw is generally impressive (especially considering the gen-to-gen performance leap and the earlier assumption of insane power draw).

[three attached benchmark charts]


This is pre-undervolt as well. The gen-to-gen uplift in performance is significant too, and finally makes gaming at 4K+ a reasonable goal. That said, it's still very expensive, the cards are humongous, and if you're still gaming at 1440p or lower resolution it probably makes no sense to buy a $1,600 toy. The 4090, though, taken in the context of the 4080 cards' pricing/performance expectations and the cost of the 3090 Ti, is a reasonable value and a substantial boost in performance. If you don't want to fork over money to scumbag Nvidia, fair enough - I might be passing as well (just out of principle) - but based on what I'm seeing today, these are all very good performance, thermal, and power numbers, and the card looks solid.
 

spronk

FPS noob
22,473
25,382
All of you just openly walking into the 4090 sound like you buy a new iPhone every release. As soon as I heard the 4090 doesn't support PCIe 5 and will only have HDMI 1.4 display - I noped completely the fuck out. It was already a hard sell with the power draw and the case requirements.

wait, what? Everything I've seen said the 40xx cards will support HDMI 2.1 including VRR - I assume you mean DisplayPort 1.4, not 2.0. It is kinda weird though: DisplayPort 2.0 was standardized in 2019, but the first big video cards supporting it will be AMD 7xxx GPUs this fall. Some of the Intel Arcs support it too, I guess, lol

the 30xx was the same: three DisplayPort 1.4 ports and one HDMI 2.1 port
The Nvidia RTX 4090 shares the exact same ports as the RTX 3090: 3x DisplayPort 1.4 and a single HDMI 2.1 connection, which supports up to 4K 120Hz and 8K 60Hz.
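The HDMI-vs-DisplayPort bandwidth situation being described can be sanity-checked with some back-of-envelope math. The link rates below come from the HDMI 2.1 (FRL) and DisplayPort 1.4 (HBR3) specs; the per-mode numbers count raw pixel data only and ignore blanking overhead, so real requirements are somewhat higher:

```python
# Effective (post-encoding) link rates, in Gbit/s:
HDMI_2_1_GBPS = 48 * 16 / 18    # FRL: 48 Gbit/s raw, 16b/18b encoding -> ~42.67
DP_1_4_GBPS = 32.4 * 8 / 10     # HBR3: 32.4 Gbit/s raw, 8b/10b encoding -> 25.92

def gbps(width, height, hz, bits_per_pixel=30):
    """Pixel-data rate for a mode; 30 bpp = 10-bit RGB. Blanking ignored."""
    return width * height * hz * bits_per_pixel / 1e9

uhd_120 = gbps(3840, 2160, 120)   # 4K 120Hz -> ~29.9 Gbit/s
uhd8k_60 = gbps(7680, 4320, 60)   # 8K 60Hz  -> ~59.7 Gbit/s

print(f"4K120 10-bit: {uhd_120:.1f} Gbit/s")
print(f"8K60  10-bit: {uhd8k_60:.1f} Gbit/s")
# 4K120 fits HDMI 2.1 uncompressed but exceeds DP 1.4 (hence DSC there);
# 8K60 exceeds even HDMI 2.1, so it relies on DSC or 4:2:0 subsampling.
```

This is roughly why the single HDMI 2.1 port is the only output on these cards that can drive 4K 120Hz at full 10-bit RGB without compression.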
 

Daidraco

Golden Baronet of the Realm
9,071
9,097
Looks like someone isn't even reading the testing data or watching the reviews - I'd have agreed with you two days ago, but the thermals and power draw are downright impressive for the gen-to-gen performance leap we're seeing here. The cards are massive, but the thermals are quite good (and this is just the FE card), and power draw is generally impressive (especially considering the gen-to-gen performance leap and the earlier assumption of insane power draw).

[three attached benchmark charts]

This is pre-undervolt as well. The gen-to-gen uplift in performance is significant too, and finally makes gaming at 4K+ a reasonable goal. That said, it's still very expensive, the cards are humongous, and if you're still gaming at 1440p or lower resolution it probably makes no sense to buy a $1,600 toy. The 4090, though, taken in the context of the 4080 cards' pricing/performance expectations and the cost of the 3090 Ti, is a reasonable value and a substantial boost in performance. If you don't want to fork over money to scumbag Nvidia, fair enough - I might be passing as well (just out of principle) - but based on what I'm seeing today, these are all very good performance, thermal, and power numbers, and the card looks solid.
Never mentioned any details, but thank you for assuming that I'm just talking out of my ass. I don't care if it can render an entirely new fucking galaxy on top of us with the size of that behemoth and its peak power draw. If anything, you proved my fucking point. Thanks for being condescending.
wait, what? Everything I've seen said the 40xx cards will support HDMI 2.1 including VRR - I assume you mean DisplayPort 1.4, not 2.0. It is kinda weird though: DisplayPort 2.0 was standardized in 2019, but the first big video cards supporting it will be AMD 7xxx GPUs this fall. Some of the Intel Arcs support it too, I guess, lol

the 30xx was the same: three DisplayPort 1.4 ports and one HDMI 2.1 port
The Nvidia RTX 4090 shares the exact same ports as the RTX 3090: 3x DisplayPort 1.4 and a single HDMI 2.1 connection, which supports up to 4K 120Hz and 8K 60Hz.
I should have been more clear. I guess I just hoped people would realize that plenty of us use multiple fucking monitors, and one monitor being on 2.1 while the others are on 1.4 is going to be a complete shit show. Fuck, they could have at least made the 1.4s into three 2.0s.

BASICALLY = You're paying 1600 bucks for a card that is skimping on the fucking HDMI ports and on PCIe Gen 5 support. Both of which would drastically improve the performance/quality of life.

Watch the fucking fat man and learn.
 

Bubbles

2022 Asshat Award Winner
<Bronze Donator>
44,384
-29,779
I'll wait for a good game to come out, lol.

Also waiting for good PCIe 5.0 SSDs to come out; the first batch looks pretty lame.

Ashes of Creation is probably the first one that might need this kind of power
 

Daidraco

Golden Baronet of the Realm
9,071
9,097
Ashes of Creation is probably the first one that might need this kind of power
I'm more interested in AoC. But The Finals cranked up to max (the last thing you'd probably do in an FPS) looks like it's going to take a monster of a machine.
 

Leadsalad

Cis-XYite-Nationalist
5,946
11,881
Blue bar is power consumption in watts. der8auer was testing at a 60% power limit and found performance was nearly the same as the default 100%.
[attached charts: performance vs. power limit, blue bar = power draw in watts]
 

Malakriss

Golden Baronet of the Realm
12,293
11,674
You either FPS as a hero or melt long enough to become the villain
 

Xexx

Vyemm Raider
7,385
1,603
I wonder what time BB cards will go on sale. Gotta use that credit on something, and a gpu sounds about right.
 

spronk

FPS noob
22,473
25,382
Digital Foundry's 4090 review. I'm still pretty confused about DLSS 3.0; reviews mostly seem positive about it, but I hear people say it adds input lag and stuff, which I don't really get. Plus, honestly, I'm too old to really care about input lag anymore - my reaction times are gonna be way longer than any input lag

 

nu_11

Golden Baronet of the Realm
3,020
19,736
Digital Foundry's 4090 review. I'm still pretty confused about DLSS 3.0; reviews mostly seem positive about it, but I hear people say it adds input lag and stuff, which I don't really get. Plus, honestly, I'm too old to really care about input lag anymore - my reaction times are gonna be way longer than any input lag

Digital Foundry already found that DLSS 3.0 + Reflex has lower input latency than native rendering sans Reflex.
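The intuition behind both claims can be shown with a toy latency model. Frame generation must hold each real frame back roughly one frame-time so it can interpolate a frame in between, which adds latency; Reflex separately removes render-queue latency. All milliseconds below are made-up illustrative values, not Digital Foundry's measurements:

```python
# Toy input-latency accounting for DLSS 3 frame generation + Reflex.
render_ms = 16.7           # time to render one real frame (~60 fps)
queue_ms = 20.0            # CPU/driver render-queue latency without Reflex
reflex_saving_ms = 20.0    # queue latency Reflex removes (assumed: all of it)

native = render_ms + queue_ms                             # no Reflex, no FG
native_reflex = render_ms + queue_ms - reflex_saving_ms   # Reflex only

# Frame generation delays display by ~one extra frame-time for interpolation:
fg_reflex = render_ms + queue_ms - reflex_saving_ms + render_ms

print(f"native:      {native:.1f} ms")
print(f"FG + Reflex: {fg_reflex:.1f} ms")
```

Under these assumptions FG + Reflex still lands below plain native - Reflex's savings outweigh the interpolation delay - while Reflex alone remains the lowest-latency option.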