NVidia GeForce RTX 40x0 cards - 2x the power consumption, 2x the performance

spronk

FPS noob
22,638
25,709
dlss 3.0 only runs on the 4xxx series, i dunno if dlss is any good or shit since i never game on my pc and i'm still rocking a 1080ti. my steam deck is my gaming PC now lol

my gut tells me paying $1700 for a 4090 is gonna be an "i regret this" move in a year when something more powerful is out for $500-800 cheaper, especially since there is fuck all gaming-wise that demands that much power right now. callisto protocol is the only thing i can really think of in the next six months that might push GPUs a bit, and even that is gonna run perfectly fine at 60 fps on PS5s. maybe if you are huge into Flight Simulator or still playing Elden Ring it might be great on a 4k monitor

still, if 4080s are pretty easy to get, $900 is "ok" for the 12GB version. nvidia's gonna keep all the 3080s and 3090s in the channel and discount them even further - 3080ti FE is $900 at best buy right now, but why buy that when the 4080 will be what, at least 50-100% faster for the same price?
 
  • 2Like
Reactions: 1 users

Borzak

Bronze Baron of the Realm
24,664
32,045
Ray tracing was the big thing for the 3xxx series and that was kind of meh depending on which game you were playing.
 

Xexx

Vyemm Raider
7,438
1,644
Really sucks AMD is waiting till November to unveil their shit - didn't they do it the same way last time? Curious if they will be able to match Nvidia this time around; if not, as long as they're close but cheaper it's still a win.
 

Kirun

Buzzfeed Editor
<Gold Donor>
18,707
34,882
It'll all depend on the power draw of the AMD cards. If they come in at <400W, AMD easily has my money this gen. Hell, they had it last gen too, because FUCK nVidia. My 6700XT plays almost everything at 100+ FPS in 1440p at max settings. Hell, even Hell Let Loose, which is pretty poorly optimized and can get really graphically intense with 60+ players, will only dip to 80 FPS at max settings. If I were to upgrade, it'd probably just be to swoop up a 6900XT on super discount.

I still don't like Adrenalin's intuitiveness/interface as much as nVidia's, but AMD's drivers this gen have been just fine and their "smart access memory" feature is badass. Not a single issue the entire lifetime of my card so far, other than 22.8.0 having some weird crashing issues with web players; it's been fine since 22.8.1.
 
  • 2Like
Reactions: 1 users

Mist

Eeyore Enthusiast
<Gold Donor>
30,431
22,247
Not sure why people are making a big deal out of the power draw. The 4080 16GB has a stock 320W TDP, which is the same as the 3080 10GB FE, for what is likely to be an absurd performance uplift. That's a huge jump in performance per watt.

The 4090, which you'd have to be stupid to buy, has an insane power limit, but everything about the card is insane. They've decided to just get rid of the Titan line, so the 4090 is the 'ludicrous speed' variant. I've never bought one of the Titan or X090 cards, so what they're priced at or what power specs they put on it are irrelevant to me.

Also, the 4080 16GB has the same price point as the 3080 Ti which is a reasonable comparison architecturally. I bet we don't see Ti cards this generation, and they just do a 'Super' refresh a year or so from now.

If you ignore the 4090, which you should, none of this seems out of line. 'Rebranding' the 4070 as a 4080 12GB is a little underhanded, but let's wait and see the performance numbers, it might not be out of line either.
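Quick back-of-the-napkin on that, using the FP32 numbers from the spec comparison tables floating around (paper TFLOPs, not measured gaming performance, so treat it as a sketch):

```python
# Rough perf-per-watt comparison from the rumored spec-sheet numbers.
# FP32 TFLOPs and TDP are the leaked/announced figures, not benchmarks.
cards = {
    "RTX 3080 10GB": {"tflops": 30, "tdp_w": 320},
    "RTX 4080 16GB": {"tflops": 49, "tdp_w": 320},  # rumored FP32 figure
}

for name, spec in cards.items():
    print(f"{name}: {spec['tflops'] / spec['tdp_w']:.3f} TFLOPs per watt")

uplift = (49 / 320) / (30 / 320)
print(f"On-paper uplift at the same 320W: ~{uplift:.2f}x")
```

Obviously paper TFLOPs don't translate 1:1 into frames, but a ~1.6x jump at the same wattage is the point.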
 

Kirun

Buzzfeed Editor
<Gold Donor>
18,707
34,882
Not sure why people are making a big deal out of the power draw. The 4080 16GB has a stock 320W TDP, which is the same as the 3080 10GB FE, for what is likely to be an absurd performance uplift. That's a huge jump in performance per watt.

The 4090, which you'd have to be stupid to buy, has an insane power limit, but everything about the card is insane. They've decided to just get rid of the Titan line, so the 4090 is the 'ludicrous speed' variant. I've never bought one of the Titan or X090 cards, so what they're priced at or what power specs they put on it are irrelevant to me.

Also, the 4080 16GB has the same price point as the 3080 Ti which is a reasonable comparison architecturally. I bet we don't see Ti cards this generation, and they just do a 'Super' refresh a year or so from now.

If you ignore the 4090, which you should, none of this seems out of line. 'Rebranding' the 4070 as a 4080 12GB is a little underhanded, but let's wait and see the performance numbers, it might not be out of line either.
I'm more concerned with the transient load spikes; I could give a shit less about the "base" power draw.

And yeah, the 4090 is stupid, but that doesn't change the fact that the power consumption is "TOO DAMN HIGH!". Like I said, if AMD can get that down to sub-400W at similar performance and $200-300 less? SPICY!
 

Mist

Eeyore Enthusiast
<Gold Donor>
30,431
22,247
But why do you care about the power draw on the flagship card that no one should reasonably be buying?

I'm interested in seeing what the 4060 and 4070 will be like, and whether we're going to see <200W cards that beat the current 3080 Ti.
 

Kirun

Buzzfeed Editor
<Gold Donor>
18,707
34,882
But why do you care about the power draw on the flagship card that no one should reasonably be buying?
Because it's usually an indicator of what the "low-to-mid" range cards are going to draw. And also because I like spending "fuck you" money on dumb shit.

I'm also just bitching to bitch, because nVidia could offer that same performance at 200 watts and I still wouldn't buy their shit, because fuck 'em. But I like having more reasons why nVidia sucks donkey dong.
 
  • 1Like
  • 1Pathetic
Reactions: 1 users

Pyros

<Silver Donator>
11,060
2,262
Ray tracing was the big thing for the 3xxx series and that was kind of meh depending on which game you were playing.
DLSS isn't really like ray tracing though; it doesn't make the games look better (if anything they'll look worse), but it makes them run noticeably faster. DLSS 2.0 is really nice IF you're GPU-capped, since the loss in quality is fairly negligible unless you're comparing screenshots and stuff, while you gain a nice FPS boost. I know I liked it in the few games I played with it in 4K, but it's probably borderline worthless for 1440p unless you're aiming for stupidly high refresh rates, and since not every game supports it, it still depends a lot on what you're playing.

The question is what the fuck is going to need the power of a 40xx card (based on the current numbers; we'll see when real tests land) and also DLSS. 4K120? 8K60? I think that's the only stuff you'd want DLSS for, and you'd need a really expensive monitor to play with this anyway (maybe a TV for the 8K stuff if you got scammed into buying one?), that and games that actually support those settings. Maybe in a couple of years though, once devs stop making games compatible with PS4/Xbox One and just start cranking up the graphics, but atm it seems pretty redundant.
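For reference on the resolution side, here's roughly what DLSS 2.x renders internally at each quality mode. The scale factors are the commonly cited per-axis ratios, so treat the exact pixel counts as ballpark:

```python
# Approximate internal render resolutions for DLSS 2.x upscaling modes.
# Per-axis scale factors are the commonly cited values; treat as ballpark.
DLSS_MODES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, scale):
    """Return the approximate internal render resolution for a given output."""
    return round(out_w * scale), round(out_h * scale)

for out_w, out_h in [(3840, 2160), (2560, 1440)]:
    print(f"Output {out_w}x{out_h}:")
    for mode, scale in DLSS_MODES.items():
        w, h = internal_resolution(out_w, out_h, scale)
        print(f"  {mode:>17}: renders at ~{w}x{h}")
```

Which is basically the 1440p problem: even Quality mode is upscaling from roughly 1707x960, so the quality hit is a lot easier to notice than going from ~1440p up to 4K.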

Still looking forward to the real tests, and especially heat/noise/watt results.
 

Springbok

Karen
<Gold Donor>
9,032
12,625

Been on the $ since Ampere. Looks like the 4070, I mean 4080 12GB, is a bit worse than the 3090 Ti in rasterization, but we'll see. Pricing is retarded and the 4090 is the only offering that's reasonably priced
 

nu_11

Avatar of War Slayer
3,068
20,058
I think the 4090 is $100 too much, the 4080 12GB is $150 too much, and the 4080 16GB is $200 too much.

It's not bad enough for me to consider it gouging. /shrug
 

Neranja

<Bronze Donator>
2,605
4,143
It'll all depend on the power draw of the AMD cards. If they come in at <400W
Rumor says AMD reference designs will come in well below 400W, more like 350W. AIB partners may go all-in above 400W, though, because the TSMC node allows for a lot of headroom, which is what Nvidia is exploiting this generation for their 4090.
 

Neranja

<Bronze Donator>
2,605
4,143
Pricing is retarded and the 4090 is the only offering that’s reasonably priced
Depends on what your goals are. Nvidia aimed their new cards purely at performance, and nothing else. Rumor has it that they cost 50% more to produce than the corresponding Ampere cards.

They clearly wanted to show AMD who's boss, at the cost of power usage and price.
 
  • 1Like
Reactions: 1 user

Tmac

Adventurer
<Gold Donor>
9,379
15,939
Printing a shitload of money + shutting down production + mining + greed.
You think Nvidia watched people pay $1800 for 3080s during the Covid times and wasn't immediately having internal meetings about all the revenue they were leaving on the table?
I mean, how is it that within 1-2 generations we went from a $500-$600 top-end card to $2K? Shit's fucking crazy.

I think it's as simple as, "The gov gave everyone $1500 3x." What do you think gamers did with one of their checks?

They bought a gubmint-subsidized GPU. The prices will inevitably come back down because there won't be de facto subsidies.
 
  • 1Truth!
Reactions: 1 user

Springbok

Karen
<Gold Donor>
9,032
12,625
I think the 4090 is $100 too much, the 4080 12GB is $150 too much, and the 4080 16GB is $200 too much.

It's not bad enough for me to consider it gouging. /shrug
I mean, based on Ampere there's only a $100 delta from 3090 -> 4090. That to me is a reasonable pricing increase given costs/supply chain (actually, it's SO reasonable I'm wondering if Nvidia has some concern from AMD at the top end, because I was sure the 4090 would be closer to $2k MSRP). There's a $500 delta from 3080 -> 4080 and a $300 delta from 3070 Ti -> 4080 12GB. I'm sure they do cost more to produce, and the testing should give us an idea on performance (lol at Nvidia's charts). I suspect these will be awesome cards, but the 4080 pricing is insane to me - the notion that the performance uplift gen to gen somehow gives Nvidia latitude to ratchet prices, restructure model naming conventions, etc. is nonsense. That logic gives them license to charge $2k for the 6080 because it's 50% faster than the 5080, which was $1500 and 50% faster than the 4080? What? This isn't directed at you, just a general rant about the whole state of the market.
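Just to put the sticker math side by side (launch MSRPs, nothing fancy):

```python
# Gen-on-gen launch MSRP deltas for the pairings mentioned above.
pairs = [
    ("RTX 3090    -> RTX 4090",      1499, 1599),
    ("RTX 3080    -> RTX 4080 16GB",  699, 1199),
    ("RTX 3070 Ti -> RTX 4080 12GB",  599,  899),
]

for label, old_msrp, new_msrp in pairs:
    delta = new_msrp - old_msrp
    pct = 100 * delta / old_msrp
    print(f"{label}: +${delta} ({pct:.0f}% over the old MSRP)")
```

A 7% bump at the top versus 50-72% bumps in the "80-class" slots is the whole problem in one printout.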

I noticed on EVGA's site over the weekend you could have bought 3090 Tis for $900 all day long, and their entire stock is now depleted. Kind of wondering if I should have just jumped on that, as I'm pretty sure the 3090 Ti will be a better card for pure rasterization than the 4070 (the gimped 4080, that is).
 

Hateyou

Not Great, Not Terrible
<Bronze Donator>
16,324
42,432
Those 3090 Tis were refurbs. Probably still a good deal with EVGA's customer support.
 

Malakriss

Golden Baronet of the Realm
12,359
11,759
EVGA had 1080 Ti B-stock for $200 for about 10 minutes before they went OOS last night. Companies still want up to $400 for 2080s, $800 for 3080s, and $1200 for 4080s.

Nope.
 

Brahma

Obi-Bro Kenobi-X
12,010
42,610
I actually went over the chart comparing the models in a bit more detail. WTF is this shit? The 4080 12GB vs. 16GB isn't just a memory difference. This is more like a 40-fuckin-70. They're trying to get 900 bucks for what is essentially a 4070? That is some seriously shady shit.

| Spec | RTX 4090 | RTX 4080 16GB | RTX 4080 12GB | RTX 3080 Ti | RTX 3080 |
| --- | --- | --- | --- | --- | --- |
| GPU | Ada Lovelace AD102-300? | Ada Lovelace AD103-300? | Ada Lovelace AD104-400? | Ampere GA102-225 | Ampere GA102-200 |
| Process Node | TSMC 4N | TSMC 4N | TSMC 4N | Samsung 8nm | Samsung 8nm |
| Die Size | 608mm² | ~450mm² | ~450mm² | 628.4mm² | 628.4mm² |
| Transistors | 76 Billion | TBD | TBD | 28 Billion | 28 Billion |
| CUDA Cores | 16384 | 9728 | 7680 | 10240 | 8704 |
| TMUs / ROPs | TBD | TBD | TBD | 320 / 112 | 272 / 96 |
| Tensor / RT Cores | 576 / 144 | TBD / TBD | TBD / TBD | 320 / 80 | 272 / 68 |
| Base Clock | 2230 MHz | 2210 MHz | 2310 MHz | 1365 MHz | 1440 MHz |
| Boost Clock | 2520 MHz | 2510 MHz | 2610 MHz | 1665 MHz | 1710 MHz |
| FP32 Compute | 90 TFLOPs | 49 TFLOPs | 40 TFLOPs | 34 TFLOPs | 30 TFLOPs |
| RT TFLOPs | TBD | TBD | TBD | 67 TFLOPs | 58 TFLOPs |
| Tensor TOPs | TBD | TBD | TBD | 273 TOPs | 238 TOPs |
| Memory Capacity | 24 GB GDDR6X | 16 GB GDDR6X | 12 GB GDDR6X | 12 GB GDDR6X | 10 GB GDDR6X |
| Memory Bus | 384-bit | 256-bit | 192-bit | 384-bit | 320-bit |
| Memory Speed | 21.0 Gbps | 23.0 Gbps | 21.0 Gbps | 19 Gbps | 19 Gbps |
| Bandwidth | 1008 GB/s | 736 GB/s | 504 GB/s | 912 GB/s | 760 GB/s |
| TBP | 450W | 320W | 285W | 350W | 320W |
| Price (MSRP / FE) | $1599 US | $1199 US | $899 US | $1199 US | $699 US |
| Launch (Availability) | October 2022 | November 2022 | November 2022 | 3rd June 2021 | 17th September 2020 |
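Sanity-checking the bandwidth column in that table, since bandwidth is just memory speed times bus width (the 40-series figures are still the rumored specs):

```python
# Memory bandwidth check: GB/s = (Gbps per pin * bus width in bits) / 8.
# Speeds and bus widths taken from the spec table above (40-series figures are rumored).
cards = [
    ("RTX 4090",      21.0, 384),
    ("RTX 4080 16GB", 23.0, 256),
    ("RTX 4080 12GB", 21.0, 192),
    ("RTX 3080 Ti",   19.0, 384),
    ("RTX 3080",      19.0, 320),
]

for name, gbps_per_pin, bus_bits in cards:
    bandwidth_gb_s = gbps_per_pin * bus_bits / 8
    print(f"{name}: {bandwidth_gb_s:.0f} GB/s")
```

The numbers line up with the table, and they show exactly why the "4080 12GB" looks like a xx70-class part: the 192-bit bus leaves it with roughly half the bandwidth of the 3080 Ti despite the faster memory.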
 
  • 2Like
  • 1Solidarity
Reactions: 2 users