NVidia GeForce RTX 40x0 cards - x2 the power consumption, x2 the performance

Mist

Eeyore Enthusiast
<Gold Donor>
30,430
22,246
Pretty soon we will be required to foster small migrant children to reach into our rigs to press the gpu release latch
I have to use a plastic take-out knife to release the latch on mine. It works really well actually. Nothing else would work without being too pokey and damaging the board or being too brittle. But the backside of a decent plastic knife is perfect. There's like 3 different 'grades' of plastic utensils, you want the middle grade one ideally.
 

Fucker

Log Wizard
11,580
26,225
I have to use a plastic take-out knife to release the latch on mine. It works really well actually. Nothing else would work without being too pokey and damaging the board or being too brittle. But the backside of a decent plastic knife is perfect. There's like 3 different 'grades' of plastic utensils, you want the middle grade one ideally.
I knew someone who used a knife on it. He managed to break the latch and ruin the mainboard all in one go. He also used thermal compound on top of the thermal pad and wondered why the CPU ran hot.
 

Izo

Tranny Chaser
18,537
21,414
I have to use a plastic take-out knife to release the latch on mine. It works really well actually. Nothing else would work without being too pokey and damaging the board or being too brittle. But the backside of a decent plastic knife is perfect. There's like 3 different 'grades' of plastic utensils, you want the middle grade one ideally.
(eyebrow GIF)
 

Kirun

Buzzfeed Editor
<Gold Donor>
18,707
34,877
I have to use a plastic take-out knife to release the latch on mine. It works really well actually. Nothing else would work without being too pokey and damaging the board or being too brittle. But the backside of a decent plastic knife is perfect. There's like 3 different 'grades' of plastic utensils, you want the middle grade one ideally.
Ever use that puppy to cut up your mondo turds that won't flush? A "poop-knife", if you will...
 

Araxen

Golden Baronet of the Realm
10,260
7,614

One of the 4080 cards should actually be named 4070.
 

Neranja

<Bronze Donator>
2,605
4,143
and think DLSS 3.0 could be neat for "free" FPS boosts
Depends on how well it is implemented. If they really put work into it, then DLSS 3.0 has to "predict" what the future frame looks like to create an in-between frame to display.

If Nvidia does the lazy thing, like basically every high-refresh-rate TV does at the moment, and the interpolation is just between two existing frames, then that adds at least one or two frames of latency (depending on whether you include the generated frames in the count). Gamers would probably crucify Nvidia for this, especially after Nvidia created tools and benchmarks to show how "low latency" their whole graphics pipeline is, from mouse click to monitor.
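To put rough numbers on that (a back-of-envelope sketch, not anything Nvidia has published): with pure two-frame interpolation, the current frame can't be shown until the next real frame has been rendered, so:

```python
def interpolation_latency_ms(native_fps: float) -> float:
    """Extra display latency from interpolating between two existing
    frames: frame N is held back until frame N+1 has been rendered,
    so output lags by one native frame time -- which counts as two
    frames at the doubled, generated output rate."""
    return 1000.0 / native_fps

# At 60 native fps that's ~16.7 ms of added input lag, on top of the
# normal render/display pipeline.
print(round(interpolation_latency_ms(60.0), 1))  # → 16.7
```

The only way around that penalty is genuine prediction (extrapolating the next frame instead of interpolating between two finished ones), which is the hard version of the problem.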

Also, please note that most of those "2x to 4x faster" claims include DLSS 3.0 in some form. Read the fine print: "DLSS Frame Generation on RTX 40 series when applicable". This is very obvious when you look at the Microsoft Flight Simulator benchmarks: that benchmark is heavily CPU-bound and not particularly limited by GPU performance, so those "2x performance" gains probably come mostly from DLSS 3.0 frame generation. Compare it to the other benchmarks under "Today's Games" and the pattern is clear.

tl;dr: Nvidia wants to sell you high-end graphics cards for "Next Generation" games, of which one is Portal and the other Cyberpunk 2077. This is the "buy a 1600W power supply to be future-proof" of graphics cards.
 

Springbok

Karen
<Gold Donor>
9,032
12,625
Yeah, adoption rate for the games has to be there and I struggle to see it with AMD controlling the console gpu space. It'll happen for a handful of bleeding edge games though and be awesome on those titles I'm sure. Is that a selling point of the card though? Not to me. What are the rasterization #'s? Anxious to find out.
 

Hateyou

Not Great, Not Terrible
<Bronze Donator>
16,320
42,427
I don’t feel like any studios really tried pushing the envelope on the 30x0 series, and I’m not sure we will see it with the 40x0 series either. It’s expensive to do, and your market is smaller with those kinds of games. I mean, Elden Ring looked good, but you can tell they weren’t trying too hard; it could have been improved in a ton of areas, but it would have run like shit for a lot of people if they had done that.
 

mkopec

<Gold Donor>
25,410
37,503
Yeah, adoption rate for the games has to be there and I struggle to see it with AMD controlling the console gpu space. It'll happen for a handful of bleeding edge games though and be awesome on those titles I'm sure. Is that a selling point of the card though? Not to me. What are the rasterization #'s? Anxious to find out.
yeah most of those bleeding edge games end up big ole turds anyway.
 

spronk

FPS noob
22,627
25,698
Microsoft Flight Simulator and Cyberpunk are the only games I can think of that go out of their way to lean into various things like DLSS and RTX and the cards make a huge difference.

Callisto Protocol is out in Dec, God of War 2 also but that's PS5-only for at least 2 years; other than that, not sure what else is out in 2023... Starfield will be optimized for Series X, then there is stuff like FF7R part 2 (is it a PC port at launch?), Jedi Fallen Order 2, Final Fantasy 16, and FF7 Chrono whatever. I'm pretty sure all of those will run perfectly fine on anything, really.

Star Citizen is the 4090's only hope, but by the time it comes out the 6090 will be out
 

Hateyou

Not Great, Not Terrible
<Bronze Donator>
16,320
42,427
Microsoft Flight Simulator and Cyberpunk are the only games I can think of that go out of their way to lean into various things like DLSS and RTX and the cards make a huge difference.

Callisto Protocol is out in Dec, God of War 2 also but thats PS5 only for at least 2 years, other than that not sure what else is out in 2023... Starfield will be optimized for Series X, then there is stuff like FF7R part 2 (is a PC port at launch?), Jedi Fallen Order 2, Final Fantasy 16, and FF7 Chrono whatever. I'm pretty sure all of those will run perfectly fine on anything really.

Star Citizen is the 4090s only hope but by the time it comes out the 6090 will be out
I doubt it will be out that quick. I’m guessing they can get another decade or more of grift from scam citizen.
 

mkopec

<Gold Donor>
25,410
37,503
Microsoft Flight Simulator and Cyberpunk are the only games I can think of that go out of their way to lean into various things like DLSS and RTX and the cards make a huge difference.

Callisto Protocol is out in Dec, God of War 2 also but thats PS5 only for at least 2 years, other than that not sure what else is out in 2023... Starfield will be optimized for Series X, then there is stuff like FF7R part 2 (is a PC port at launch?), Jedi Fallen Order 2, Final Fantasy 16, and FF7 Chrono whatever. I'm pretty sure all of those will run perfectly fine on anything really.

Star Citizen is the 4090s only hope but by the time it comes out the 6090 will be out
It's all about the consoles, bro. Like I said, maybe the PC versions will have more dials in the options to boost the graphics and shit, but most games these days are console-first with ports for PC. I can't even think of a pure PC-type game that came out in the past like 5 yrs. I'm sure they exist, but they are not the bread and butter. Plus now all the indie shit is going old school anyway. Which is great, BTW, because it focuses on gameplay rather than OOOOH Shiny graphics.
 

Elderan

Blackwing Lair Raider
590
407
I want to upgrade my 2x 2080 Tis but can't find a CPU I like more than my W-3175X 28-core. I wish Intel would get back into making HEDT CPUs again; I was hoping for some 38- and 54-core ones.
 

mkopec

<Gold Donor>
25,410
37,503
I want to upgrade my 2x 2080tis but cant find a cpu I like more than my w3175x 28 core. Wish intel would get back into making HEDT cpus again was hoping for some 38 and 54 core ones.
Mind me asking why? Do you do some science/engineering modeling type shit on that thing that requires all them cores?
 

spronk

FPS noob
22,627
25,698
rendering accurate anime titties requires massive rasterization power


according to that:

3090 Ti with RTX and DLSS 2.0 on hits 60 fps in Cyberpunk
4090 with RTX and DLSS 3.0 hits 170 fps

50C temps on the 4090, 2850 MHz stock, and DLSS 3.0 cuts GPU wattage by 25% compared to the 3090.

in practice i doubt i'd notice any of it, but numbers are cool
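taking those numbers at face value, the energy-per-frame arithmetic is kind of wild (back-of-envelope sketch; the 450 W board power for the 3090 Ti is my assumed round number, not from the slides):

```python
# Back-of-envelope: ~60 fps on the 3090 Ti vs. ~170 fps on the 4090
# drawing ~25% less power, per the figures quoted above.
# 450 W reference board power for the 3090 Ti is an assumption.
ref_watts = 450.0
fps_old, fps_new = 60.0, 170.0
new_watts = ref_watts * 0.75              # "cuts GPU wattage by 25%"

j_per_frame_old = ref_watts / fps_old     # joules per displayed frame
j_per_frame_new = new_watts / fps_new

print(round(j_per_frame_old, 2), round(j_per_frame_new, 2))  # → 7.5 1.99
```

so even counting the generated frames, energy per displayed frame drops to roughly a quarter. whether a generated frame "counts" is the whole argument, of course.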
 

Elderan

Blackwing Lair Raider
590
407
Mind me asking why? Do you do some science/engineering modeling type shit on that thing that requires all them cores?

I mainly need the cpu power for running backtesting simulations for automated trading software. The more cores the better.
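For what it's worth, parameter-sweep backtesting is the textbook embarrassingly-parallel workload, which is why core count matters so much there: every run is independent. A generic sketch (not his actual software; the strategy, parameter grid, and scoring are all made up for illustration):

```python
from concurrent.futures import ProcessPoolExecutor
from itertools import product

def backtest(params):
    """Placeholder for one independent backtest run; real code would
    replay historical data against a strategy with these parameters
    and return a performance metric."""
    fast, slow = params
    return (fast, slow, slow - fast)  # stand-in for a P&L figure

if __name__ == "__main__":
    # Each (fast, slow) combination is independent, so the sweep
    # scales almost linearly with core count -- ~28x on a 28-core part.
    grid = [(f, s) for f, s in product(range(5, 20), range(20, 60)) if f < s]
    with ProcessPoolExecutor() as pool:  # defaults to one worker per core
        results = list(pool.map(backtest, grid, chunksize=16))
    print(max(results, key=lambda r: r[2]))
```

With work shaped like this, more cores beat faster cores almost every time, which explains the attachment to HEDT parts.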