Nvidia GeForce RTX 40x0 cards - 2x the power consumption, 2x the performance

Neranja

<Bronze Donator>
2,605
4,143
I'm sure there is a 4090 Ti lol, it's just not announced until they are ready to fuck our wallets.
The Ti cards usually come out when the node stabilizes and yields from the fab increase, making the binning process for top-end chips possible. Sure, you could get a 4090 Ti today, but they could only produce it in homeopathic doses.
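Back-of-the-envelope sketch of why that is, using a simple Poisson zero-defect yield model; the die area and defect densities below are made-up illustration numbers, not real fab data:

```python
import math

# Poisson zero-defect yield: the share of dies with no defects at all
# is exp(-defect_density * die_area). A fully-enabled Ti needs one of those.
# All numbers here are illustrative assumptions, not real TSMC/Nvidia figures.
DIE_AREA_CM2 = 6.0  # a huge ~600 mm^2 flagship die (assumed)

def flawless_die_fraction(defects_per_cm2: float) -> float:
    """Expected share of dies with zero defects."""
    return math.exp(-defects_per_cm2 * DIE_AREA_CM2)

# Defect density typically falls as the node matures.
for d0 in (0.30, 0.15, 0.07):
    print(f"D0 = {d0:.2f}/cm^2 -> {flawless_die_fraction(d0):.0%} of dies fully intact")
```

With those made-up numbers it works out to roughly 17%, 41%, and 66% of dies coming out fully intact, which is the difference between "homeopathic doses" and an actual product launch.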

Also, they need something in reserve if AMD hits it out of the park. RDNA 2 came too close for comfort, so you got a 3090 Ti instead of a Titan. Having a "Titan" card nearly beaten by a competitor would damage the brand.
 

Malkav

French Madman
2,686
1,583
The Ti cards usually come out when the node stabilizes and yields from the fab increase, making the binning process for top-end chips possible. Sure, you could get a 4090 Ti today, but they could only produce it in homeopathic doses.

Also, they need something in reserve if AMD hits it out of the park. RDNA 2 came too close for comfort, so you got a 3090 Ti instead of a Titan. Having a "Titan" card nearly beaten by a competitor would damage the brand.

I'd gladly switch to AMD but the issue is that Nvidia managed to lock in a lot of people with all their proprietary shit. Between Gsync, RTX, DLSS (FSR isn't up to par yet), Gamestream, that's a lot of things to forgo if I make the switch.
 

Neranja

<Bronze Donator>
2,605
4,143
I'd gladly switch to AMD but the issue is that Nvidia managed to lock in a lot of people with all their proprietary shit. Between Gsync, RTX, DLSS (FSR isn't up to par yet), Gamestream, that's a lot of things to forgo if I make the switch.
What you mentioned is gaming stuff, but where Nvidia is really king is the creator market: things like RTX Voice, or any GPU-accelerated application such as Resolve, Premiere, After Effects, etc.

While AMD is still lagging behind, they have started putting in the effort. Their drivers have improved, and they have started shipping alternatives to Nvidia's software. It will take some time though, and they may never catch up to Nvidia.
 

Malkav

French Madman
2,686
1,583
What you mentioned is gaming stuff, but where Nvidia is really king is the creator market: things like RTX Voice, or any GPU-accelerated application such as Resolve, Premiere, After Effects, etc.

While AMD is still lagging behind, they have started putting in the effort. Their drivers have improved, and they have started shipping alternatives to Nvidia's software. It will take some time though, and they may never catch up to Nvidia.

It really feels like Nvidia has an iron grip on devs and studios at this point, locking them into partnerships to support its proprietary tech. It really is an uphill battle for AMD.
 

Neranja

<Bronze Donator>
2,605
4,143
It really feels like Nvidia has an iron grip on devs and studios at this point, locking them into partnerships to support its proprietary tech. It really is an uphill battle for AMD.
At the same time, I don't know anyone who really likes Nvidia. Their behavior is a lot like Microsoft's before the antitrust lawsuit.

AMD on the other hand has the console market (except for the Nintendo Switch), which makes it hard for game development studios to outright ignore them.
 

Brahma

Obi-Bro Kenobi-X
11,947
42,276
I'd gladly switch to AMD but the issue is that Nvidia managed to lock in a lot of people with all their proprietary shit. Between Gsync, RTX, DLSS (FSR isn't up to par yet), Gamestream, that's a lot of things to forgo if I make the switch.

DLSS and G-Sync are the only ones I personally cared about. Is there even a difference at this point between G-Sync and Freesync? All those other features I never bothered with.

I don't even bother with DLSS any longer with a 3080 Ti.
 

Daidraco

Golden Baronet of the Realm
9,210
9,313
No matter what happens with AMD, it's still worth it to wait till after they release their shit in November. Not only to see if the card is as nice as we would hope, but to see if Nvidia tries to price compete.
 

Springbok

Karen
<Gold Donor>
9,021
12,580
[attached image]
 
Reactions: 3 × Worf, 1 × WTF, 1 × Mother of God (4 users)

Malakriss

Golden Baronet of the Realm
12,343
11,732
Soon we'll just have air conditioners built in for how much that fat ass sweats.
 

Neranja

<Bronze Donator>
2,605
4,143
Not only to see if the card is as nice as we would hope, but to see if Nvidia tries to price compete.
If the rumors are true, this generation will be a hard time for Nvidia: production costs for Lovelace are up 50% from Ampere, while AMD's costs are only slightly higher, mostly for the added RAM chips (going from a 256-bit bus to a 384-bit bus).

So AMD could price their 7800 XT at $699 to $799, well below the 4080 12GB while easily outperforming it, and still make a nice profit. That would eat up a lot of the low-to-midrange market from Nvidia, which is probably where the bulk of sales for both companies is.
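For reference, the memory side of that cost is just chip count: GDDR6 packages have a 32-bit interface, so bus width fixes how many chips go on the board. A quick sketch (capacity and price per chip are assumptions, not real BOM figures):

```python
# GDDR6 packages have a 32-bit interface, so bus width fixes the chip count.
# Capacity and price per chip are assumptions for illustration only.
CHIP_BUS_BITS = 32
CHIP_CAPACITY_GB = 2          # 16 Gb per package (assumed)
ASSUMED_CHIP_PRICE_USD = 12.0 # made up

def memory_config(bus_width_bits: int):
    chips = bus_width_bits // CHIP_BUS_BITS
    return chips, chips * CHIP_CAPACITY_GB, chips * ASSUMED_CHIP_PRICE_USD

for bus in (256, 384):
    chips, vram_gb, cost = memory_config(bus)
    print(f"{bus}-bit bus: {chips} chips, {vram_gb} GB VRAM, ~${cost:.0f} of memory")
```

Going from 8 to 12 chips adds maybe a few tens of dollars to the board, which is small next to the rumored 50% jump on Nvidia's side.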
 
Reactions: 1 × Like (1 user)

Leadsalad

Cis-XYite-Nationalist
5,965
11,930
I hate messing with water cooling shit, though. ffs.
We're just hitting the limits of fin space and the usually terrible fans they stick on these things. I'd have expected them to at least go to 25mm fans to improve static pressure and airflow, but apparently they just added more metal and re-used the same thin, mediocre fans they were using before.
 

Mist

Eeyore Enthusiast
<Gold Donor>
30,410
22,191
DLSS and G-Sync are the only ones I personally cared about. Is there even a difference at this point between G-Sync and Freesync?
Yes, actually.

This video is really good, because they compare the ASUS XG27AQM with the PG279QM, which are effectively the same monitor, except one has a G-SYNC module and the other is just G-SYNC-compatible Freesync.


Here's the moneyshot though:

[attached chart: pixel overshoot comparison]


Without the actual proprietary G-SYNC module, the pixel overshoot gets MUCH worse as the FPS drops, leading to a huge loss of image quality (ghosting, inverse ghosting, whatever) right when framerates are already dropping.
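The usual explanation is overdrive tuning: the G-SYNC module adjusts overdrive per refresh rate, while most Freesync panels ship one overdrive table tuned near max refresh, so the lower the framerate, the longer an overshot pixel sits on screen. Rough frame-time arithmetic shows the scale (just math, not measurements from the video):

```python
# Frame time grows quickly as FPS drops; with a fixed overdrive tune,
# an overshot pixel is held for the whole (longer) frame.
# Purely illustrative arithmetic, not data from these monitors.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

TUNED_AT_FPS = 240  # assume the overdrive table was tuned near max refresh

for fps in (240, 144, 90, 60, 48):
    extra = frame_time_ms(fps) - frame_time_ms(TUNED_AT_FPS)
    print(f"{fps:>3} fps: frame lasts {frame_time_ms(fps):5.1f} ms "
          f"({extra:4.1f} ms longer than the tuned case)")
```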

There are other differences too.
 

Mist

Eeyore Enthusiast
<Gold Donor>
30,410
22,191
The PG279QM has a hardware G-SYNC module; the XG27AQM is merely Freesync/G-SYNC-compatible. The XG27AQM overclocks to 270 Hz, while the PG279QM only goes to 240 Hz.

Other than that, they're basically the same panel.