NVidia GeForce RTX 30x0 cards

Quineloe

Ahn'Qiraj Raider
6,978
4,464
People who buy cards in the 3080ti price range usually run high resolution monitors, so they're more likely to run into VRAM issues, and most of them want to use the card for a few years.

If they run into VRAM issues in 2022, they're not gonna be okay with that.
 

Xexx

Vyemm Raider
7,722
1,803
How about telling us what new info (if any) is included in this video instead of requiring people to spend 25 minutes watching this dipshit ramble on?

3060/3070 around 250W - the 3090 can be over 400W, with 350W as the minimum.

Supposed release schedule is

3080 10GB mid sept
3090 paper launch same time
3080 20GB before 3070 (late sept)
3070 8GB early oct (16GB TBD)
3060 8GB before thanksgiving

- a good place to start the video is around 11:35
 

bytes

Molten Core Raider
957
638
Who is going to buy the baseline 3080 if there are 16gb models for the 3070 coming and 20gb for the 3080 itself? It seems like a completely pointless model.
 

Aazrael

Golden Baronet of the Realm
3,166
11,700
So what does the $1000 reference price in the leaks mean in real-life money once they hit the stores?
 

Quineloe

Ahn'Qiraj Raider
6,978
4,464
Who is going to buy the baseline 3080 if there are 16gb models for the 3070 coming and 20gb for the 3080 itself? It seems like a completely pointless model.
Companies selling pre-built gaming rigs. They'll probably get massive discounts on the cards, and they get to advertise with the big 3080 number.

The card will do very well at launch, and when it starts to struggle halfway through its supposed life cycle, these guys will just tell themselves "lol computers are outdated so fast" and actually believe it.
 

Mist

REEEEeyore
<Gold Donor>
31,084
23,420
The 3080 really should have been 12gb; that seems plenty practical for 4k or triple 1440p. 16gb doesn't seem like something you would ever really hit.
 

Xexx

Vyemm Raider
7,722
1,803
I don't think I hit the 11GB on my 2080ti - I am unsure what games would even hit the 10GB on some of the future cards.
 

Mist

REEEEeyore
<Gold Donor>
31,084
23,420
I don't think I hit the 11GB on my 2080ti - I am unsure what games would even hit the 10GB on some of the future cards.
It would take 4k + really big textures + a bunch of extra effects to hit 10gb.

If the 3080 is really a 320W+ card, I just don't see the fucking point of it. Anyone that can build around that much power consumption + heat is just going to buy the 3090 at thirty more watts and not give a shit about the extra 600 dollars because this is clearly their main hobby.

They really should have kept the 3080 under 275W, so you could still put one in a "normal" case with a "normal" power supply.
 

LiquidDeath

Magnus Deadlift the Fucktiger
5,034
11,828
Yeah, if these are right I'm just going to wait for the 16GB 3070 since all I care about is pushing higher frames at 1440p.
 

Mist

REEEEeyore
<Gold Donor>
31,084
23,420
Yeah, if these are right I'm just going to wait for the 16GB 3070 since all I care about is pushing higher frames at 1440p.
Then why do you need 16gb?

1440p would never even hit 8gb, even with every post-processing effect, reflection, and shadow option you could possibly think of cranked.
 

LiquidDeath

Magnus Deadlift the Fucktiger
5,034
11,828
Then why do you need 16gb?

1440p would never even hit 8gb, even with every post-processing effect, reflection, and shadow option you could possibly think of cranked.

I was under the impression that the RAM amount was the significant limiting factor between cards when trying to push higher frames at resolutions beyond 1080p. If this isn't the case, then what is it?
 

Mist

REEEEeyore
<Gold Donor>
31,084
23,420
So everything that isn't RAM is responsible for FPS?
FPS drops off like a rock if you have insufficient RAM to hold the frame buffer and textures in memory. It is a "you must be this tall to ride" limitation, not a "more is better" metric.

The previous biggest memory hog of AAA games was GTA 5, and that still only used about 5gb of RAM at 4k with full reflections. This might have finally been surpassed by Horizon: Zero Dawn, which is way outside the bounds of normal because it's probably ported like shit.
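For rough context, here's a back-of-the-envelope sketch of where the frame buffer part actually lands (assuming plain 32-bit color, a 32-bit depth buffer, triple buffering and no compression - real engines vary):

# Rough frame buffer math; all assumptions, not measured numbers.
def framebuffer_mib(width, height, bytes_per_pixel=4, buffers=3):
    # one color target per buffered frame plus a single depth/stencil target
    color = width * height * bytes_per_pixel * buffers
    depth = width * height * 4
    return (color + depth) / 2**20

for name, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    print(f"{name}: ~{framebuffer_mib(w, h):.0f} MiB of frame buffers")

# 4K comes out around 130 MiB - the other 95%+ of VRAM goes to textures,
# geometry and render-target-heavy effects, which is why only huge texture
# packs push a game past 8-10gb.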
 

Xexx

Vyemm Raider
7,722
1,803
It would take 4k + really big textures + a bunch of extra effects to hit 10gb.

If the 3080 is really a 320W+ card, I just don't see the fucking point of it. Anyone that can build around that much power consumption + heat is just going to buy the 3090 at thirty more watts and not give a shit about the extra 600 dollars because this is clearly their main hobby.

They really should have kept the 3080 under 275W, so you could still put one in a "normal" case with a "normal" power supply.
Well, what do you see as normal? I feel like 850 is normal and 1000 is high, though I've always bought 750 or 850 myself. If the 3090 is $1400ish I will grab one. I'd hope my 850W 80+ Platinum is enough for it.

Also, RAM speed can give more FPS, anywhere from 2 to 15.
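For what it's worth, the bandwidth math behind that (using the rumored 19 Gbps GDDR6X on a 320-bit bus for the 3080 vs the 2080ti's 14 Gbps GDDR6 on 352-bit - treat the 3080 figures as rumor, not spec):

def bandwidth_gb_s(gbps_per_pin, bus_width_bits):
    # total bandwidth = per-pin data rate * bus width in bits, divided by 8 for bytes
    return gbps_per_pin * bus_width_bits / 8

cards = {
    "2080ti (GDDR6, 14 Gbps, 352-bit)": (14, 352),
    "3080 (GDDR6X, 19 Gbps, 320-bit, rumored)": (19, 320),
}
for name, (rate, bus) in cards.items():
    print(f"{name}: ~{bandwidth_gb_s(rate, bus):.0f} GB/s")

# ~616 GB/s vs ~760 GB/s, about 23% more bandwidth - which only shows up as
# a few extra FPS unless the game is actually bandwidth-bound.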

 

Mist

REEEEeyore
<Gold Donor>
31,084
23,420
Yeah I'd say a "normal" single GPU gaming PSU is 650 or 750, and 850-1000 is high.

I'd be worried about running a 320W video card on a 750W PSU for 3-4 years. You're probably putting ~600W of load on it most of the time you're gaming.
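Rough load math behind that ~600W guess (every per-component wattage here is an assumption for illustration, not a measurement):

gpu_w = 320 * 1.2    # leaked 3080 board power plus ~20% for transient spikes (assumed)
cpu_w = 150          # gaming load on a high-end CPU (assumed)
rest_w = 60          # motherboard, RAM, drives, fans (assumed)
psu_w = 750

load_w = gpu_w + cpu_w + rest_w
print(f"estimated load: {load_w:.0f} W = {load_w / psu_w:.0%} of a {psu_w} W unit")

# ~594 W, right around 80% of a 750 W PSU - the zone where efficiency, noise
# and long-term headroom start to matter, hence wanting more for a 320W card.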
 

Quineloe

Ahn'Qiraj Raider
6,978
4,464
FPS drops off like a rock if you have insufficient RAM to hold the frame buffer and textures in memory. It is a "you must be this tall to ride" limitation, not a "more is better" metric.

The previous biggest memory hog of AAA games was GTA 5, and that still only used about 5gb of RAM at 4k with full reflections. This might have finally been surpassed by Horizon: Zero Dawn, which is way outside the bounds of normal because it's probably ported like shit.

I'm told MSFS2020 can easily hog 8 GB?