NVidia GeForce RTX 30x0 cards

Vorph

Bronze Baronet of the Realm
11,003
4,743
That's why I have less than zero interest in buying a 144Hz monitor.
 
  • 1Like
Reactions: 1 user

Vorph

Bronze Baronet of the Realm
11,003
4,743
I already have one less-than-ideally supported on PC / completely unsupported on console thing (ultrawide), definitely could not deal with a second one.
 

a_skeleton_05

<Banned>
13,843
34,508
I've switched back and forth several times, and you become accustomed to the change either way within a short period of time. It's hard to resist wanting above 60Hz once you've been exposed to it even for a short period, though.
 

Argarth

On the verandah
1,210
1,050
Went from 60 to 75Hz when I bought my first Ultrawide, and I noticed a big difference. Quite surprised really, though it is a 25% increase.

My latest UW is also 75Hz and I'm more than happy with the compromise. Wouldn't be able to drive it much past 100 on max detail for newer games anyway (3840x1600).

I don't play competitive online shooters, but I'd probably buy a second monitor for that task alone if I did. Just can't live without the expansive world-view on my UW =)
 

jooka

marco esquandolas
<Bronze Donator>
14,415
6,132

Apparently, the 3080 Ti will not be the flagship product, but an RTX 3090 Ti/SUPER model that flaunts 24 GB of GDDR6X memory with a 384-bit interface and TBP of 350 watts. That’s almost double the RAM of the 3080 model (check out the table below for the specifics).


That's not sounding very affordable, and bump up your power supply while you're at it.
 
  • 2Worf
  • 1Like
  • 1Mother of God
Reactions: 3 users

rhinohelix

<Gold Donor>
2,873
4,674
That's not sounding very affordable, and bump up your power supply while you're at it.
Well, you have to look at the chips they are built on. As I understand it presently, the 3080 Ti was going to be built on the GA103 but is now going to be built on the GA102 to better compete with RDNA 2.0 and be what the 2080 Ti was for Turing. Ampere's Titan, IIRC, is going to be the 3090 Ti, be built on the GA100, have everything under the sun, and be the $1500 card Titans used to be, while the 3080 Ti is going to land somewhere between $799-999. Although now I have heard both price points, and who knows whether they were talking about the 3080 Ti or the 3080, whether things have changed, whether they were even right when I heard them, and whether I am recalling the information correctly, so CAVEAT EMPTOR like a MF'er.
 
  • 1Like
Reactions: 1 user

uniqueuser

Vyemm Raider
1,737
4,889
You ain't got shit on this.

View attachment 276200
Bitch please...

stillgoing.png
 
  • 2Worf
  • 1Seriously?
  • 1Like
Reactions: 4 users

ToeMissile

Pronouns: zie/zhem/zer
<Gold Donor>
2,725
1,663
It's been 3 years since I upgraded from an i5-760 & GTX 460 to a Ryzen 5 1600 and GTX 1070. Still have the same 24", 1920 monitor, but I'm running it at 72Hz instead of 60Hz. It's a pretty big difference, like Argarth mentioned above.

I don't really play much beyond Overwatch and some PoE these days, so I'm not in a hurry to upgrade. If anything, it'll be a monitor first; at least 27", 2K, IPS, 100Hz, under $400?
 

Quineloe

Ahn'Qiraj Raider
6,978
4,463
That's not sounding very affordable, and bump up your power supply while you're at it.
But there will be a FUCKLOAD of marketing talk in all the media about how Nvidia is still top dog over AMD because of that card, yet on the Steam hardware survey the card will never go above 0.2% due to its four-figure price tag, so mission accomplished.

The vast majority of gamers use either budget cards (Nvidia 50s) or midrange cards (Nvidia 60s), so 4K gaming isn't happening in 2020 either.

And that's the deal: people buy 3050s and 3060s because of the media hype around the 3090 Ti.
 
  • 1Like
Reactions: 1 user

a_skeleton_05

<Banned>
13,843
34,508
Going from 60 to 75Hz is a huge step because there are diminishing returns as you go higher, and most people will not notice much of a difference past a certain point.

60 = 16.6ms (time to display a frame)
75 = 13.3
120 = 8.3
144 = 6.9

I can notice a difference up to 130ish but I've done a lot of testing and I'm quite satisfied around 110.

There are also response times to consider, though, so simply being fast won't matter if the monitor is shit and can't actually keep up.
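
If anyone wants to sanity-check those numbers, here's a minimal Python sketch (nothing assumed beyond the refresh rates mentioned in this thread): frame time is just 1000 ms divided by the refresh rate, and the per-Hz savings shrink fast as you climb.

# Frame time in milliseconds for a given refresh rate in Hz.
# 1000 / Hz is where the 16.6 / 13.3 / 8.3 / 6.9 figures above come from.
def frame_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

rates = [60, 75, 120, 144, 240]
for hz in rates:
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):.1f} ms per frame")

# Diminishing returns: milliseconds saved per extra Hz shrink as the rate climbs.
for low, high in zip(rates, rates[1:]):
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} -> {high} Hz: saves {saved:.1f} ms, {saved / (high - low):.3f} ms per extra Hz")

Running it, 60 to 75Hz buys roughly 0.22 ms per extra Hz, while 120 to 144Hz buys only about 0.06 ms per extra Hz, which is the diminishing-returns point above in concrete numbers.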
 
  • 1Like
Reactions: 1 user

a c i d.f l y

ಠ_ಠ
<Silver Donator>
20,060
99,460
8086K @ 5GHz
1080Ti OC
4K 60hz monitor

I use it to browse the internet and run emulators.... 🤔
 
  • 2Worf
Reactions: 1 user

Lunis

Blackwing Lair Raider
2,258
1,504
Anything less than 120Hz is very noticeable to me after using a 144Hz monitor for 4 years now. 240Hz is overkill for anything but fast-paced FPS games, imo. You do notice the benefit of 240Hz when you turn fast in a shooter... but you have to be pulling at least 240 fps for it to matter anyway. 144Hz seems like the sweet spot for now.
 

ver_21

Molten Core Raider
975
-361
But there will be a FUCKLOAD of marketing talk in all the media about how Nvidia is still top dog over AMD because of that card, yet on the Steam hardware survey the card will never go above 0.2% due to its four-figure price tag, so mission accomplished.

The vast majority of gamers use either budget cards (Nvidia 50s) or midrange cards (Nvidia 60s), so 4K gaming isn't happening in 2020 either.

And that's the deal: people buy 3050s and 3060s because of the media hype around the 3090 Ti.

I keep hoping the market realizes how good AMD APUs are. Maybe once they start having 6 and 8 cores, they will stop looking like such a limited option.
 

Noodleface

A Mod Real Quick
37,961
14,508
Yeah, I've watched a lot of comparison videos - like when they sit Shroud in front of 60, 144, and 240Hz displays and see how he reacts. There's a definite noticeable improvement. I'd say between 120-144 is the sweet spot for me (I've never looked at 240), but when it dips lower I can feel it. 60 feels absolutely disgusting. That's not to say gaming at 60 is disgusting, but to me, after 4-5 years at 144Hz, it feels disgusting.

That said, if a console actually played at 60 fps it'd be a miracle.
 

Pyros

<Silver Donator>
11,059
2,262
I've been fine at 60 FPS, and it's very noticeable higher too, but when I bought my recent PC I went with 4K 60 FPS rather than higher FPS (and while eventually it'll be 4K 144 FPS, shit isn't there yet, and screens for 4K 60+ FPS were stupidly expensive). Haven't regretted it so far, but the fact that I don't constantly play at 144 (or 120) to compare is most likely why. 4K makes even shit stuff look pretty; the main issue is old shit not having scalable UIs, so it's small as fuck. I also don't play any FPS (well, I'm in the process of playing Doom, but that's like the first FPS I've played since the release of Overwatch, and I only played that a few weeks; before that it was, I think, Far Cry 3) or fighting games, so framerate isn't really that important a factor for me, as long as it's stable.
 

jooka

marco esquandolas
<Bronze Donator>
14,415
6,132
Well, you have to look at the chips they are built on. As I understand it presently, the 3080 Ti was going to be built on the GA103 but is now going to be built on the GA102 to better compete with RDNA 2.0 and be what the 2080 Ti was for Turing. Ampere's Titan, IIRC, is going to be the 3090 Ti, be built on the GA100, have everything under the sun, and be the $1500 card Titans used to be, while the 3080 Ti is going to land somewhere between $799-999. Although now I have heard both price points, and who knows whether they were talking about the 3080 Ti or the 3080, whether things have changed, whether they were even right when I heard them, and whether I am recalling the information correctly, so CAVEAT EMPTOR like a MF'er.

The 3090 Ti being a Titan makes sense. If the 3080 ends up in the $500-600 range and is as good as they say, I'd consider getting one and selling my 2080. Sucks that Nvidia doesn't do Apple drivers anymore; I'd love to use the 2080 in a hackintosh.
 

slippery

<Bronze Donator>
7,892
7,705
"The PCB and these cables alone will be a real challenge for the manufacturers of GPU water blocks."

You rat fucks
 
  • 1Worf
Reactions: 1 user