New Nvidia GeForce RTX 2080 cards released

Chanur

Shit Posting Professional
<Gold Donor>
26,748
39,100
I would like to upgrade my 970 even though it still plays everything pretty well for me. I am only on 1080p/60Hz. I think spending $700+ on a video card is lunacy though, so I probably won't be doing that either.
 
  • 1Like
Reactions: 1 user

Erronius

Macho Ma'am
<Gold Donor>
16,491
42,462
And that’s not respecting a customer. Either way! And the fact is, WE have the power to stop it by not buying their bullshit!

Look what’s happening to Battlefield 5. Look what happened with Battlefront 2. Get on the bandwagon and WE collectively can change it in a fucking heartbeat.
You've never seemed like someone who liked shitty paradigms where a userbase is shit on and taken advantage of, LOL

MMOs and MMO devs are one example (But Curt loves you, Ut!)
 
  • 2Like
Reactions: 2 users

Utnayan

F16 patrolling Rajaah until he plays DS3
<Gold Donor>
16,314
12,083
I would like to upgrade my 970 even though it still plays everything pretty well for me. I am only on 1080p/60Hz. I think spending $700+ on a video card is lunacy though, so I probably won't be doing that either.

Yeah, at 1080p you're still good for another decade at this point. The only people buying this shit are the same idiots that post their PC specs in their signature next to their 3DMark score, and half of those idiots got it for free anyway.
 
  • 1Like
Reactions: 1 user

Chanur

Shit Posting Professional
<Gold Donor>
26,748
39,100
Part of the issue is I have been primarily on the PS4 for the last year, between Bloodborne and all the Dark Souls games. So for the first time since I got my first computer, PC gaming has taken a back seat for a bit.
 
  • 1Like
Reactions: 1 user

spronk

FPS noob
22,706
25,841
One of the reasons I like G-Sync is that I don't have to fuck with settings on new games. Without it you want to get as close to max refresh (120Hz, 144Hz, etc.) as possible, which means fucking with settings until you hit a stable FPS. With G-Sync it's not a big deal if the game runs at 67 fps or 88 fps; the monitor adapts to your game instead of you trying to get the game to adapt to your monitor. It definitely extends the life of your card well past the point where you'd otherwise feel like you need to upgrade. I've been in the Forza Horizon 4 beta, playing on both PC and the X, and on PC at 4K it looks drop-dead gorgeous, but on a 1080 I'm locking it to 72 fps, which is fine for me on a G-Sync monitor. I noticed screen tearing immediately when I was playing windowed with FPS uncapped.

I'm sure FreeSync is probably just as good, but Nvidia cards kick AMD cards' asses in PCs and Nvidia cards don't support FreeSync... so...

And yeah, HDR support in PC gaming is still wonky as fuck between Windows 10 fucking with shit, fullscreen vs. windowed, 4K vs. 1080p, FPS issues, etc. It kinda feels like no one important cares enough about HDR10+ and HDMI variable refresh rate to make it all work seamlessly in PC gaming, which is a shame.
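Since the "lock it to 72 fps" bit comes up a lot: here's a minimal sketch of what an in-game frame cap is actually doing (plain Python with hypothetical render_frame/still_running callbacks, nothing from Forza or the Nvidia driver). The loop does the frame's work, then sleeps off whatever is left of the per-frame budget; a G-Sync/VRR display just follows whatever rate falls out of that, while a fixed-refresh display only stays tear-free if the rate is synced to its refresh.

```python
import time

TARGET_FPS = 72                      # hypothetical cap, like locking Forza to 72 fps above
FRAME_BUDGET = 1.0 / TARGET_FPS      # seconds allowed per frame (~13.9 ms)

def run_capped(render_frame, still_running):
    """Minimal fixed-FPS limiter: do the frame's work, then sleep off the leftover budget."""
    while still_running():
        start = time.perf_counter()
        render_frame()                                        # stand-in for the game's update + draw
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)                              # pace frames to ~72 fps
        # if leftover <= 0 the frame ran long; on a fixed-refresh display without VRR,
        # that mismatch with the panel's refresh is where tearing and stutter show up
```

Purely illustrative; real games do this with higher-resolution timers and driver/engine-level sync, but the idea is the same.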
 
  • 2Like
Reactions: 2 users

Vorph

Bronze Baronet of the Realm
11,019
4,782
Like there is no value in FreeSync. G-Sync does it better.
This is simply not true. You have to choose whichever one matches the brand of your GPU, but they both do the job of adaptive sync just fine.

I've been using ATI/AMD exclusively for almost 15 years now because I fucking hate Nvidia's business practices (which just keep getting worse and worse; look at the recent NDA debacle with HardOCP and other enthusiast sites). I've never had a single problem with drivers or anything else people always bring up about AMD, barring a 24-hour period where I had to roll back because they broke something with Skyrim on the day it and the new AMD beta drivers were both released.

As for DV vs. HDR10, DV is clearly the better format right now, but I would much rather see HDR10+ win in the long run. DTS needs to catch up to Atmos too, so we can go back to the good old days of ignoring Dolby entirely.
 
  • 1Like
Reactions: 1 user

Mist

Eeyore Enthusiast
<Gold Donor>
30,480
22,330
I would like to upgrade my 970 even though it still plays everything pretty well for me. I am only on 1080p/60Hz. I think spending $700+ on a video card is lunacy though, so I probably won't be doing that either.
The GTX 1060 is the last card anyone will ever need for 1080p @ 60Hz unless they want this raytracing shit.
 
  • 2Like
Reactions: 2 users

a c i d.f l y

ಠ_ಠ
<Silver Donator>
20,060
99,460
Dolby Atmos with its smart speaker positioning is pretty damn amazing on an applicable film. DTS-HD MA was the first noticeable improvement in sound fidelity over previous audio standards in a few fucking decades, but lossless audio with smart speaker positioning and room compensation just blows my mind. We'll see what DTS comes out with.
 
  • 1Like
Reactions: 1 user

Chanur

Shit Posting Professional
<Gold Donor>
26,748
39,100
The GTX 1060 is the last card anyone will ever need for 1080p @ 60Hz unless they want this raytracing shit.
Was thinking of getting this for my wife, actually. She told me to upgrade and give her my 970. I think the new gen of cards is too much, and I'm not sure a 1070 would be a big enough upgrade to spend the money on.
 

Mist

Eeyore Enthusiast
<Gold Donor>
30,480
22,330
Was thinking of getting this for my wife, actually. She told me to upgrade and give her my 970. I think the new gen of cards is too much, and I'm not sure a 1070 would be a big enough upgrade to spend the money on.
I'm sure the 2070 will be down to $400ish before Christmas.

People are making way too big a deal out of launch prices. Without cryptocurrencies pushing prices up, I expect Nvidia to price these aggressively after the initial launch. The fact that they are launching with the full 2080 Ti now, rather than holding it back for later, means they are already getting good yields on this process.
 
  • 2Like
Reactions: 2 users

a c i d.f l y

ಠ_ಠ
<Silver Donator>
20,060
99,460
This is simply not true. You have to choose whichever one matches the brand of your GPU, but they both do the job of adaptive sync just fine.

I've been using ATI/AMD exclusively for almost 15 years now because I fucking hate Nvidia's business practices (which just keep getting worse and worse; look at the recent NDA debacle with HardOCP and other enthusiast sites). I've never had a single problem with drivers or anything else people always bring up about AMD, barring a 24-hour period where I had to roll back because they broke something with Skyrim on the day it and the new AMD beta drivers were both released.

As for DV vs. HDR10, DV is clearly the better format right now, but I would much rather see HDR10+ win in the long run. DTS needs to catch up to Atmos too, so we can go back to the good old days of ignoring Dolby entirely.

I was a hardcore AMD underdog fan going back to their first 486 and the Rage series cards. I'm running Intel and Nvidia for the first time in what, 20 years? But the Ryzen 2700 would have been more expensive than the 8086K once the motherboard and RAM were taken into consideration (even though I ended up overspending on RAM), and the current Radeon offerings aren't anywhere close in performance when I could get a 1080 Ti for $500. My last setup was an FX-9590 and R9 390X that I'd had for the last several years. Main reason I jumped over? 4K. It also helps that I haven't been financially strapped lately, so I'm not buying as frugally as necessity used to demand.

That all said, I knew the 2080 was coming, so picking up a used card from folks dumping theirs was pretty effective, especially with OC'd 1080 Ti cards still retailing for $750. And as I predicted, the 2080 won't be a huge or even significant boost in performance. Just extra bells and whistles that won't apply to 90% of games.
 

rhinohelix

Dental Dammer
<Gold Donor>
2,907
4,695
1080 Ti and 8700K. Alternating between a 1440p/144Hz monitor and a 4K TV. The system isn't enough to power many newer AAA games at max settings to the full potential of either display. I either have to use a custom resolution on the 4K, accept 70-90ish FPS on the monitor, or drop settings. No problem with older games except for MMOs, though, and I play a lot of those, so most of my gaming isn't being held back.

I kind of wish I had just stuck with 1080p gaming, because buying the monitor took me out of the "ignorance is bliss" aspect of gaming, and now I'm constantly chasing performance, sometimes to the point where it's a detriment to actually enjoying the games. But, on the other hand, playing games like The Witcher 3 at real 4K on a 50-inch television is fucking amazing.
I have been using 43" TVs as monitors for years and can't really go back to something smaller. That means once you hit a 60 fps minimum for 95%+ of playtime, the rest is just gravy/insurance. I have a 1080 Ti with an overclocked 6700K at 4.5-4.7GHz, use a 4K Sony X720E 43" TV as my main prod monitor, set every game to ultra, and never look back.
 

Mist

Eeyore Enthusiast
<Gold Donor>
30,480
22,330
  • 1Like
Reactions: 1 user

rhinohelix

Dental Dammer
<Gold Donor>
2,907
4,695
I haven't, although I think it was being talked about (or already out in Europe) last year when Harvey forced me to replace all of the monitors/TVs that were sitting on the desks in my office. According to Rtings it's faster than the X720E, 9.x ms vs. 18.x ms lag, but no HDR, not that I use it anyway. The base on the LG is also substantially better for desktop use. Although it's a couple of hundred dollars more, it certainly looks like something I would consider should I need to replace one of mine. I had a long run of using LG TVs before I went to a Korean 4K (pixel-perfect Wasabi Mango, /tear), and most of the TVs in my house are still LG.

The Rtings links for the two devices are below:
Rtings review of the LG 43UD79-B
Rtings review of the Sony 43" X720E
 

Pasteton

Blackwing Lair Raider
2,609
1,724
Anyone remember those old demo videos by Future Crew or Triton with ray-traced graphics that were just stills? If you do, then you’ll know that ray tracing should be fucking incredible once it's done in full 3D and totally fleshed out/optimized.
 
  • 1Like
Reactions: 1 user

rhinohelix

Dental Dammer
<Gold Donor>
2,907
4,695
The multiple input thing makes them incredible for work.
[Attached image: 43ud79-inputs-1-small.jpg]


I don't know what you mean.

Edit: For those playing along at home, it's 4 HDMI, 1 DisplayPort, 1 USB-C, and 2 USB 3.0 ports, along with the audio out, IIRC.
Edit 2:
Total inputs from Rtings (as is the pic):
DisplayPort: 1 (DP 1.2)
HDMI: 2 (HDMI 2.0), 2 (HDMI 1.4)
USB: 2 (USB 3.0)
USB-C: 1 (DisplayPort Alternate Mode)
Analog audio out (3.5mm): 1

HDMI 1/2 only support 4K @ 30Hz (HDMI 1.4).
HDMI 3/4 support 4K @ 60Hz (HDMI 2.0).
USB port 1 supports fast charging when the Quick Charge setting is enabled.
___
The first port (in the pic) is an RS-232C.
 
Last edited:
  • 1Like
Reactions: 1 user

Erronius

Macho Ma'am
<Gold Donor>
16,491
42,462
So, basically the August Ames of monitors, is what you're saying

(minus the whole being dead part)
 
  • 2Worf
  • 1Like
Reactions: 2 users

Brahma

Obi-Bro Kenobi-X
12,064
42,998
I would like to upgrade my 970 even though it still plays everything pretty well for me. I am only on 1080p/60Hz. I think spending $700+ on a video card is lunacy though, so I probably won't be doing that either.

Dude...You're good for the next five years at 1080p.
 
  • 1Like
Reactions: 1 user