Desktop Computers

Chancellor Alkorin

Part-Time Sith
<Granularity Engineer>
6,052
10,317
Damn near equal in real-world numbers, but watch out for the heat and power requirements on those 290s.
 

jeydax

Death and Taxes
1,422
960
Holy shit what a coincidence, the Seasonic SS-760XP2 760W is on sale at NewEgg for $100 after $25 rebate and promo code: EMCYTZG587

If you were contemplating this PSU or one similar to it then buy it and buy it now! That is $60-70 off and a fucking KILLER value for that PSU.
 

Joeboo

Molten Core Raider
8,157
140
The nice thing about going up to a 1440p monitor is that you really don't even need AA of any kind (and definitely not 8x or 16x). 1440p is getting close to the point where the pixels are small enough that your eyes can hardly see jagged edges even without AA. I'd suspect that by the time 4K displays are the norm there won't be any need for AA whatsoever. It's always just been a workaround to help disguise shitty resolutions, basically.

So while you definitely take a performance hit going up to 1440p from 1080p, you can make a lot of it back by needing little to no AA.
 

Zodiac

Lord Nagafen Raider
1,200
14
Yeah, no way. At 1440 look at the tops of fences, power lines, buildings, etc. - AA makes a huge difference. Fire up Tomb Raider and watch a zip line as you cycle through AA settings; you would have to be blind not to need AA.
 

Joeboo

Molten Core Raider
8,157
140
Yeah, no way. At 1440 look at the tops of fences, power lines, buildings, etc. - AA makes a huge difference. Fire up Tomb Raider and watch a zip line as you cycle through AA settings; you would have to be blind not to need AA.
I generally can't see a difference most of the time between very low AA (2x or 4x) and higher levels of AA (8x or 16x) at 1440p. In some games I can't see any practical difference with AA as opposed to no AA (mostly games with fast movement that rarely stops for long: racing games, FPS, etc.).

The other benefit is that text usually looks much sharper with no AA, so that's nice.
 

Joeboo

Molten Core Raider
8,157
140
I agree that you get diminishing returns when going to the higher levels of AA, but to say that at 1440 you do not need any AA because of the higher resolution is just misinformation for the people in this thread looking for hardware advice.

Even at 4k AA makes a huge difference.

http://international.download.nvidia...x21-aa-off.png

http://international.download.nvidia...x21-2xtxaa.png
Eh, I wouldn't say that it's huge. At 1440p and especially 4K, AA does nothing at that point for the major stuff you are mostly looking at: characters, buildings, vehicles, trees, etc. Yeah, the occasional fence line or power line looks slightly better, but at that point is the performance hit worth it for nothing but a smoother power line? How many games even have power lines? You can't see any difference in those screenshots in the things that people are looking at 99% of the time (character models and such).
 

Zodiac

Lord Nagafen Raider
1,200
14
The point is that you can run those AA settings on a 1080p monitor and it will look better than a 1440p monitor with no AA or lower AA settings.

AA does nothing at that point for the major stuff you are mostly looking at: characters, buildings, vehicles, trees, etc.
I think most people here would say that there is a significant difference between this:

rrr_img_70407.jpg


and this:

rrr_img_70408.jpg


and that's at a lower (2x) AA setting.
 

Chancellor Alkorin

Part-Time Sith
<Granularity Engineer>
6,052
10,317
2x is probably fine at 1440p, as the jagged edges are much less pronounced (and they would be, at that pixel density). That doesn't mean they're invisible, though.
 

spronk

FPS noob
23,847
29,101
And you're completely wrong. It is quite the opposite.

I could get more data but a video card matters WAY the fuck more than a CPU does when it comes to gaming.
yeah you are right i dunno what i was thinking, but it does feel like it really, really depends on specific games nowadays on what is going to fuck you over. like

- shitty console ports (90% of AAA games nowadays): CPU/GPU are both important up to a point, but stuff like SLI and the top-end cards don't do much. Whatever graphics options they do actually have in game change the visuals so little that most of the time you won't even notice the differences.

- PC-targeted AAA games: utter shit at launch. Crysis 3 and BF4 were horrendous games; Crysis 3 still sucks because it's just a bad game that happens to look pretty. BF4 was a terrible game to play at launch - sure, it had lots of graphical tweaking you could do, but who cares when everything was crashing every 30 minutes or the netcode completely fucked you over.

- indie games, MOBAs, and MMOs: rarely pushing the envelope


I switched from 1080p to 1440p while still playing BF4, Titanfall, Wildstar (beta), TESO (beta), Landmark, etc. and noticed my FPS dropping a little, but not as much as I thought it would for increasing the number of pixels to deal with by ~78%.
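
For reference, the pixel math (a quick back-of-envelope sketch in Python; the 16:9 resolutions are the standard ones, and the worst-case FPS line assumes a purely fill-rate-bound game, which is an assumption, not a benchmark):

```python
# Pixel-count math for the 1080p -> 1440p jump.
# Assumes standard 16:9 resolutions; the FPS line is a worst-case
# estimate for a purely fill-rate-bound game, not a measurement.
px_1080 = 1920 * 1080   # 2,073,600 pixels
px_1440 = 2560 * 1440   # 3,686,400 pixels

ratio = px_1440 / px_1080
print(f"Pixel increase: {ratio - 1:.1%}")            # ~77.8%
print(f"Worst-case FPS: {1 / ratio:.1%} of 1080p")   # ~56.2%, i.e. a ~44% drop
```

So if a game scaled purely with pixel count you'd expect FPS to drop by around 44%; anything better than that suggests you weren't fully GPU-bound at 1080p.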

Mostly I'm just pissed that PC ports of everything nowadays seem to be super weak and not really taking advantage of PC features. I play ps4/wiiu/pc games side by side and the PC versions don't blow me away, and they SHOULD, compared to the consoles. Watch Dogs and Wolfenstein especially were pretty disappointing, and I don't have much hope for Far Cry 4 or Assassins Creed 5 either.

its pretty fucked up i have a pretty powerful PC (i7, 32GB RAM, a 680, RAIDed SSDs), a steam library of 400+ games, and the most fun I have is playing mario kart 8

oh well, I guess it's better than 10 years ago when games were going exclusively to console
 

Denaut

Trump's Staff
2,739
1,279
Even staring at the screenshots I can barely see a difference. I probably wouldn't even notice when playing.
 

Joeboo

Molten Core Raider
8,157
140
Even staring at the screenshots I can barely see a difference. I probably wouldn't even notice when playing.
Yeah, by the time I lean back and relax in my computer chair, I'm probably 3 feet from my 27" 1440p monitor. I'm not seeing those differences while playing a typical game. You'd have to stop and really stare and nitpick for a while to see it.
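
The math roughly backs this up (a back-of-envelope sketch in Python; the 1-arcminute figure for 20/20 acuity is a standard textbook assumption, not something from this thread):

```python
import math

# Angular pixel density of a 27" 2560x1440 monitor viewed from ~3 feet.
# Assumption: 20/20 vision resolves about 1 arcminute, i.e. ~60 pixels/degree.
w_px, h_px, diag_in, dist_in = 2560, 1440, 27.0, 36.0

ppi = math.hypot(w_px, h_px) / diag_in                # ~109 pixels per inch
inches_per_deg = 2 * dist_in * math.tan(math.radians(0.5))
ppd = ppi * inches_per_deg                            # ~68 pixels per degree

print(f"{ppi:.0f} PPI, {ppd:.0f} px/deg (20/20 limit ~60 px/deg)")
```

At ~68 pixels per degree you're just past the 20/20 threshold, which fits aliasing being hard to spot at that distance - though high-contrast edges like power lines can still shimmer, since crawling artifacts in motion can be visible beyond the static acuity limit.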
 

Jovec

?
838
412
My rule of thumb is that if you can tell the difference between Max settings and one step lower when playing the game, then the game probably isn't worth playing.
 

jeydax

Death and Taxes
1,422
960
yeah you are right i dunno what i was thinking, but it does feel like it really, really depends on specific games nowadays on what is going to fuck you over. like
False. Again. Some games do require a little more CPU power, but I've already pointed out that the difference in those is largely negligible. GPU always governs (unless the game is old as fuck or has mediocre graphics, in which case I'd say "WELL NO SHIT!").

- shitty console ports (90% of AAA games nowadays): CPU/GPU are both important up to a point, but stuff like SLI and the top-end cards don't do much. Whatever graphics options they do actually have in game change the visuals so little that most of the time you won't even notice the differences.
False.

780 Ti's in SLI over a single card...
in BF4 at Ultra give a 36 FPS / 26% increase at 1600x1200, 46 FPS / 35% at 1920x1200, 41 FPS / 39% at 2560x1440, and 7 FPS / 20% at 3840x2160.
in Crysis 3 at Ultra give a 45 FPS / 45% increase at 1920x1080, and 20 FPS / 35% at 2560x1440.
in Metro: LL at Ultra give a 27 FPS / 37% increase at 1920x1080, and 21 FPS / 40% at 2560x1440.

A 760X2 (two 760's in one) was roughly half the FPS of the SLI 780 Ti's, and still 5-10% slower than a single 780 Ti card. Further, a single 780 Ti will give a 39-45% increase over a single 760. I compared a 760 since it is sort of mid-range at $250 vs. an enthusiast 780 Ti, which is $690.
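
If you want to sanity-check those percentages, you can back the implied single-card baselines out of the "X FPS / Y%" pairs (a quick Python sketch; the baseline FPS numbers below are derived from the deltas, not quoted from any benchmark):

```python
# Recover the implied single-card FPS from each "delta FPS / percent gain" pair:
# if SLI adds `delta` FPS and that is a `pct` gain, the baseline is delta / pct.
gains = {
    "BF4 @ 1920x1200":       (46, 0.35),
    "Crysis 3 @ 1920x1080":  (45, 0.45),
    "Metro: LL @ 2560x1440": (21, 0.40),
}
for game, (delta, pct) in gains.items():
    base = delta / pct
    print(f"{game}: ~{base:.0f} FPS single card -> ~{base + delta:.0f} FPS in SLI")
```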

Really, SLI and top end cards don't do much? Yes, there are more issues with SLI than a single card until drivers are updated and optimized. But most games see a great FPS increase.

I'm sorry you've never been able to experience ultra graphics running smoothly and that all your games are stuck on medium. There is a massive difference in graphical quality from Low/Medium/High to Very High/Ultra in newer/recent games.


- PC-targeted AAA games: utter shit at launch. Crysis 3 and BF4 were horrendous games; Crysis 3 still sucks because it's just a bad game that happens to look pretty. BF4 was a terrible game to play at launch - sure, it had lots of graphical tweaking you could do, but who cares when everything was crashing every 30 minutes or the netcode completely fucked you over.
This has nothing to do with CPU/GPU. This has to do with companies providing horseshit customer experiences.

- indie games, MOBAs, and MMOs: rarely pushing the envelope
That was already pointed out. But even in an MMO like WoW (9 fucking years old, btw) there is a 38 FPS / ~29% increase from a 760 to a 780 (non-Ti, even).


I switched from 1080p to 1440p while still playing BF4, Titanfall, Wildstar (beta), TESO (beta), Landmark, etc. and noticed my FPS dropping a little, but not as much as I thought it would for increasing the number of pixels to deal with by ~78%.
AAAAaannnnnnnd you're wrong/lying.

FPS drop for a 680 going from 1080p to 1440p:
WoW: -44.5
Bioshock Infinite: -48
Tomb Raider: -47
Hitman: -27
BF4: -21
Crysis: -35
Dirt: -68
Metro: -32

I understand these are at ultra settings...

Mostly I'm just pissed that PC ports of everything nowadays seem to be super weak and not really taking advantage of PC features. I play ps4/wiiu/pc games side by side and the PC versions don't blow me away, and they SHOULD, compared to the consoles. Watch Dogs and Wolfenstein especially were pretty disappointing, and I don't have much hope for Far Cry 4 or Assassins Creed 5 either.
You don't play your PC games on Ultra anyways so why would you be pissed? YOU are saying you don't take advantage of PC features because you claim to not see a difference when there clearly is.

its pretty fucked up i have a pretty powerful PC (i7, 32GB RAM, a 680, RAIDed SSDs), a steam library of 400+ games, and the most fun I have is playing mario kart 8
This again doesn't really have anything to do with CPU/GPU, but I can't argue with that. Mind-blowingly amazing graphics quality does not a great game make.
 

Jovec

?
838
412
The issue of CPU vs. GPU came up a page or two ago. Here are 66 games tested with a 2500K vs. a 4790K, using a Titan. HT was disabled on the 4790K, so it's 4c/4t vs. 4c/4t. CPUs and GPU at stock, 2560x1600 resolution. All testing was done by an enthusiast, not a review site. Single GPU only, as multi-GPU setups have been shown to be more CPU-bound.

rrr_img_70450.png
 

jeydax

Death and Taxes
1,422
960
^ that would have saved me a lot of time but proves the point further.

Also, for those saying that AAA titles don't show much graphical improvement from console to PC, here's an example. There's a fucking reason you don't think there is a difference between the two when you're running the game on Medium instead of Ultra.

Recorded March 26, 2013 (time of the game's release)
PC on left, Xbox 360 on right
rrr_img_70451.jpg


And even that does not do it justice. The Xbox 360 version AFAIK runs at 30 FPS vs. PC's 100+
 

jeydax

Death and Taxes
1,422
960
FYI, the Corsair Air 540 is on sale for $90 on SlickDeals and it comes with 3 extra case fans too. Great deal for a great case.

I would grab a link but I'm on my phone ATM.