yeah you are right, i dunno what i was thinking, but it does feel like it really, really depends on the specific game nowadays as to what is going to fuck you over. like:
False. Again. Some games do require a little more CPU power, but I've already pointed out that the difference there is largely negligible. The GPU always governs (unless the game is old as fuck or has mediocre graphics, in which case I'd say "WELL NO SHIT!").
- shitty console ports (90% of AAA games nowadays): CPU/GPU are both important up to a point, but stuff like SLI and top-end cards don't do much. Whatever graphics options they do actually have in game change the visuals so little that most of the time you won't even notice the differences.
False.
780 Ti's in SLI over a single card...
in BF4 at Ultra gives a 36 FPS / 26% increase at 1600x1200, a 46 FPS / 35% increase at 1920x1200, a 41 FPS / 39% increase at 2560x1440, and a 7 FPS / 20% increase at 3840x2160.
in Crysis 3 at Ultra gives a 45 FPS / 45% increase at 1920x1080, and a 20 FPS / 35% increase at 2560x1440.
in Metro: LL at Ultra gives a 27 FPS / 37% increase at 1920x1080, and a 21 FPS / 40% increase at 2560x1440.
A 760X2 (two 760's in one) was roughly half the FPS of the SLI 780 Ti's, and still 5%-10% slower than a single 780 Ti. Further, a single 780 Ti gives a 39-45% increase over a single 760. I compared against a 760 since it's a sort of mid-range card at $250 vs. an enthusiast 780 Ti at $690.
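If you don't believe those percentages, the math is trivial: (SLI FPS - single-card FPS) / single-card FPS. Quick sketch below; heads up that the single-card baselines are back-calculated from the gains I listed (gain divided by percentage), so they're ballpark figures, not the review's raw benchmark data:

```python
# Percent increase = (sli_fps - single_fps) / single_fps * 100.
# Baselines are rough, back-calculated from the gains quoted above,
# NOT the original benchmark numbers.
results = [
    # (game @ resolution, approx. single 780 Ti FPS, approx. SLI FPS)
    ("BF4 @ 1920x1200",       131, 177),
    ("Crysis 3 @ 1920x1080",  100, 145),
    ("Metro: LL @ 2560x1440",  52,  73),
]

for name, single, sli in results:
    gain = sli - single
    pct = 100 * gain / single
    print(f"{name}: +{gain} FPS over a single card (+{pct:.0f}%)")
```

Run it and you get the same +35%, +45%, +40% figures. That's not "don't do much."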
Really, SLI and top-end cards don't do much? Yes, SLI has more issues than a single card until drivers are updated and optimized, but most games see a great FPS increase.
I'm sorry you've never been able to experience ultra graphics playing smoothly and all your games are stuck on medium. There is a massive difference in graphical quality from Low/Medium/High to Very High/Ultra in newer/recent games.
- PC-targeted AAA games: utter shit at launch. Crysis 3 and BF4 were horrendous games; Crysis 3 still sucks because it's just a bad game that happens to look pretty. BF4 was a terrible game to play at launch; sure, it had lots of graphical tweaking you could do, but who cares when everything was crashing every 30 minutes or the netcode completely fucked you over.
This has nothing to do with CPU/GPU. This has to do with companies providing horseshit customer experiences.
- indie games, MOBAs, and MMOs: rarely pushing the envelope
That was already pointed out. But even in an MMO like WoW (9 fucking years old, btw) there is a 38 FPS / ~29% increase from a 760 to a 780 (non-Ti, even).
I switched from 1080p to 1440p while still playing BF4, Titanfall, Wildstar (beta), TESO (beta), Landmark, etc. and noticed my FPS dropping a little, but not as much as I thought it would for increasing the number of pixels to deal with by ~78%.
AAAAaannnnnnnd you're wrong/lying.
FPS drop for a 680 going from 1080p to 1440p:
WoW: -44.5
Bioshock Infinite: -48
Tomb Raider: -47
Hitman: -27
BF4: -21
Crysis: -35
Dirt: -68
Metro: -32
I understand these are at ultra settings...
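...but the pixel math alone predicts drops that size. 1920x1080 to 2560x1440 scales each axis by 4/3, so the pixel count goes up by (4/3)^2 = 16/9, which is about 78%, not 75%. Quick sketch of the arithmetic; the 90 FPS starting point is a made-up example for illustration, not a measured number:

```python
# Pixel-count arithmetic for the 1080p -> 1440p jump.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_1440p = 2560 * 1440   # 3,686,400

ratio = pixels_1440p / pixels_1080p                   # 16/9 ~= 1.78
print(f"pixel increase: {100 * (ratio - 1):.0f}%")    # ~78%

# Crude first-order estimate for a fully GPU-bound game: FPS scales
# roughly inversely with pixel count. The 90 FPS baseline below is a
# hypothetical example, not benchmark data.
fps_1080p = 90
fps_1440p = fps_1080p / ratio
print(f"naive estimate: {fps_1440p:.0f} FPS at 1440p "
      f"(a drop of ~{fps_1080p - fps_1440p:.0f})")
```

So a 30-50 FPS drop at ultra is exactly what you'd expect from that resolution jump, not "a little."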
Mostly I'm just pissed that PC ports of everything nowadays seem to be super weak and don't really take advantage of PC features. I play PS4/Wii U/PC games side by side and the PC versions don't blow me away, and they SHOULD compared to the consoles. Watch Dogs and Wolfenstein especially were pretty disappointing, and I don't have much hope for Far Cry 4 or Assassin's Creed 5 either.
You don't play your PC games on Ultra anyway, so why would you be pissed? YOU are saying you don't take advantage of PC features, because you claim not to see a difference when there clearly is one.
it's pretty fucked up: i have a pretty powerful PC (i7, 32GB RAM, 680, RAIDed SSDs), a Steam library of 400+ games, and the most fun I have is playing Mario Kart 8
Again, this doesn't really have anything to do with CPU/GPU, but I can't argue with that. Mind-blowingly amazing graphics quality does not a great game make.