> From what I've noticed over the last couple years, physics and shadows seem to have the largest impact on performance in games. Like for me I can run WoW at any resolution on max settings, but if I turn shadows from the highest setting down to say medium my framerate jumps like 15-20 higher. Same with PhysX in quite a few different games, Arkham Asylum being one of them.

I'm sure you guys already know this, but the final resolution is just one of the many elements that determine frame rate, and the number of those elements increases every year. Between poly count, texture sizes, LoD bias, physics, actor count, AI, lighting, shadows, AA, tessellation, anisotropic filtering, and myriad other post-processing options, you can make a real case for ditching 60fps and 1080p.
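To put rough numbers on that (all hypothetical, just to illustrate the shape of the problem), here's a toy frame-time model in C++. The point is that resolution only scales the pixel-bound stages; the shadow, physics, and AI costs in this sketch don't budge when you drop below 1080p.

```cpp
#include <cstdio>

int main() {
    // Hypothetical per-frame costs in milliseconds at 1080p max settings.
    double shading      = 6.0; // scales roughly with pixel count
    double post_process = 2.0; // scales roughly with pixel count
    double shadows      = 4.0; // scales with shadow-map size, not screen size
    double physics      = 3.0; // CPU work, independent of resolution
    double ai_and_logic = 2.0; // CPU work, independent of resolution

    // ~1080p, ~900p, ~720p expressed as fractions of 1080p's pixel count.
    for (double pixel_scale : {1.0, 0.69, 0.44}) {
        double frame_ms = (shading + post_process) * pixel_scale
                        + shadows + physics + ai_and_logic;
        printf("pixel scale %.2f -> %4.1f ms (%3.0f fps)\n",
               pixel_scale, frame_ms, 1000.0 / frame_ms);
    }
    return 0;
}
```

With made-up numbers like these, dropping from 1080p to 720p only claws back the shading and post-processing time; the other 9 ms per frame stays put, which is why shadow and physics settings can matter more than resolution.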
> From what I've noticed over the last couple years, physics and shadows seem to have the largest impact on performance in games. Like for me I can run WoW at any resolution on max settings, but if I turn shadows from the highest setting down to say medium my framerate jumps like 15-20 higher. Same with PhysX in quite a few different games, Arkham Asylum being one of them.

This is true for most people. I'm not sure about this, but I think it's because game engines typically rely on the CPU to do a lot of the shadow work (I'm not sure why).
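For what it's worth, one plausible reason (this is a sketch of a generic forward renderer, not any specific engine): every shadow-casting light or shadow cascade means the scene's draw calls get culled and submitted all over again, and draw-call submission is CPU work that happens before the GPU draws anything.

```cpp
#include <cstdio>
#include <vector>

struct DrawCall { int mesh_id; };

// Stand-in for the CPU-side cost of validating and submitting one draw call.
static void submit(const DrawCall& dc) { (void)dc; /* driver/command work */ }

int main() {
    std::vector<DrawCall> scene(5000);   // visible objects this frame

    int shadow_casting_lights = 2;       // hypothetical "high shadows" preset
    int cascades_per_light    = 4;

    long submissions = 0;

    // Main color pass: one submission per visible object.
    for (const auto& dc : scene) { submit(dc); ++submissions; }

    // Shadow passes: the scene is walked and re-submitted once per cascade,
    // per shadow-casting light; all of it CPU work.
    for (int l = 0; l < shadow_casting_lights; ++l)
        for (int c = 0; c < cascades_per_light; ++c)
            for (const auto& dc : scene) { submit(dc); ++submissions; }

    printf("draw calls submitted: %ld (%.0fx the cost with shadows off)\n",
           submissions, submissions / (double)scene.size());
    return 0;
}
```

Under those assumed settings the CPU ends up submitting 9x the draw calls it would with shadows off, which would explain why shadow sliders hit frame rate so hard even when the GPU has headroom.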
> I'm sure you guys already know this

Sure we know. But the resolution debates are already terrible to begin with, and introducing more unknown variables for games that have yet to be made isn't going to make the topic any more manageable.
There's a bit more to it than the APU's raw numbers, too: how it's put together and how the whole system works together matters.
> but it's going to push more than people are giving it credit for and it will certainly look a hell of a lot better than current consoles do.

The rub, though, is that no one can say exactly how much better it's really supposed to be, yet we're supposed to believe that "people aren't giving it enough credit" because "there's a bit more to it with how it's put together," when even the Jaguar homers can't point to anything solid to base that belief on beyond the fundamental design architecture itself. Maybe it will gain another 10% in whatever arbitrary metric they later use, maybe it will gain 50%; we just don't know. I'm totally on board with the idea that Jaguar will net a boost from its APU architecture, but if it doesn't net an absolutely monstrous gain, then it's entirely possible it won't be a huge improvement over comparable raw PC components. FWIW, I hope its performance is phenomenal, even if it encroaches on high-end PC territory (I'd think any advance there would eventually benefit PCs as well); I just question how good it will actually be when delivered.
> From what I've noticed over the last couple years, physics and shadows seem to have the largest impact on performance in games. Like for me I can run WoW at any resolution on max settings, but if I turn shadows from the highest setting down to say medium my framerate jumps like 15-20 higher. Same with PhysX in quite a few different games, Arkham Asylum being one of them.
I have definitely noticed that same thing with shadows. Usually I just turn them off and can go up a whole tier on everything else, say High -> Ultra, just by turning off shadows. As someone with an older PC (about 5 years old now), the first thing I do is turn shadows to low, and sometimes even off. The difference in performance is sometimes staggering.
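Here's a hedged illustration of why that whole-tier jump is plausible (representative numbers, not pulled from any actual game): shadow quality presets typically raise both the shadow-map resolution and the cascade count, so the work grows much faster than the settings menu suggests.

```cpp
#include <cstdio>

// Representative (not game-specific) shadow quality presets.
struct ShadowTier {
    const char* name;
    int map_size;  // shadow-map resolution per cascade, in pixels per side
    int cascades;  // number of cascaded shadow maps for the main light
};

int main() {
    const ShadowTier tiers[] = {
        {"Off",    0,    0},
        {"Low",    1024, 1},
        {"Medium", 2048, 2},
        {"High",   2048, 4},
        {"Ultra",  4096, 4},
    };
    for (const auto& t : tiers) {
        // Texels written per frame grow with resolution^2 * cascade count.
        long texels = (long)t.map_size * t.map_size * t.cascades;
        printf("%-6s %4d px x %d cascade(s) -> %9ld texels/frame\n",
               t.name, t.map_size, t.cascades, texels);
    }
    return 0;
}
```

Going from Off to Ultra in this sketch is the difference between zero and roughly 67 million shadow texels rendered every frame, on top of the extra scene submissions, which lines up with "off" freeing up a whole quality tier elsewhere.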
> How is the hardware old? It's using a custom APU based on the AMD Jaguar that's not even released yet and was only unveiled in January.

AMD chips have been shit for years, though.
> How it's going to compare to PCs is pretty irrelevant to me personally. It's going to be a huge jump over current consoles and that's all I care about.

Nah. It's going to be an improvement, but nothing like as much as in previous console generations, relatively speaking.
> I'm not even sure what the argument is here. Is anyone here arguing that the next gen consoles won't be able to give dramatically better graphics than current gen?

Some people were saying that there wasn't going to be much of an improvement between this generation and the next one.
I don't understand the use of comparing the delta between 1998-2002 tech and 2008-2012 tech. Are you guys implying that we're approaching the point where more processing power won't result in more capability or better graphics? If so, you're completely wrong.
> AMD chips have been shit for years, though.

Uhhh, no. While Intel has been beating AMD in quite a few ways, AMD's chips aren't anywhere close to being "shit" for their price point, especially when it comes to graphics, where the CPU doesn't matter nearly as much as the video card.