Quote:
"And how does it fare compared to the average gaming PC (assuming not many people bought a $1k CPU with a $1k video card)?"

CPU-wise, even average gaming computers are significantly ahead of both consoles. Intel's CPU performance is just too good - AMD can't match it.
Quote:
"Current Intels may be 2-3x more powerful, yeah, but PCs from 2-3 years ago can still run most current games on max settings. If we were to compare the APU used inside the PS4 to a specific Intel/Nvidia setup, what would it be?"

In terms of the CPU, I'll quote Anandtech:
http://www.anandtech.com/show/6972/x...laystation-4/2

"If you're not familiar with it, Jaguar is the follow-on to AMD's Bobcat core - think of it as AMD's answer to the Intel Atom. Jaguar is a 2-issue OoO architecture, but with roughly 20% higher IPC than Bobcat thanks to a number of tweaks. In ARM terms we're talking about something that's faster than a Cortex A15. I expect Jaguar to be close but likely fall behind Intel's Silvermont, at least at the highest shipping frequencies. Jaguar is the foundation of AMD's Kabini and Temash APUs, where it will ship first. I'll have a deeper architectural look at Jaguar later this week."
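To put rough numbers on Anand's point: per-core throughput is roughly IPC x clock, so you can sketch the console-vs-desktop gap in a few lines. The sustained IPC figures and the desktop clock below are illustrative assumptions, not benchmarks; the known parts are the PS4's eight Jaguar cores at 1.6 GHz.

```cpp
#include <iostream>

// Back-of-the-envelope model: sustained per-core throughput ~ IPC x clock.
// The IPC values are assumptions for illustration, not measured figures.
int main() {
    double jaguar_ipc    = 1.2; // assumed sustained IPC for a 2-issue Jaguar core
    double jaguar_clock  = 1.6; // GHz, the PS4's Jaguar clock
    double desktop_ipc   = 2.0; // assumed sustained IPC for a wide desktop core
    double desktop_clock = 3.5; // GHz, a typical desktop Intel part of the era

    double ratio = (desktop_ipc * desktop_clock) / (jaguar_ipc * jaguar_clock);
    std::cout << "Per-core ratio: " << ratio << "x\n"; // ~3.6x under these guesses
    return 0;
}
```

Under those guesses a single desktop core comes out around 3.6x faster, which is why the console design bets on eight slow cores rather than a few fast ones.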
Quote:
"So basically you don't need top-end PC hardware to play games for the next generation either, unless a PC-exclusive, ultra graphics-intensive MMO comes out, which is doubtful. Which is great news to me - having to upgrade video cards every 6 months to a year back in the early 2000s sucked, back when there were actual PC-exclusive games that took advantage of the best PC hardware."

Can't agree more... I really enjoy (as does my wallet) not having to replace video cards and RAM every year.
Quote:
"So basically you don't need top-end PC hardware to play games for the next generation either, unless a PC-exclusive, ultra graphics-intensive MMO comes out, which is doubtful. Which is great news to me - having to upgrade video cards every 6 months to a year back in the early 2000s sucked, back when there were actual PC-exclusive games that took advantage of the best PC hardware."

Need? No, but hopefully, with everything being x86 now, they should be able to make much more scalable engines that take advantage of high-end hardware better. That's going to be a big plus if you're a PC gamer who is into doing lots of hardware upgrades.
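On the "scalable engines" point, the simplest version is just sizing the job system to whatever the machine offers instead of hard-coding it for one console's CPU. A minimal C++ sketch, with a hypothetical run_jobs() standing in for real engine work:

```cpp
#include <iostream>
#include <thread>
#include <vector>

// Hypothetical per-frame job; stands in for physics, animation, culling, etc.
void run_jobs(unsigned worker_id) {
    (void)worker_id; // real engine work would go here
}

int main() {
    // Ask the OS how many hardware threads exist and scale the pool to match.
    unsigned n = std::thread::hardware_concurrency();
    if (n == 0) n = 4; // hardware_concurrency() may return 0 if it can't tell

    std::vector<std::thread> pool;
    for (unsigned i = 0; i < n; ++i)
        pool.emplace_back(run_jobs, i);
    for (auto& t : pool)
        t.join();

    std::cout << "Ran jobs on " << n << " worker threads\n";
    return 0;
}
```

The same binary then uses all eight console-class cores or sixteen desktop threads without a recompile, which is the kind of headroom a shared x86 baseline makes easier.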
Quote:
"Can't agree more... I really enjoy (as does my wallet) not having to replace video cards and RAM every year."

Definitely. This last half decade or so has been the first time since 3D acceleration came to market in the mid-90s that I haven't bought a new video card with every new generation of cards. In the last 5-6 years I was able to skip from an Nvidia 9800 GT to an AMD HD 4870 to an Nvidia 660 Ti, a good generation (or two or three) between each card, and at no point was I unable to play a game I wanted to, and play it well. I want my $800 back that I spent on my dual 12MB Voodoo2s, lol. It's a great time to be a PC gamer; I can't wait to see what games look like as 4K monitors come to market and hopefully become affordable for the masses over the next three or four years.
Who seriously approves these responses?
I just can't handle it anymore.