Sony sharing that shit with people irks me too. Microsoft *selling* it, though, is what really riles me up, plus the camera bullshit.
Because that's what we need, politics-style advertising in our console wars. Sony's own catch-all privacy policy right now says they share anonymous behavioral data as well as game session data with third parties that run ads in games, and they just hope those third parties don't do anything bad with it. They go on to say that if you don't want your information shared, don't give it to them at all - there's no opting out.
You don't even need the PS Eye for voice commands now. If that were an XBOne feature, the headline would be 'No Kinect, no problem, still eavesdropping'.
That would be pretty funny, even as a parody.
Sony needs to come up with a commercial stating that 9/10 parents who work at the NSA will not let their kids have an XB1 in their home. That would generate an interesting situation where they could simply say "no comment" instead of denying it.
His vision, his fault. He doesn't get to run off without fallout. (Referencing Don Mattrick)
These issues come with facts. If you don't want to hear what is presented, I can't help you.
Focus your ire on the people who made the decisions about the hardware. You get mad at the messenger even when he's right, but I'm warning you because I don't like what I've seen over the last 20 months.
The planned console must be for gamers. XBO or PS4 or WiiU. All of us. When the direction is an app fest first, power second, this console is not for gamers. When questions about the DRM/online policy are flagged internally and disregarded because monetization is the number one job for execs, it's not for gamers. When you are led to directly lie about features, that's not for gamers. Period.
Today, snap is broken. Fact. Maybe (and probably) it will be fixed for launch.
Today live connections and parties are broken. Fact. Maybe it will be fixed before launch.
Today ESRAM is a problem and more mature tools will help down the road. However, the hardware will always be gimped in comparison to the PS4. This will never change without new hardware. Fact. Get mad at who you should be getting mad at.
Not forums,
Not posters,
Not people who see what's wrong and want to help gamers.
See you after launch.
Have a question here.
Well that's good then. The problem with the GTA 4 stuff was way back when there was talk of it having a required install, not what you "could" do if you had a HDD. This was back when mandatory installs were not a common thing. In this thread, we have seen leaked images of the back of covers with an HDD marker and X amount of gigs required as an install. With GTA 4 I believe there were faked images showing a potential back cover with an HDD marked on it, which would indicate a mandatory install. This caused several gaming sites (even Norwegian ones) to pick up the story, since it would exclude every single person who had the 360 version without a hard drive (or require them to purchase one). Rockstar addressed this and said they would not require an install, but also that the issue affected the entire game: they had to lower the number of NPCs and cars available because they couldn't take full advantage of running the game from both HDD and DVD at the same time. It didn't matter if you installed it or not, because the engine was never optimized with an install in mind. I remember thinking it was very odd that they limited both consoles for a problem only affecting one of them.
X360/PS3 games being so close in visual fidelity is more of a happy coincidence than a purposeful thing. There are no contractual obligations for parity. There are a lot of examples of games with very different frame rates and resolutions, especially early titles. There are also more recent disasters like Skyrim. People do take advantage of each console's hardware; that's actually pretty obvious if you compare early games from this generation to the latest ones. The difference is night and day, and you can't have that kind of massive quality jump on those two platforms without really specific optimizations for each one. They just happen to have come out very similarly.
I don't know what specifically was up with Rockstar on GTA4, but the majority of PS3/X360 games are very different when it comes to installs. With X360 games you can fully install *any* game to the hard drive. It basically copies the entirety of the data over and the disc is just for authentication. You cannot do that with PS3 games. PS3 games on the other hand almost always have a mandatory data install that they pull from to increase performance.
People misuse the whole "Lowest Common Denominator" thing a bit. It has very little to do with pure visual fidelity and mostly to do with core game design: AI, set pieces, level size, encounter design, how many things you can have and do, stuff like that. How many NPCs and cars you can have definitely *can* fall under that. Though as seen in the leaked GTAV files, that doesn't have to be the case, since the PC version is set to spawn a much higher number of objects than the PS3/X360 versions. At the time, with GTA4, it was likely a design/engine limitation that did not let them set different object counts per platform.
Where visual fidelity does clash with the "LCD" is stuff like texture resolutions and people lazily reusing art asset builds instead of making separate, higher-resolution ones. Extra effects like grass, draw distance, reflections, shadows, etc., are all things that can easily be adjusted in scalable engines. The same goes for resolution, anti-aliasing, and AF for the most part. That's why all of that stuff is almost always very easily adjustable on the PC. If a console can take advantage of those "sliders" it will; they won't purposely hold it back for parity. At worst you'll see lower-resolution textures than you should have, because someone was being cheap or lazy and didn't want to do a separate asset build.
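To make the "sliders" idea concrete, here's a minimal sketch of how per-platform presets tend to look. It's illustrative only: the platform names, fields, and values are made up for the example, not taken from any real engine or from the leaked GTAV configs.

```python
# Illustrative only: a minimal sketch of per-platform "slider" presets.
# Names, fields, and values are invented for the example.
from dataclasses import dataclass

@dataclass
class GraphicsPreset:
    resolution: tuple          # output resolution (width, height)
    msaa_samples: int          # 0 means shader-based AA (e.g. FXAA) instead of MSAA
    anisotropic_filtering: int
    draw_distance_m: int
    texture_quality: str       # which asset build / mip bias to load
    max_spawned_vehicles: int  # the kind of per-platform knob discussed above

# One asset set, different knob positions per platform - nothing gets
# "held back", the sliders just sit at different points.
PRESETS = {
    "last_gen_console": GraphicsPreset((1280, 720), 0, 4, 800, "standard", 20),
    "next_gen_console": GraphicsPreset((1920, 1080), 2, 16, 1500, "high", 40),
    "pc_ultra":         GraphicsPreset((2560, 1440), 4, 16, 3000, "high", 80),
}

for platform, preset in PRESETS.items():
    print(platform, preset)
```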
Sean thinks that they'll take full advantage of both platforms' capabilities. I think the difference will be much more modest but we'll see. Just rendering the same everything at 720p vs 1080p is a huge difference, so maybe we'll see a lot along those lines. One of the biggest differences between the consoles is RAM, and rendering at 720 vs 1080 shouldn't be impacted by that. What size textures you load, however, will.
But good to hear it is not really an issue.
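On the texture-size point above, some rough numbers show why texture budgets, not the framebuffer itself, are where extra RAM goes. The formats and sizes here are assumptions for illustration, not figures from any particular game.

```python
# Rough, back-of-the-envelope texture maths with assumed formats and sizes
# (not numbers from any particular game).

def texture_mb(width, height, bytes_per_texel, mip_chain=True):
    """Approximate memory footprint of one texture, in MB."""
    size = width * height * bytes_per_texel
    if mip_chain:
        size *= 4 / 3  # a full mip chain adds roughly one third
    return size / (1024 * 1024)

# Common block-compressed formats: BC1/DXT1 is 0.5 bytes per texel,
# BC3/DXT5 is 1 byte per texel.
print(f"2048x2048 BC3 texture:  {texture_mb(2048, 2048, 1.0):5.1f} MB")  # ~5.3 MB
print(f"4096x4096 BC3 texture:  {texture_mb(4096, 4096, 1.0):5.1f} MB")  # ~21.3 MB

# The 1080p colour buffer itself is comparatively tiny:
print(f"1920x1080 RGBA8 target: {texture_mb(1920, 1080, 4, mip_chain=False):5.1f} MB")  # ~7.9 MB
```

Multiply the texture numbers by the hundreds of textures streamed in for a scene and that's where the extra RAM goes; bumping the framebuffer from 720p to 1080p only costs a few more MB by itself.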
This exactly. I spend 90% of my time gaming on a PC, and have a PC that will put both of the upcoming consoles to shame. I'm a big PC gamer here, but I don't see the point of having to chime in on a console discussion every time to restate the obvious about the PC.
Pretty much. PC is a valid comparison when talking about gaming machines, but repeating the same tired "lol PC is best" in a thread titled "Next Gen Consoles: PS4 vs XBone" is just downright trolling.
Resolution is impacted by the RAM quite a bit. You can't really do 1080p and MSAA with the 32MB of ESRAM; you're more limited to shader AA like FXAA. They'll definitely take full advantage of the "basic" hardware on the PS4 before too long; that's relatively easy to work with. Whether they take advantage of the PS4's GPGPU compute abilities is another story and not something we'll see for quite a while, and probably only heavily from first-party studios, exclusive games, or more adventurous people like the Witcher devs.
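For a rough sense of why 1080p plus MSAA is tight in 32MB, here's some back-of-the-envelope math. It assumes only one RGBA8 colour target and one 32-bit depth/stencil target; a real pipeline with HDR targets or a deferred G-buffer needs considerably more, so treat these as a floor rather than anyone's actual memory budget.

```python
# Back-of-the-envelope render target sizes vs the 32MB of ESRAM.
# Assumes just one RGBA8 colour target and one 32-bit depth/stencil target.

ESRAM_MB = 32

def target_mb(width, height, bytes_per_pixel, msaa_samples=1):
    return width * height * bytes_per_pixel * msaa_samples / (1024 * 1024)

for width, height in [(1280, 720), (1920, 1080)]:
    for samples in [1, 2, 4]:
        colour = target_mb(width, height, 4, samples)  # RGBA8
        depth  = target_mb(width, height, 4, samples)  # D24S8 / D32
        print(f"{width}x{height} at {samples}x MSAA: "
              f"{colour + depth:5.1f} MB of basic targets vs {ESRAM_MB} MB of ESRAM")
```

Even at just 2x MSAA, a single colour and depth target come to roughly 31.6MB at 1080p, which is basically the whole ESRAM before anything else gets a slice; 4x MSAA at 1080p blows past it entirely.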
What part of rendering at 1080 vs 720 is helped by having more or faster ram?
That's not what I meant. Here's a piece on the ESRAM: Digital Foundry vs. the Xbox One architects (Eurogamer.net). It's also 8x 4MB, and 16x 2MB, and lastly, 32x 1MB! Math rules.
And here's some stuff on a different note: Xbox One memory performance improved for production console (Eurogamer.net):
"There are four 8MB lanes, but it's not a contiguous 8MB chunk of memory within each of those lanes. Each lane, that 8MB is broken down into eight modules. This should address whether you can really have read and write bandwidth in memory simultaneously," says Baker.
"Yes you can - there are actually a lot more individual blocks that comprise the whole ESRAM so you can talk to those in parallel. Of course if you're hitting the same area over and over and over again, you don't get to spread out your bandwidth and so that's one of the reasons why in real testing you get 140-150GB/s rather than the peak 204GB/s... it's not just four chunks of 8MB memory. It's a lot more complicated than that and depending on how the pattern you get to use those simultaneously. That's what lets you do read and writes simultaneously. You do get to add the read and write bandwidth as well adding the read and write bandwidth on to the main memory. That's just one of the misconceptions we wanted to clean up."
Goossens lays down the bottom line:
"If you're only doing a read you're capped at 109GB/s, if you're only doing a write you're capped at 109GB/s," he says. "To get over that you need to have a mix of the reads and the writes but when you are going to look at the things that are typically in the ESRAM, such as your render targets and your depth buffers, intrinsically they have a lot of read-modified writes going on in the blends and the depth buffer updates. Those are the natural things to stick in the ESRAM and the natural things to take advantage of the concurrent read/writes."
Developers just being able to dump huge stuff into the PS4's RAM is a pretty big performance advantage there.
However, clearly it's still early days, and right now these machines remain very much uncharted territory - even for those who've been working with prototype hardware for a long time. Microsoft tells developers that the ESRAM is designed for high-bandwidth graphics elements like shadowmaps, lightmaps, depth targets and render targets. But in a world where Killzone: Shadow Fall is utilising 800MB for render targets alone, how difficult will it be for developers to work with just 32MB of fast memory for similar functions?
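Just to put that Killzone figure next to the fast memory pool; the only numbers here are the ones from the excerpt above, and the takeaway about shuffling data to and from DDR3 is an inference, not a description of how any particular engine does it.

```python
# Putting the two quoted figures next to each other.
KILLZONE_RENDER_TARGETS_MB = 800  # Killzone: Shadow Fall, render targets alone
ESRAM_MB = 32

print(f"ESRAM covers {ESRAM_MB / KILLZONE_RENDER_TARGETS_MB:.0%} of that budget "
      f"({KILLZONE_RENDER_TARGETS_MB / ESRAM_MB:.0f}x larger than the fast pool)")
# -> 4% of the budget; the rest has to live in, and be shuffled to and from, DDR3.
```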