The Big Bad Console Thread - Sway your Station with an Xboner !

Gavinmad

Mr. Poopybutthole
44,684
54,488
edit - if current gen games are putting out texture resolutions more in the 240-480p range, then yeah, we'll see improvement, I just don't know that I believe that all games on current consoles have been that shitty. I mostly game on my PC, and only use my consoles for sports & driving games, and those always seem to have relatively decent texture resolutions.
So after being told that you were wrong over and over again, you finally admit that you don't even use your console much.

Great contribution.
 

Soygen

The Dirty Dozen For the Price of One
<Nazi Janitors>
28,616
45,337
Quadrizzle*

At this point, Sony could have been running their games on a government super-computer and people would still bash MS harder.
 

Anomander Rake

Golden Knight of the Realm
705
15
Definition of SCHADENFREUDE
: enjoyment obtained from the troubles of others

Seriously, watching the news rolling in about MS falling on its face during E3, and the treasure trove of idiotic comments since have kept me more entertained these past few days than playing any actual games.
 

Kedwyn

Silver Squire
3,915
80
4K TV is certainly better than the recent 3D TV fad. However, the bigger issue is going to be provider bandwidth. Calling it 4K TV while actually delivering 1080p (since most providers down-res already) isn't really going to help things a whole lot.
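
Just to put a rough number on the bandwidth side (a back-of-the-envelope sketch, assuming 60 fps and 24-bit color, before any compression):

# Rough, illustrative calc: raw (uncompressed) pixel data rate for
# 1080p60 vs 4K60, assuming 24-bit RGB. Real streams are heavily
# compressed, but the 4x jump in raw data is what providers have to eat.
def raw_gbps(width, height, fps=60, bits_per_pixel=24):
    return width * height * fps * bits_per_pixel / 1e9

print(f"1080p60 raw: {raw_gbps(1920, 1080):.1f} Gbit/s")  # ~3.0 Gbit/s
print(f"4K60 raw:    {raw_gbps(3840, 2160):.1f} Gbit/s")  # ~11.9 Gbit/s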
 

Cor_sl

shitlord
487
0
RE: 4k. This post is rather informative --

Here's what I've learnt on the matter:

The chart that is frequently tossed around is derived from the 20/20 acuity chart. I believe it's called the Snellen chart.

In that chart... once you stop being able to distinguish letters and numbers from each other with any reasonable accuracy, it's considered beyond the limit of your acuity.

For 20/20 vision, the limit of your acuity on the Snellen chart is roughly 1 arcminute, which is 1/60th of a degree of your field of vision. Of course, the further the point you're looking at deviates from your fovea (the center of your vision), the blurrier it gets. But we only count the sharpest part of your focus.

So it seems pretty reasonable to extrapolate that information about how much detail we can see from such a test.

Except there are a couple problems with doing this - the Snellen chart tests for rough shape recognition.

The character 'I' will look the same as the character 'l', or the number '1' at the limits of your acuity. Similarly M will look like W. And H will also look like M. But you won't mistake an M for an I.

So this means that even when we're hitting the limits of the Snellen acuity - we can still perceive additional information, albeit at reduced accuracy/increased fuzziness.

It turns out that the real limit of visual acuity (where we can't tell one element from another completely different element) is around the 0.15 arcminute mark - about 1/6th of the Snellen acuity.

Having said this - in practical and useful terms, assuming that you sit from your screen such that it occupies a 35 degree FOV on your retina (the recommended distance, and the one that corresponds with 'the chart') - then the only areas where 4k will provide any benefit (and even then it'll be quite marginal) are areas of fine, high-contrast detail... such as hair shimmering in the light, or lines, or fine print in the distance.
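
A quick back-of-the-envelope check of those numbers (just a sketch, assuming the screen's horizontal pixels are spread evenly across a 35 degree horizontal FOV):

# Sketch (assumed setup, not from the quoted post): arcminutes per pixel
# when the screen fills a 35 degree horizontal field of view.
FOV_ARCMIN = 35 * 60  # 35 degrees = 2100 arcminutes across the screen

for name, h_pixels in [("1080p", 1920), ("4K", 3840)]:
    print(f"{name}: {FOV_ARCMIN / h_pixels:.2f} arcmin/pixel")

# 1080p: ~1.09 arcmin/pixel -> right at the ~1 arcminute Snellen limit
# 4K:    ~0.55 arcmin/pixel -> past the Snellen limit, but still well short
#        of the ~0.15 arcminute limit for fine high-contrast detail

So at that distance 1080p already sits right at the Snellen limit, and the extra 4K pixels only help in the sub-Snellen, high-contrast cases described above.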

For the most part, the extra resolution will not matter - other image quality factors, including but not limited to contrast ratio, crushing/no crushing, frame rate, etc., are more important to perceived quality.

On the other hand... if you want to sit closer... then the higher resolution will benefit you. But the problem with sitting too close is that film and video are framed under a set of natural assumptions - that the scene occupies a 35-45 degree FOV on your retina at most.
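
To put a rough figure on "sitting closer" (again a sketch with assumed numbers): for the average 4K pixel to subtend a full arcminute, a 3840-pixel-wide screen would have to fill roughly 64 degrees of your view - well past that 35-45 degree framing assumption.

import math

# Sketch: FOV needed for a 3840-pixel-wide screen to average 1 arcmin/pixel,
# and the corresponding distance for a hypothetical ~1.2 m wide (~55") panel.
fov_deg = 3840 / 60  # 3840 arcminutes = 64 degrees
screen_width_m = 1.2  # assumed screen width, roughly a 55" 16:9 panel
distance_m = (screen_width_m / 2) / math.tan(math.radians(fov_deg / 2))

print(f"Required FOV: {fov_deg:.0f} degrees")   # ~64 degrees
print(f"Viewing distance: {distance_m:.2f} m")  # ~0.96 m - very close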

Taking the idea of sitting too close to an extreme - if you have to crane your head in order to see the corners of the screen, where information like game HUDs typically lives, then it's too big.

As for film being a higher resolution - you can extract more information from film than can be perceived. Since film is an analogue material with a cell-like structure, you can keep zooming in - but past a certain point, all you're going to get is more pixels that make up a blob that makes up a blur that makes up part of an image. Going by a lot of remastered older film releases, modern 1080p digital filming offers significantly more perceived information than 4k transfers of older films.

http://www.neogaf.com/forum/showpost...3&postcount=47
 

Tuco

I got Tuco'd!
<Gold Donor>
49,744
90,163
Interesting post/article, and it kind of confirms how I felt about that oft-bandied chart: it's more complicated and not as binary as that. Still, I think it's a good guide on visual range, and I'm still pretty dubious about how impactful 4K TVs will be in the next 5-10 years.

As for the 'xbone dev boxes' being windows 7 rigs with geforce cards: you couldn't make this shit up if you tried. It also makes it more clear how artificial any exclusive titles are.
 

Lenas

Trump's Staff
7,742
2,459
As for the 'xbone dev boxes' being windows 7 rigs with geforce cards: you couldn't make this shit up if you tried. It also makes it more clear how artificial any exclusive titles are.
Well duh. Now that PC, X1 and PS4 are all using the same x86 architecture, any exclusives are purely because of MS/Sony showering the studio with money or owning them outright.
 

Ritley

Bronze Baron of the Realm
16,212
35,844
Well duh. Now that PC, X1 and PS4 are all using the same x86 architecture, any exclusives are purely because of MS/Sony showering the studio with money or owning them outright.
This is why I think Sony will do better on exclusives. Getting 3rd party devs to do exclusives will probably be prohibitively expensive, so that leaves 1st party studios. I think Sony has the best 1st party studios by a wide margin.
 

Sean_sl

shitlord
4,735
12
As for the 'xbone dev boxes' being windows 7 rigs with geforce cards: you couldn't make this shit up if you tried. It also makes it more clear how artificial any exclusive titles are.
But but, the power of the CLOUD is required to make these games work bro!
 

Cantatus

Lord Nagafen Raider
1,437
79
First X-Men movie with Hugh Jackman, Patrick Stewart and Anna Paquin.

Keep up!
Wrong. It's the X-Men movie with James McAvoy, Michael Fassbender, and Hugh Jackman. The first X-Men movie is just "X-Men." The newest X-Men movie that has come out after a few other X-Men movies is "X-Men: First Class" or "X1." Why do people find that so confusing? It makes perfect sense that the most recent in the series is referred to as the first!