Saturday, July 19, 2008

The Fake-HD Era

By Alex Kierkegaard

So apparently Halo 3 runs at 640p and Call of Duty 4 at 600p (in both its X360 and PS3 iterations). I know what you are thinking -- "lol what? Aren't games on these consoles supposed to be 720p minimum? Does that mean Halo 3 and COD4 are not HD?"

What the hell is going on here?

At some point during the development of Halo 3 and COD4, Bungie and Infinity Ward were forced to make a choice. Either render their games at 720p with lower polygon counts/effects/etc., or drop the resolution down to 640p/600p and jack up the polygon counts/effects/etc. And what they both ended up deciding to do is drop the resolution. Why? Because that way their games would look better.

The truth is that, contrary to what Sony and Microsoft would have you believe, resolution is not the most important factor in graphics quality. As anyone who is into first person shooters on the PC will tell you, the effects are much more important.

PC gamers have been able to switch resolutions with a few clicks for well over a decade; at the same time they've also been able to experiment with various detail settings: things like advanced shader models, anti-aliasing and bump mapping; and more recently with transparency supersampling, HDR lighting effects and subsurface scattering (I could be making these up and you probably wouldn't know the difference, but that's exactly the point). Because of this, they've long since realized a few things that escape many of the rest of us.

Whenever a new blockbuster arrives, be it Unreal 3 or Crysis or whatever, they have to make a choice between going for higher detail settings or higher resolutions. What they've come to understand is that, if you don't have enough horsepower to go for both, it's always preferable to jack up all detail levels to the max, rather than to go for the highest resolution possible.

Now you have to understand that when you hook up your 360/PS3 to your HDTV you are indeed seeing a 720p image (or, more likely, a 768p or 1080p image, since 720p TVs do not really exist). But it's a fake 720p/768p/1080p image, just as fake as what you get when watching a regular TV channel on the same TV -- the console is not redrawing the game at the higher resolution, it is simply upscaling it, stretching it, muddying it up in exactly the same way that a photograph, for example, is muddied up when you stick it in an image-editing program and try to enlarge it past its native resolution.
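To see why upscaling can't add detail, consider the simplest scaler there is: nearest-neighbor resampling, where each output pixel is just a copy of whichever source pixel is closest. Here's a minimal sketch in plain Python (this is an illustration of the general idea, not the consoles' actual scaler, which uses smarter filtering that blurs rather than duplicates -- but still invents nothing):

```python
# Minimal nearest-neighbor upscale: scaling only stretches existing pixels,
# it creates no new information. Real hardware scalers use fancier filters
# (bilinear etc.), which blur instead of duplicate -- same fundamental limit.

def upscale_nearest(src, out_w, out_h):
    """src is a list of rows (lists of pixel values)."""
    src_h, src_w = len(src), len(src[0])
    return [
        [src[y * src_h // out_h][x * src_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]

# A tiny 2x2 "image" blown up to 4x4: every output pixel is a copy.
tiny = [[1, 2],
        [3, 4]]
big = upscale_nearest(tiny, 4, 4)
for row in big:
    print(row)
# Each source pixel simply becomes a 2x2 block -- no new detail anywhere.
```

The same thing happens, with gentler artifacts, when a 640p frame is stretched to fill a 1080p panel.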

Of course the usual forumroids claim that Halo 3 and COD4 look great anyway and that no one can even tell the difference (though the guys who discovered the fraud certainly could) -- but that's exactly the point: No one can tell the difference because your console will not even allow you to output the game in its native resolution. If it did, and if you did a side-by-side comparison, you too would come to understand the image-degrading effects of upscaling, you too would come to marvel at the wonders of the Fake-HD Era.

But Microsoft and Sony promised that all games would be HD, and by God they will keep that promise even if it means upscaling Game Boy Color games to the full 1920x1080 Fake-HD standard. [try playing Battlezone on the 360 XBLA - ed.] The funniest thing though is that a) Halo 3 and COD4 are not even exceptions -- there's apparently a whole range of games (from Tomb Raider to PGR3 and others) that are Fake-HD-ready, and b) things seem to be getting worse, since Halo 3 came out first at 640p, followed by COD4 at 600p.

Here's a thought -- How about using 480p?!

Do not misunderstand me here -- I applaud Bungie's and Infinity Ward's decision to drop the resolution. This is what I had been asking for all along: each developer should be able to choose the resolution according to each game's requirements. But what's happening here is a travesty! I didn't pay 1,200 euros for a 720p-native projector and titanium-plated HDMI cables so as to better be able to appreciate the artifacts of upscaling. At the very least give me an option to turn off the upscaling -- I could always feed the 600p/640p image to my projector, and using the 1:1 pixel mapping option get a crystal-clear image with the corresponding black borders. With a projector black borders are not such a big deal anyway, since you can always increase the screen size to compensate for whatever screen area you lost.
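And those borders would be modest. A quick back-of-the-envelope calculation (plain Python; the 1138-pixel width is my own assumption from a 16:9 frame at 640 lines, since the article only gives line counts and actual render widths may differ):

```python
# Unused border when a smaller image is displayed 1:1 on a 1280x720 panel.
# The 1138x640 source size assumes a 16:9 frame; real render widths may vary.

def borders(panel_w, panel_h, img_w, img_h):
    """Return (side, top/bottom) border thickness in pixels."""
    return (panel_w - img_w) // 2, (panel_h - img_h) // 2

side, top = borders(1280, 720, 1138, 640)
print(side, top)  # 71-pixel pillars and 40-line bars -- hardly catastrophic
```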

This is what gets to me most of all -- not only are Microsoft, Sony and the game publishers lying to you when they print "720p" or "HD" on the game's box (since, yeah, following the same mentality even Pong is HD as long as you don't mind stretching it enough) -- they are also making it impossible for you to play the game at its optimum image quality, something inexcusable for two consoles which have been hyped to death for precisely their image quality capabilities. To top it all off, we are not talking about some EA shovelware here -- we are talking about some of the current-gen's flagship titles! Which goes to show that in order to make a game with flagship-title-quality graphics, you often have to drop the resolution below this stupid HD standard. It is the competition for better graphics which drives developers to lower the resolution -- which gets us right back to my original article.

And let no one claim this is just a matter of lazy or incompetent developers -- granted, Bungie's graphics engines were always crap (all Halos were graphically outdated by at least one year before they even shipped), but Infinity Ward and Bizarre Creations know what they are doing. It's clear that these consoles, at least with the current software tools available, are not even capable of powering proper contemporary cutting-edge 720p games -- how much less so 1080p ones.

Which brings us to another very important but even more complicated subject. Forget about the Fake-HD games for a moment, and consider the real HD ones. Assuming that most 360/PS3 games are designed for a 720p resolution, what happens when you tell the console to display them at 768p or 1080p? Is the console really redrawing them at the higher resolution, or is it simply upscaling them -- that is to say stretching them?

This was the question my friend Recap asked me in the forum a while back, and the answer I gave him was that I was 100% sure that all games were redrawn. The reasoning behind my naive answer was simple: that's what happens in PC games. And if that's what happens in PC games, then surely the same thing must happen in these new-fangled consoles -- I gullibly assumed. In the PC world upscaling is unheard of, you understand -- if your computer can't handle a game at a given resolution you simply drop down to the next available one -- nobody in their right mind would think of stretching the image in order to fool himself. And yet this seems to be exactly what these consoles have been programmed to do, in order to fool -- who else? -- you and me, their prospective customers.

Because, you see, there has to be something fishy going on when Microsoft and Sony tell you that, for every game, you can pick between a range of resolutions (720p, 1080i, and 1080p) with no performance hit. I should have zeroed in on this earlier. I mean I kind of did, this giant question mark did enter my mind at one point, but I didn't follow through with it because I just couldn't bring myself to accept the magnitude of the fraud involved. The point here is that when switching between available resolutions in PC games there is always a performance hit -- the difference in the frame rate is always flagrantly discernible -- how much more so when we are talking about the gigantic leap between 1280x720 and 1920x1080. The only time you don't notice the difference is when you are playing some relatively ancient game, in which case your system is so overpowered that it ends up running the game at 100+FPS regardless of resolution.
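The size of that leap is easy to quantify: at 1080p the console has to fill 2.25 times as many pixels per frame as at 720p, so with everything else equal the per-frame rendering work more than doubles. A quick check:

```python
# Pixels per frame at each resolution, and the relative per-frame load.
pixels_720p = 1280 * 720     # 921,600 pixels
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
print(pixels_720p, pixels_1080p, pixels_1080p / pixels_720p)
# The ratio is exactly 2.25 -- no hardware delivers that for free.
```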

So the only way for Microsoft and Sony to be able to guarantee the same performance across all three resolutions is if the games were designed so as to never even come close to pushing the hardware. Of course this isn't happening, the games are indeed pushing the hardware, and ever more so as time goes by, so one would expect that there would be at least a few among the more graphically ambitious games which would just about hit, say, 60FPS at 720p, but which would crawl along at 5 or 10FPS at 1080p. But the games aren't crawling -- practically none of them are -- because they are not being redrawn. These consoles can't even handle COD4 at 720p; if you could tell them to really draw COD4 at 1080p, instead of stretching it, as they've been programmed to do, the game probably wouldn't even manage 10 frames per second. But of course such a result would be unacceptable for Sony and Microsoft. In the PC world people are expected to be smart enough to just lower the resolution by themselves if their system can't handle it, but in the consumer electronics space everyone is assumed to be a slightly slower version of Miss South Carolina whose head would explode like that guy in Scanners if a game started stuttering at a high res setting. So all games have to run at the same frame rate in every resolution, something which is technically impossible without the help of fraud.

So yeah, High Definition, huh? More like Fake High Definition.

Edited by Chronic
Read More on this Topic by Alex Kierkegaard from 2006


Chronic said...

Something I find interesting is that Gears of War, which many people consider to be the best looking Xbox 360 game, actually runs at 720p native. So really, the problem seems to be that not every developer can optimize their engine at every resolution. Halo 3 and Gears both run at 30 frames-per-second, but Gears really looks much better in comparison.
CoD4 runs at a stable 60FPS and I think that's one of the main reasons people have taken to it so well.

Personally I wish every game would run at 60FPS, more than having the highest 1080P+ resolution or craziest graphical effects. I remember hearing 3-4 years ago that every game on the PS3 would run at 60FPS @ 1080P native. What happened LOL. I guess as Kojima said, the system they got from Sony wasn't as powerful as the system they were promised.

umopapisdnpuaq said...

'Kierkegaard' is a really great word to say. But rather hard to rhyme so maybe that's why he was left out of Monty Python's Philosophy Song.

It is laughable how some of the old arcade games got 'HD'd up' and packaged into 360 Arcade games. The human eyes and ears are very very sensitive instruments, but the brain is tuned to miss out nearly all the detail that they pick up.

This is because it would overload it if we saw and heard at full capacity. It couldn't deal with it. So the brain is malleable and trainable and a series of slow but steady improvements makes a slightly bigger improvement seem so much more impressive because you think 'wow how is this even possible?'

Too big a leap - like going straight to 1080p - and people wouldn't appreciate it or have the right set-up to fully realise it. So it's just a marketing fight over the numbers. The key thing is that what the brain does is focus in on smaller parts in great detail, so you are better off having the details/effects there than having a slightly higher res all over.

The brain won't remember the number of lines but it will remember what was represented on them.

Mipam Thurman said...

The one thing I hate the most about "fake HD" is so-called bluray discs. 99.99% of all the DVDs that I put into my XBOX 360 or PS3 end up with enormous black lines on the bottom and top of the screen. Why am I getting a product that describes itself as 1920 x 1080 resolution? I actually work at a cable TV network so I know that films are not shot in 1080P so I guess I understand why the final product does not actually deliver what it says it will.

My real problem is why bother making a bluray version of the movie if you do not have a 1080P version on film? You just end up pissing off the consumer when you bait and switch. They should just make a regular DVD product and leave it at that.