Lord Von PS3 and I went back and forth on this Sub-HD thing. I’ve come to the conclusion that since the games NEVER run in their non-standard resolutions, why bother pointing out that they are sub-HD or non-standard? If it’s on my screen and the resolution says 720p or 1080p, it’s in HD. I’ll leave it at that. Enjoy the read, Lord Von don’t play!
From: Lord Von PS3
Subject: Re: For your consumption
My respects. I’m glad you liked the article; I thought it elegantly highlighted how developers encounter different problems on different hardware and employ different work-arounds. Neither the X360 nor the PS3 version is ideal.
As for this ongoing ramble you Warzoners are having about “sub-HD”, I found your recent podcast (and Hip Hop’s EDTV mantra) to be entertaining yet again. I cannot agree with your idea of sub-HD, but at least I don’t say that without reason or evidence…
One thing you must learn about Lord Von is that he always has a reason.
The meaning of “sub-”
(Take your pick).
The definition of “HD Ready” by the official body responsible for labeling (EICTA).
Click here for the PDF released by the EICTA. (You want page 4).
The label “HD ready” is awarded to display equipment capable of presenting HD sources with a much higher resolution than standard PAL (576i) and meeting all the requirements detailed in section 4 below.
4. Requirements for the label “HD ready”.
A display device has to cover the following requirements to be awarded the label “HD ready”:
1. Display, display engine
• The minimum native resolution of the display (e.g. LCD, PDP) or display engine (e.g. DLP) is 720 physical lines in wide aspect ratio.
2. Video Interfaces
• The display device accepts HD input via:
o Analogue YPbPr, and
o DVI or HDMI
• HD capable inputs accept the following HD video formats:
o 1280×720 @ 50 and 60Hz progressive (“720p”), and
o 1920×1080 @ 50 and 60Hz interlaced (“1080i”)
• The DVI or HDMI input supports content protection (HDCP).
FYI, for something to be classed as HD Ready 1080p, it has to support the higher standard and I quote (EICTA)…
“HD ready 1080p. Designed for display devices – including integrated digital TVs, monitors and projectors – that can in addition to 720p and 1080i also accept, process and display High Definition 1080p signals.”
It is clear what “sub-HD” means. For anything to be awarded the “HD Ready” sticker, it must comply with the official definition, and anything below the official definition of HD can be classed as sub-HD. The 576p that you stated below is PAL EDTV & the p means progressive. You may not like it, but I’m telling you that’s how it is and that’s how it’s gonna continue to be, sunshine!
Lord Von PS3
2009/10/9 Torrence Davis
I now use non-standard HD resolution instead of sub-HD. Sub, meaning below, would mean ED and SD. I can’t call something that is higher than ED sub-HD. So I use non-standard HD resolution for games that can’t play on ED or SD monitors but CAN play on HD monitors, yet aren’t at the 720p / 1080i / 1080p standard resolutions. That’s the only thing that makes sense to me. 640p is HD, but it’s non-standard. ED and SD are sub-HD.
Can’t wait for your response.
From: Lord Von PS3
Subject: Re: For your consumption
I remember my first PC had an integrated Intel graphics chipset that was capable of 1280×1024 at 4bit (16 colours). It also did 1192×900 at 8bit (256 colours), 1024×768 at 15bit (32768 colours) and 800×600 at 24bit (16M colours). The IBM SVGA monitor I owned wouldn’t go above 1192×900 (N.B. A sub-HD resolution) even though the graphics card could.
Now I’m cc’ing Hip Hop – because the following is relevant to him (and I enjoy listening to you two argue on the Warzone).
EDTV is better than SDTV in that EDTV provides a progressive-scan picture: 576p (PAL) / 480p (NTSC). EDTV pictures have as many lines as SDTV, but the difference is that with EDTV the picture isn’t interlaced, and so of course it is much better quality. On a big screen it certainly looks closer in quality to HD than to SD. (Compare 240 *SDTV* alternating horizontal lines with 480 or 576 *EDTV* horizontal lines against 720 *HDTV* horizontal lines.) If you could rewind the clock and get an SDTV & EDTV & HDTV side by side by side and compare the image quality, you would see the difference is biggest from SDTV to either EDTV / HDTV.
This improvement in image quality may be why Hip Hop Gamer mistakes his EDTV image for an HDTV image. Many sub-HD X360 and PS3 games have native resolutions of 600, 630 or 640 horizontal lines. Some even less (e.g. Ninja Gaiden 2 on X360, or Haze). I ask you to remember previous e-mails & BitBag conversations about pixel counting, where it was discussed whether it is easy / hard to tell the difference between a sub-HD native image upscaled to 1080p and an actual 1080p (1920×1080) image.
For Hip Hop, a game like GTA4 (630 horizontal lines) won’t look amazingly different on an EDTV versus an HDTV. You’re comparing 576p (if PAL) – or 480p (NTSC) – with 630 lines (upscaled to 720 lines) on an HDTV. The game is running natively at a sub-HD resolution, and Hip Hop is looking at a sub-HD game on an EDTV rendering at 480p or 576p. Does it look massively different from what you see on a proper HD (720p) screen? Not if you don’t know what you’re looking for.
So to your specific points in reverse order.
3. ED and SD are sub-HD. I agree. This is a fact.
2. Why are you saying that a game supporting HD resolutions (720p / 1080i / 1080p) cannot be played on an ED / SD monitor? ALL video games MUST support 480i / 480p / 576i / 576p (SD / ED). They can all do this more easily than they can HD! If you go play Uncharted: Drake’s Fortune in SD / ED (say 576p) on an older TV, you’ll notice how there’s no tearing to be seen. It isn’t an HD picture, but it is a bit slicker. Would you prefer a resolution drop if it got rid of the tearing? Hmm.
1. Sub-HD means anything less than the HD standard.
Where *the game* is running at 480i / 576i, use SD.
Where *the game* is running at 480p / 576p, use ED.
Where *the game* is running at a specified resolution higher than SD / ED on an old school computer monitor (like my old IBM) use SVGA, or “high res”.
Where *the game* is running at 720p / 1080i / 1080p, use HD.
Where *the game* is running at a resolution < 480i / 576i, use sub-SD. Where *the game* is running at a resolution < 720p / 1080i / 1080p, use sub-HD.

People class games as sub-HD because even though the PS3 / X360 outputs to their HDTVs at 1080p / 1080i / 720p (i.e. HD!) – the *game* has a native resolution below a minimum defined standard HD resolution (1280×720 = 720p, or 1080i, or 1080p).

People should not say a game has 600p or 640p graphics… The ‘p’ means progressive, not pixels. That ‘p’ applies to a 480p, 576p, 720p or 1080p signal, NOT to the 600 (p)ixels of vertical resolution that are rendered by the game, which may then be hardware up-scaled to 720p, 1080i or 1080p. People can’t say a game has a 600p resolution – because there’s no given or standard reference for the number of vertical lines (the horizontal resolution). There is no definition for a 600p resolution.

Faithfully, Lord Von PS3.

2009/10/10 Torrence Davis
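A minimal sketch of the labelling rules above, as one might code them up. This is an illustration only: it classifies by native frame size alone and sets aside interlaced vs progressive, which the rules do distinguish.

```python
def classify(width, height):
    """Label a game's *native* render resolution, not the console's output."""
    if (width, height) in {(1280, 720), (1920, 1080)}:
        return "HD"             # the defined standard HD frame sizes
    if height in (480, 576):
        return "SD/ED"          # NTSC / PAL line counts
    if height < 480:
        return "sub-SD"
    if width < 1280 or height < 720:
        return "sub-HD"         # below the minimum defined HD standard
    return "non-standard"

print(classify(1152, 640))   # GTA4's native frame -> "sub-HD"
print(classify(1280, 720))   # -> "HD"
```

Note that classify(1152, 640) comes back "sub-HD" even though the console outputs that frame at 720p / 1080p – which is exactly the distinction being argued here.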
Sub-HD just doesn’t sound right. It doesn’t work for me. Not at all. Oh and btw, when I say a 640p game can’t be played on an SD TV, I mean the 640p version of the game, not the downsized 480p version. I just can’t come to terms with the term sub-HD. You got me thinking though. Why do people care about native resolutions so much? ALL 360 games upscale to whatever the hell you set your Xbox to. It could be 720p, 1080i or 1080p. I’ve never seen COD4 running in its native resolution, have you? Graphic whores started this shit. It doesn’t really matter at all, and we’ve been debating it for over a month now.
Wow, just wow.
Always appreciate your email.
On Oct 10, 2009, Lord Von PS3 wrote:
Well, “sub-HD” is your problem; there’s a good reason it’s out there & for it to be used.
A game that is rendered at 1152×640 like GTA4 has to be upscaled to 1280×720.
If you were an EU / PAL gamer, you might be fooled into counting the difference.
* 640 – 576 = 64 horizontal lines for PAL.
* 640 – 480 = 160 horizontal lines for NTSC.
* 720 – 640 = 80 horizontal lines.
If you’re an EU / PAL gamer and you’re counting the number of horizontal lines, GTA4 is closer to SD (well, ED really) than it is to HD!
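The line-counting above, run as a quick sanity check (1152×640 is GTA4’s native resolution as given in this exchange):

```python
# GTA4 renders 640 native horizontal lines; compare against the standards.
native = 640
pal_ed, ntsc_ed, hd = 576, 480, 720

print(native - pal_ed)    # 64 lines above PAL ED (576p)
print(native - ntsc_ed)   # 160 lines above NTSC ED (480p)
print(hd - native)        # 80 lines short of 720p
# For a PAL gamer, 64 < 80: GTA4 sits closer to ED than to HD.
```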
There’s A LOT of people out there who think that way, & many games this gen have a horizontal resolution below 1280 pixels (for 720p), which is more important than the vertical resolution for many game types. In saying that, racing games obviously benefit from a high vertical res, as you’ll get a better perception of distance / depth, which you need to see a corner coming.

PAL SD (thanks to its 4:3 aspect ratio) has a horizontal resolution of 720 pixels. 1152 (GTA4) is way more than that – but then it is 16:9. That’s basically why people don’t argue the horizontal resolution. You just can’t; there’s almost nothing else wide-screen to compare this generation to.

At the same time, the fact that horizontal resolution is so important just serves as another reason why 600p, 630p, 640p & all these sub-HD terms that people use are worthless! There’s no such thing as a 600p game – because there’s no defined resolution for 600p. 600 × what exactly? 640 pixels by how many? Can you tell me? Can anyone? No. Native horizontal resolutions can change depending on the game. If there’s a game that doesn’t have 720 horizontal lines – chances are it may not have 1280 vertical lines either (more so on PS3). It’s daft that people class games as 600p, 640p, etc. It means nothing. People need to start quoting the full native resolutions of games if they’re going to compare, or they can shut up! Developers could help out by revealing their native resolutions, but they’re never going to do that.
As for why we have all this… Well, sorry, but I’m going to have to blame the fan boy wars for this one. Last generation, Sony fans were probably too arrogant. Now Microsoft has a new set of fans + Turn 10 (that Greenawalt!) + Microsoft – who are all doing the same in reverse. Many gamers simply want to know the native resolution given Sony’s “hype” about proper 1080p (1920×1080) gaming at the start of the generation. Many X360 fans get on Sony’s case and try to continually prove that Sony “lied”. X360 fans love to repeat over and over that the X360 GPU can push more pixels around than the RSX (which, to be fair, it can), and if you’ve got great textures you’ll want a resolution that’s also great to show them off.

What Sony are proving this generation is that games aren’t all about resolution. Sure, the CELL processor is helping out with graphics effects such as those in SF4, Killzone 2, Uncharted 2, Tekken 6, Ninja Gaiden Sigma 2, etc, but with Sony’s newer CELL APIs & better use of the SPEs, things like complex rag doll physics models can be calculated in under 2ms. If a game has to run at 60fps, chances are the Xenos GPU will do that better – not in every case – but in many cases (depending on the complexity of the game). If a game runs at 30fps, the PS3 has more CPU muscle and has the potential to “win” provided the game has a good developer – which is essential for most PS3 games.
As for playing CoD4 in its native resolution – it always is. The back buffer where the scene is rendered stays the same for CoD4. Where up-scaling has to be done, if the back buffer size is “non-standard” it must be up-scaled to the front buffer, which must have a hardware-scaler-compatible resolution. This up-scale is done in software. The hardware scaler then up-scales the front buffer to 720p or 1080p when outputting to the display. All upscaling does is help fill your TV screen with something better to look at than big black borders!
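The two-stage chain described above can be sketched with some quick arithmetic. The buffer sizes here are assumptions for illustration (1024×600 is CoD4’s widely reported native back-buffer resolution; a 1280×720 front buffer is my example of a scaler-friendly size):

```python
# Sketch of the software -> hardware up-scale chain described above.

def scale_factors(src, dst):
    """Horizontal and vertical scale factors needed to go from src to dst."""
    (sw, sh), (dw, dh) = src, dst
    return (dw / sw, dh / sh)

back_buffer  = (1024, 600)    # the scene is always rendered here (native)
front_buffer = (1280, 720)    # software up-scale to a scaler-compatible size
display      = (1920, 1080)   # the hardware scaler fills the output mode

print(scale_factors(back_buffer, front_buffer))   # (1.25, 1.2)
print(scale_factors(front_buffer, display))       # (1.5, 1.5)
```

Either way, the back buffer never changes – which is the point being made: the native image is all you ever get, just stretched to fill the screen.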
Lord Von PS3.