Resolution
I understand what you're saying, Guy, regarding the 'resolution' of colour information within the PAL/NTSC system. These systems used the truncation of colour information as a sort of analogue video compression. All the digital video systems also use some form of sub-sampling for the colour information - 4:2:2 (Y:U:V) being the generally accepted standard, but 4:1:1 and 4:2:0 being adopted through the DV format - this is, after all, a form of data compression. The 'compression' or reduced bandwidth of the colour information has been implemented for a reason - the human perception of colour information, relative to the luminance information, is poor, and bandwidth savings which reflect this can be implemented.
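To put rough numbers on those savings, here's a quick Python sketch - the 720x576 frame size is just an illustrative assumption:

# Rough sketch: samples per frame under common chroma subsampling schemes.
# 4:2:2 halves the chroma horizontally, 4:1:1 quarters it horizontally,
# 4:2:0 halves it both horizontally and vertically.
W, H = 720, 576    # assumed SD frame size, for illustration only

schemes = {"4:4:4": (1, 1), "4:2:2": (2, 1), "4:1:1": (4, 1), "4:2:0": (2, 2)}

for name, (sx, sy) in schemes.items():
    samples = W * H + 2 * (W // sx) * (H // sy)    # Y plus Cb and Cr samples
    print(f"{name}: {samples:,} samples/frame ({samples / (3 * W * H):.0%} of 4:4:4)")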
The huge jump in visual quality, for me, is achieved through increased luminance resolution for HDTV - in the UK this is focussed on the 1920x1080i standard. The subjective quality when viewed on a native 1920x1080 screen is breathtaking, even with quite highly compressed footage.
Regards,
Ian
I wasn't talking about the color depth. I was talking about the spatial resolution of screen areas being rendered into different colors.
I know the PAL color encoding is better than NTSC, but not much better. In NTSC, the color information is encoded using a 3.58 MHz carrier. The color saturation is represented by the amplitude and the color hue is represented by the phase relative to the initial color burst for each line. Since there are only around 140 carrier cycles per visible line, this limits the very best color resolution to 140 "color pixels" per line.
But this performance (if you can call it that!) is only achieved by using a higher resolution source (like digital satellite or DVD) and using S-video output. If you actually encode the signal into a composite NTSC signal, it gets worse. Encoding to RF is even worse than that. (Like around 53 color pixels per line!)
Fewer than 140 color pixels per line looks okay as long as you have a small screen, sit at some distance, and have no small image objects (like text). These were all reasonable assumptions 40 years ago, but not today. This problem is the reason that color text is so unreadable on SD TVs: the colors bleed and smear into each other, and they create false color fringes where the luminance and color area boundaries don't line up.
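If you want to see that smearing for yourself, here's a rough Python/Pillow sketch that leaves the luminance alone but squeezes the colour of an image down to roughly 140 samples per line and stretches it back ("photo.png" is just a placeholder filename):

# Requires Pillow (pip install Pillow).
from PIL import Image

CHROMA_SAMPLES_PER_LINE = 140   # assumed best-case figure from the post above

img = Image.open("photo.png").convert("YCbCr")
y, cb, cr = img.split()

def squeeze(ch):
    # Down-sample the chroma plane horizontally, then scale it back up,
    # which smears colour boundaries much like a band-limited chroma channel.
    small = ch.resize((CHROMA_SAMPLES_PER_LINE, ch.height), Image.BILINEAR)
    return small.resize(ch.size, Image.BILINEAR)

limited = Image.merge("YCbCr", (y, squeeze(cb), squeeze(cr))).convert("RGB")
limited.save("chroma_limited.png")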
I do have a digital satellite set-top box sending S-video to my TV, so I can see the very best that is possible with NTSC encoding. It looks pretty good, until they put some text on the screen. If I switch to OTA HD to see the same image, the text edges are perfect from 2 feet away from the screen.
I think "Micro" displays are capable of displaying exactly the same number of color pixels as luminance pixels (they have no black/white-only pixels).
So if you have a 1920 x 1080 TV you could display that many discrete color pixels. All you need is a source that has that many.
Now whether the video processing (engine) will pass that many pixels is a different question; I was speaking only of the raw display capabilities.
If you drive a good monitor with a VGA signal that matches the native resolution of the monitor, then you will get color resolution identical to luminance resolution. The VGA interface does not even have a luminance signal, just red, green, and blue levels.
With an HDTV that has a native pixel resolution of 1920 by 1080, you might be able to drive it with a VGA signal from a computer or a high density DVD player, and get the same color resolution. But if you drive it with an Over The Air HDTV signal, you will get less. OTA HDTV formats contain one set of color information for each 2 by 2 or 2 by 4 block of luminance pixels. So the total number of color pixels is 1/4 or 1/8 the number of luminance pixels.
So for 1080i format you would get 259,200 or 518,400 color pixels, compared to about 12,720 on an OTA SD NTSC image. That is why HDTV looks so much better: 20 to 40 times as many color pixels on the screen.
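For anyone who wants to check the arithmetic, here's a quick sketch. The 2x2 / 2x4 chroma blocks and the ~53 colour samples per line come from the posts above; the 240 chroma lines is an assumption chosen to match the 12,720 total quoted:

luma_hd = 1920 * 1080                  # 2,073,600 luminance pixels
chroma_2x2 = luma_hd // 4              # one colour sample per 2x2 block -> 518,400
chroma_2x4 = luma_hd // 8              # one colour sample per 2x4 block -> 259,200

chroma_ntsc = 53 * 240                 # ~53 colour samples/line x ~240 lines -> 12,720

print(chroma_2x2, chroma_2x4, chroma_ntsc)
# roughly the "20 to 40 times" quoted above
print(f"{chroma_2x4 / chroma_ntsc:.0f}x and {chroma_2x2 / chroma_ntsc:.0f}x the colour pixels of SD NTSC")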
DVB uses MPEG-2 compression, implementing a 4:2:2 chroma subsampling system. There is a good explanation here:
http://en.wikipedia.org/wiki/4:2:0
Well, HDTV is not all that expensive.
I just bought a Philips Pixel Plus TV for Rs. 23500 (< $500 USD).
This is a 29" CRT that supports 1080p.
Only downside is that you don't get an HDMI or DVI input.
It has a high-def Y,Pb,Pr input. But the quality is just outstanding.
Amazing picture clarity even at 480p standard DVD input.
It rocks at 100Hz or progressive scan for normal SD signals as well!
Follow this link for the specs:
http://www.consumer.philips.com/con...pe=CONSUMER&productId=29PT8836_94_IN_CONSUMER
Bargain?
If anyone is keen to take the HiDef plunge ... I think the Toshiba 42WLT66, available online at Comet for GBP1299, is a bit of a bargain.
This is a full 1080i LCD screen, and currently cheaper than many inferior specced models.
Regards,
Ian
DVB uses MPEG-2 compression, implementing a 4:2:2 chroma subsampling system.
DVB SD uses MPEG2 compression, and it can also be 4:2:0.
DVB HD can be in MPEG2, but typically (in order to save bandwidth) it will be H.264 (MPEG-4).
I'm not entirely certain what the luma/chroma ratio is, but it's probably 4:2:2.
I know the STB chips capable of decoding HD often use 4:4:4 frame buffers for OSD.
But, to me also, one of the greater benefits, which is not marketed much (probably because it is harder to do than greater resolution) is the greater colour resolution.
H.264
Quite correct, Phil, although I believe both Sky HD and the BBC HD test transmissions are using the MPEG4/H.264 AVC running at 4:2:0.
Full specs for the uninitiated (although it's pretty heavy going) can be found here: http://en.wikipedia.org/wiki/H.264
Regards,
Ian
Isn't most HD still encoded in only 10 bits per colour sample though? Same as SD?
I'll have to check what our HD broadcast encoder uses, I've only dealt with our SD encoders in detail.
In the UK, I think it's MPEG4/H.264 'High Profile' (HiP) 4:2:0 at 8 bits per sample - the same bit depth as SD.
I don't believe anything above 8bit will be implemented for the domestic decoder.
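For a feel of what going above 8 bits would cost, here's a rough sketch of uncompressed 1920x1080 4:2:0 frame sizes at 8 and 10 bits (purely illustrative):

# Uncompressed frame size for 1920x1080 4:2:0: one Y sample per pixel plus
# quarter-resolution Cb and Cr, i.e. 1.5 samples per pixel on average.
W, H = 1920, 1080
samples_per_frame = W * H * 3 // 2

for bits in (8, 10):
    megabytes = samples_per_frame * bits / 8 / 1e6
    print(f"{bits}-bit 4:2:0 frame: {megabytes:.1f} MB uncompressed")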
Regards,
Ian
If you've never seen it, it's very easy to arrange a 'chroma only' display on a standard set. The lack of definition is then very obvious, but when integrated with the luma information an optical illusion takes place which increases the apparent detail (I guess the brain defines the colour boundaries according to the B/W picture, using boundary-seeking algorithms). It's sometimes easy to forget just how much video depends on the brain's expecting to see a 'normal' moving image. HDTV reduces that dependency somewhat, and therein lies its benefit.
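If your set doesn't offer that trick, a rough software equivalent (just a sketch, not how the set does it) is to hold the luma at a constant mid grey and keep only the colour planes - "frame.png" is a placeholder filename; requires Pillow:

from PIL import Image

img = Image.open("frame.png").convert("YCbCr")
y, cb, cr = img.split()

flat_y = Image.new("L", y.size, 128)          # constant mid-grey luma
chroma_only = Image.merge("YCbCr", (flat_y, cb, cr)).convert("RGB")
chroma_only.save("chroma_only.png")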
The lack of definition is then very obvious, but when integrated with the luma information an optical illusion takes place which increases the apparent detail
I could characterize NTSC as being somewhat the same... the chroma channels, if viewed separately, don't look very detailed, but bring in the luminance and it adds the snap to the image.
IMHO, so far HDTV has been a step backwards in broadcast image quality. Overall the image looks better on initial viewing, but after viewing a while you see how it begins to fall apart. 720p definitely wins as the better of the two. They still have a long way to go with this stuff though.... When we actually start broadcasting 1080p then they may be on to something good.... That, I'm afraid, is a loooong time away.
Mark
The lack of definition is then very obvious, but when integrated with the luma information an optical illusion takes place which increases the apparent detail (I guess the brain defines the colour boundaries according to the B/W picture, using boundary-seeking algorithms).
This is true. We were always given the rather corny analogy of trying to distinguish the exact colour of thread from a single width of it - quite difficult for the human optical system to discern this. Only when multiple widths are laid side by side (e.g. on a bobbin) does the colour become clear.
This has always been a 'useful' weakness in the human perception of colour for the engineer to exploit. Using the 4:2:2 or 4:2:0 subsampling system provides useful savings in digital data rates - data compression in the area least noticeable to the human eye.
This effect was also exploited in the PAL and NTSC analogue colour systems where the colour bandwidth was severely restricted (to about 1 MHz). Again useful analogue data 'compression'.
Interestingly, I believe the first 'data' compression system introduced in television was field interlace - effectively allowing double the picture rate (50Hz UK, 60Hz US) in the same analogue (RF) bandwidth as 25 or 30 full frames per second.
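A quick sketch with the nominal UK 625-line numbers (576 active lines, 25 full frames per second) shows the trade: the line rate - and hence roughly the bandwidth - is unchanged, but the picture update rate doubles:

# Interlace sends half the lines per picture but twice as many pictures per second.
for name, pictures_per_sec, lines_per_picture in (("576p25", 25, 576), ("576i50", 50, 288)):
    print(name, pictures_per_sec * lines_per_picture, "lines/s,", pictures_per_sec, "pictures/s")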
Regards,
Ian
When we actually start broadcasting 1080p then they may be on to something good.... That, I'm afraid, is a loooong time away.
Maybe not. Research by the BBC (I think) suggests that better picture quality at lower data rates can be achieved by adopting 1080/50p instead of 1080/50i. It would appear that having to accommodate the interlaced structure within the data compression algorithm is now acting against the very reason it was introduced.
Regards,
Ian
Hi,
this all seems to be falling into place.
I started out thinking that 1080 must be better than 768 lines.
Then realised that 1080i when divided by 1.6 effectively gives a similar resolution to 768 for moving pictures.
Now that the (software) producers are picking up that 1080p is better than 1080i, and that more programming may become available at 768p and 1080p, it looks like waiting a year or two for this (new to the UK) technology to settle down may make the decision easier - once the HDTV, the software and my bank account are all in synchronisation.
I shall stay on the sidelines in the meantime.
Thank you all for your expertise and links (even the complicated ones).
IIRC Sky and the beeb are going to use 1080p25 for most HD and 720p50 for sports.
The datarate for raw 1080p50 is just silly!
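A rough back-of-envelope sketch of just how silly, assuming 8-bit 4:2:0 for broadcast-style coding and 10-bit 4:2:2 for the studio side:

W, H, FPS = 1920, 1080, 50

for label, bits_per_pixel in (("8-bit 4:2:0", 8 * 1.5), ("10-bit 4:2:2", 10 * 2)):
    mbit_per_sec = W * H * FPS * bits_per_pixel / 1e6
    print(f"{label}: ~{mbit_per_sec:,.0f} Mbit/s uncompressed")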
But in reality it's all rather academic, as it's put through the MPEG grinder and then stuffed down as little bandwidth as they can get away with, to maximise profit by cramming in more channels.
I started out thinking that 1080 must be better than 768 lines.
I did too. Just watch a football game and the difference is immediately obvious. There are good things about each of the formats in use in the U.S., but fast-motion 1080i/60 falls apart immediately into a mess of pixelization. The US 720p/60 system is vastly superior to 1080i and does not fall apart into pixelization like the 1080i system does.
If you think the bit rate for 1080p is crazy, the bit rate for the 4K Sony digital cinema projector is around 360gb per second!! 2K digital cinema is a cool 26GB per second max running JPEG 2000. Both use variable bit rates, and the rates I posted are the maximum rates. The Sony requires 8 HD-SDI interconnect cables while 2K DC requires just two.
Mark