Why?

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
Why is it that whenever I turn on my HD digital TV, I'm always amazed at the resolution compared to the analog days?
Why is it printed on the reverse of my CD, The Wall by Pink Floyd: "The sound of the original recording has been preserved as close as possible, however due to its high resolution, the compact disc can reveal limitations of the source tape"?
Why is there now a complete flip-flop in certain quarters that finds some sort of purity of sound in old record players and charity-shop vinyl?
 
WHY?

To me, HD TVs and a lot of Blu-ray discs (especially those made from classic films; Apocalypse Now comes to mind, as I recently saw its new edition) seem unusual and unnatural... bordering on alien. Certainly not what I would call "lifelike"... at least according to my vision/brain.

Do I think it’s because I perceive what is “less resolution” as more accurate? I don’t believe so.

There’s something in the upsampling / remastering / decoding that isn’t quite right.

4K resolution of a bad photograph is still a bad photograph. The increased resolution is fairly useless. I think capturing realism is a more multifaceted issue.

Some TV brands or models are less offensive than others for me. Sony and LG, for example, I've found to be more "natural" than Samsung. I'm not sure what technology differences account for it; these are just casual observations from shopping for a TV for my home.

I’m not saying all vinyl is superior to CD or making any grandiose or blanket claims. I’m not advocating “quantum” power conditioners. I’m simply saying that the path to realism is not linked only to the high resolution of the playback format.
 
Our Samsung TV came with a calibration CD-R required to get it to produce far more realistic colours than it does with its default settings. The calibration CD-R was not provided by Samsung, but by the shop that sold the TV. You never needed anything like that when televisions still had cathode ray tubes.

We mainly use the TV to watch standard definition programmes, by the way. The lower resolution doesn't bother us at all.
 
To me, HD TVs and a lot of Blu-ray discs seem unusual and unnatural... bordering on alien. Certainly not what I would call "lifelike". [...] I'm simply saying that the path to realism is not only linked to high resolution of the playback format.

I noticed this when I bought a new 4K TV last year (Samsung Series 6 40" smart TV). There was something unnatural about the motion; the picture had amazing clarity, but something was wrong. I got used to it and now I don't see it anymore. Interesting that I was probably not imagining it.
 
I consider standard definition to be poorer than old analog, while high definition does a good job of recreating it. 4K Ultra HD is something else!

The CD statement must be marketing hype! I'd happily live with the so-called limitations of the source tape.

The interest in old record players and old vinyl has more to do with nostalgia or fashion than purity of sound. However new vinyl on a quality turntable? That's a different matter!
 
I bought a new 4K TV last year (Samsung Series 6 40" smart TV). There was something unnatural about the motion; the picture had amazing clarity, but something was wrong.

Two pre-Christmas sales ago I got a cheap ($229) Hisense 43 inch 4K TV at Walmart. I use it as a computer monitor, and it is not connected to an antenna or cable. Some of the 4K content on YouTube is stunning in its realism. Pictures and video that I have taken on my own equipment look lifelike and realistic.

Last year's pre-Christmas sale got me a $300 Samsung Series 6 40 inch 4K TV, again not connected to any live TV source. As you stated, at first I preferred the Hisense, primarily because the Samsung was different. The two TVs are near each other but connected to different PCs. You can see both easily enough, so I ran the same material through both TVs and tweaked the Samsung to get close to the Hisense in picture. The Samsung has some "motion" features to simulate "120 Hz" on a 60 Hz panel. Turn them off; it will sharpen up the picture, and 60 Hz is fast enough unless you are a serious gamer, which I am not.

After using them both for several months they have swapped positions, and I now prefer the Samsung for video editing. Part of this could be due to the new video card feeding the Samsung: a budget GTX 1050, which can do 4K at 60 Hz. The Hisense is on an older PC that uses the "HD 4600" graphics built into the Intel processor. The Core i5-4670K isn't even specified for 4K, but it will do 4K at 30 Hz just fine.
 
I did tests recently capturing VHS video tapes on my computer, and I can tell you from my experience that H.264 (MPEG-4 Part 10) does a poor job of capturing the fine grain detail in a VHS tape, even at 7,500 kbps (a very high bitrate for D1 resolution, 720x576). In fact, after thorough A/B comparisons I'm settling on archiving everything as MPEG-2, or just plain leaving it in MagicYUV (lossless YUV 4:2:0).

This is with a 6 head VCR. More info can be found here:
Whats the LAST movie you have watched?


It should also be noted that I've decided to archive the video in MPEG-2 instead of H.264, for the very reason that the fine video detail disappears when encoding in H.264.


I believe the entire HDTV craze is nothing more than trickery involving sophisticated digital noise reduction embedded in the codec itself. As others have said in this thread, 480p and 720p can look worse than old analog video. This is true, very true, and all of the blame lies with the fact that modern compression technology is driven by the economics of broadcasting and streaming. In the old days, analog television had a significantly larger amount of bandwidth to work with; in the VHF era, under good conditions with a good rotatable antenna, it could surpass digitally encoded 480p/576p in video quality.

We've all been deceived. Analog video is truly amazing once you've gotten the hang of it. I should know: I used to watch analog satellite TV. And remember that there were analog HD formats out there: MUSE/Hi-Vision on LaserDisc, and Extended Definition Betamax:
Betamax - Wikipedia

I personally would love to see a consumer-grade analog HD videotape or disc standard come onto the scene. Just like the vinyl and cassette tape resurgence currently going on, we could do the same for video.

The only other alternative would be to push those MPEG-2 bitrate sliders all the way to 15,000kbps and/or encode everything in lossless MagicYUV. MPEG-2 and H.264 are for sure not the answer to keeping all of the detail in a video. We need a much better codec.


If it takes 30GB of hard drive space to record just 1 and a 1/2 hours of analog video tape footage with all of the grainy detail intact, lossless, then something is terribly wrong with our way of thinking regarding preserving anything from the analog era. You might as well just stabilize it and copy it back onto analog tape for all the good we are doing.
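As a rough sanity check on that 30 GB figure, here is a sketch of the arithmetic, assuming PAL D1 (720x576), 4:2:0 chroma subsampling (1.5 bytes per pixel), and 25 fps, as quoted above:

```python
# Sanity check on the "30 GB for 1.5 hours" lossless figure.
frame_bytes = 720 * 576 * 3 // 2              # 622,080 bytes per 4:2:0 frame
raw_rate = frame_bytes * 25                   # ~15.6 MB/s uncompressed
raw_total_gb = raw_rate * 1.5 * 3600 / 1e9    # raw size of 1.5 hours
ratio = raw_total_gb / 30                     # implied MagicYUV compression ratio
print(f"raw: {raw_total_gb:.1f} GB, lossless ratio: {ratio:.1f}:1")
# raw: 84.0 GB, lossless ratio: 2.8:1
```

So 30 GB implies MagicYUV is getting roughly a 2.8:1 lossless ratio on this material, which sounds plausible for grainy tape sources.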


After all storage is cheap these days, as they keep on saying.
 
analog television had a significantly large amount of bandwidth to deal with and in the VHF era

At least here in the US, analog TV on VHF or UHF and modern digital TV on either band all get 6 MHz of spectrum space for each RF channel.

An analog NTSC stream in the US had 525 lines, of which 486 actually carried video. The analog modulation of each line was limited to roughly 4.2 MHz of signal bandwidth, with a chunk taken out of the upper end for the color information. This supported about 700 "pixels" per line of nearly infinite shading under IDEAL conditions, or roughly 340,000 "pixels" of analog information per TV frame, updated at an effective rate of 29.97 Hz. Other popular TV formats do have more lines, with 625 being used in PAL.

A 1080 HD video stream, however, has 1080 lines by 1920 pixels. Each pixel carries 8 bits of information for each of the 3 primary colors, or 24 bits per pixel. This creates over 2 million 24-bit words per frame, so of course some form of compression is needed... and as you state, that's where the problem lies.
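To put numbers on that, here is a quick sketch; the 19.39 Mbit/s figure is the standard ATSC 8VSB payload of one 6 MHz channel:

```python
# Uncompressed 1080p30 versus what a 6 MHz ATSC channel can carry.
pixels = 1920 * 1080                      # 2,073,600 pixels per frame
bits_per_frame = pixels * 24              # 8 bits x 3 primary colors
raw_mbps = bits_per_frame * 30 / 1e6      # ~1493 Mbit/s at 30 fps
ratio = raw_mbps / 19.39                  # compression needed to fit OTA
print(f"raw 1080p30: {raw_mbps:.0f} Mbit/s -> needs ~{ratio:.0f}:1 compression")
```

Roughly 77:1 compression just to squeeze a single uncompressed 1080p stream into one over-the-air channel.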

It is up to the individual broadcaster or cable company how they use their bandwidth. There are multiple standard choices. In the US, an over the air TV station can devote the entire 6 MHz channel to a single 1080P TV stream. Many of the big city TV stations do exactly that, and the picture quality is quite good. You have to critically watch a well produced video to find the compression artifacts.

Here in the middle of nowhere we have two over the air TV channels. OTA channel 9 stuffs a 1080i network stream, a 720P network stream, and a 480i stream through their 6 MHz allocation. Compression is quite obvious. OTA channel 7 stuffs a 1080i network stream, a 720P network stream, and TWO 480i streams through their 6 MHz allocation. The picture quality......
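A sketch of why channel 7's picture suffers: four streams share one fixed ~19.39 Mbit/s multiplex. The weights below are illustrative guesses, not the station's real allocation:

```python
# Dividing one ATSC 8VSB payload among subchannels (illustrative weights).
payload = 19.39  # Mbit/s in a 6 MHz channel
streams = {"1080i": 4, "720p": 3, "480i #1": 1, "480i #2": 1}
total = sum(streams.values())
for name, weight in streams.items():
    print(f"{name}: {payload * weight / total:.1f} Mbit/s")
```

Under that split the 1080i stream gets only ~8.6 Mbit/s, generally considered tight for MPEG-2 1080i, which matches the obvious compression artifacts.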

The cable company here stuffs about 300 TV streams, internet and phone service down cable that is over 20 years old. Some of their video streams make the OTA channels look good. They seem to allocate more bandwidth to the most watched channels, so if you don't watch the mainstream crap, and prefer some of the less popular channels, you get to watch obvious pixelization and occasional breakup of the entire screen.

It's all about the money......more streams per MHz equals more advertising revenue.
 
Our Samsung TV came with a calibration CD-R required to get it to produce far more realistic colours than it does with its default settings. The calibration CD-R was not provided by Samsung, but by the shop that sold the TV. You never needed anything like that when televisions still had cathode ray tubes.
You must be quite the newcomer to say so, or never involved in the TV servicing trade.

Calibration was VERY MUCH needed in Analog TVs, doubly so because adjustments "shifted" over time.

In the early '80s I built an Elektor version of this:

[image: TG-5100.JPG]

which was light years more advanced than what I had inherited from an old TV tech, the Argentine version of:

[external image, no longer available]

I would have loved this one, but importing it was too complicated/expensive:

[image: s-l300.jpg]

They were most necessary to turn this (if you were lucky; it might have been way worse):

[image: maxresdefault.jpg]

into something like this:

[image: a1003f0d792087f00508f9dbac39d78e.jpg]

:D
 
I’m simply saying that the path to realism is not only linked to high resolution of the playback format.

Film is art; "touching up" the cinematographer's work is no different than colorizing classics like The Maltese Falcon. I find most of what's going on in video (re: films) these days horrifying and unwatchable except as entertainment. Film also has a noise floor with somewhat of a stochastic resonance effect, which is destroyed by the noise-reduction blocking effects of compression. The 4K TV's increased color gamut (which adds unnatural colors at will) and even more DSP only makes things worse IMO.

Even rarer than audiophiles into master tapes are film buffs into actual film, usually 16mm prints (anamorphic versions do exist).

I disagree about CRTs; edge-to-edge convergence was often a nightmare, and the color gamuts were never a perfect match. I never owned anything but Sony Trinitrons, which did the best IMO. That and a high-end VCR were pretty good together (remember, 1980s tapes were often $75 or more to buy and ~$5 a day to rent). Limited run time hurt Beta, but there were never as many pre-recorded titles available and of course virtually no porn.

AFAIK the studios archive films as individual uncompressed frames, and as of 10 years ago were seeking a vendor to make a hundreds-of-MHz tape player.
 
You must be quite the newcomer to say so, or never ever involved in the TV Servicing area.

Calibration was VERY MUCH needed in Analog TVs, doubly so because adjustments "shifted" over time.

I'm old enough to remember late-1970s/early-1980s Philips colour televisions with a panel of dozens of trimmable devices (mainly inductors) to trim out all kinds of aberrations in the picture, but a normal user was not supposed to touch any of those. They came properly calibrated from the factory, and the only thing the user had to do was reduce the colour saturation.
 
I find most of what's going on in video (re: films) these days horrifying and unwatchable except as entertainment. Film also has a noise floor with somewhat of a stochastic resonance effect, which is destroyed by the noise-reduction blocking effects of compression. The 4K TV's increased color gamut (which adds unnatural colors at will) and even more DSP only makes things worse IMO.

The OP asked why there was a quality disclaimer printed on his CD. Perhaps, in regard to watching films, a similar rider should be attached to a 4K TV.

"The quality of the original image has been preserved as close as possible, however due to its high resolution, the 4K system can reveal limitations of the source film."

If such a disclaimer can be applied to digital CD, then it can equally well be applied to digital TV!
 
New TVs need to be adjusted to look right. Of course CRT TVs needed calibration, but with NTSC ("never the same colour twice") it only mattered so much, and it drifted anyway. Where I live the OTA HD channels look great (and sound good), and no one would say they're worse than NTSC. If you want to see full-quality 1080 HD, get Blu-ray and a good-looking movie (The Dark Knight): 50 gig for 3 hours. As far as motion goes, good analog TV was usually shot on film at 24 fps and then (for NTSC) telecined to 30 fps, with every second film frame held for an extra field. No one ever noticed the stutter. (PAL just shot the film at 25 fps, or sped the 24 fps content up by 25/24.)
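The 24-to-30 fps telecine cadence ("2:3 pulldown") can be sketched in a few lines; film frames alternately contribute two then three interlaced fields:

```python
# 24 fps film -> ~30 fps NTSC video via 2:3 pulldown:
# each film frame alternately fills 2 or 3 interlaced fields,
# so every 4 film frames become 10 fields = 5 video frames.
def pulldown(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        fields += [frame] * (2 if i % 2 == 0 else 3)
    return fields

fields = pulldown(["A", "B", "C", "D"])
print(fields)              # 10 fields -> 5 interlaced video frames
print(24 * 10 / (4 * 2))   # field-count scaling: 24 fps film maps to 30 fps video
```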
 
Nice. Wish they would release a consumer version intended for 4K or 8K uncompressed.

There are no ports or buses on a PC that could support that; the only devices known are essentially one-offs that cost who knows what. That's 5.096 Gbit/s for 8-bit color, and I never did find out if they found anyone to make a 1x player. The proposal was to coat 2" tape with DVD-R material and run it at insane inches per second past a special laser scanner; frankly, the whole thing seemed pie in the sky to me.
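For what it's worth, that 5.096 Gbit/s figure lines up exactly with uncompressed 4K DCI frames (4096 x 2160, 8 bits per primary) at 24 fps. That correspondence is my inference from the arithmetic, not something stated in the thread:

```python
# One uncompressed 8-bit RGB frame at 4K DCI resolution, played at 24 fps.
bits_per_frame = 4096 * 2160 * 3 * 8   # 212,336,640 bits per frame
gbps = bits_per_frame * 24 / 1e9       # 24 frames per second -> Gbit/s
print(f"{gbps:.3f} Gbit/s")            # 5.096
```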

The scanning is very slow, I got to see some done in India for HD commercials.
 
"The quality of the original image has been preserved as close as possible, however due to its high resolution, the 4K system can reveal limitations of the source film."

I meant the opposite; the disclaimer should read: "The advances in technology don't necessarily reproduce the original intent of the artists involved." Frankly, serious film buffs consider almost all video to be like listening to nothing but 128K MP3s.
 
Film is art; "touching up" the cinematographer's work is no different than colorizing classics like The Maltese Falcon. I find most of what's going on in video (re: films) these days horrifying and unwatchable except as entertainment. Film also has a noise floor with somewhat of a stochastic resonance effect, which is destroyed by the noise-reduction blocking effects of compression. The 4K TV's increased color gamut (which adds unnatural colors at will) and even more DSP only makes things worse IMO.

Even rarer than audiophiles into master tapes are film buffs into actual film, usually 16mm prints (anamorphic versions do exist).


I’ll count myself as one of them.

A family member of mine is a noted cinematographer, and ~20 years ago I remember a discussion about why he continued to use all the old Panavision gear when digital was taking over the sets. Everyone at the table (family, not film folks) thought he was a bit ridiculous, perhaps nostalgic, but he said you'd have to pull those film cans from his cold, dead hands. He likened digital to adding a "second-rate cinematographer who doesn't know what the hell he is doing" (referring to the electronics). Basically: when you shoot on film, what you shoot winds up on the film; when you shoot digital, what winds up in storage is what you shot plus a "digital interpretation" of the image that "fills in blanks where there are none". Obviously the quotes are from memory, but that's my best recollection.

He went through all kinds of trouble to get those cameras, as the number of them, and of people able to maintain and repair them, is dwindling close to extinction. Not to mention the projectors and projectionists who still know how to run them, whose numbers are even lower.

I saw “The Master” when it was toured with the 70mm print and equipment and I will stand by the statement that it was the best image I’ve seen ever produced in any format on any medium. It was truly stunning. The sound was also exceptional. (The film itself is nothing short of excellent as well, if you haven’t seen it I suggest checking it out.)

Don’t throw the baby out with the bath water, it’s not all analog fetishism, nostalgia or magical thinking. Sorting that all out requires a level of expertise and professionalism that few have, and so reductionism and drawing erroneous logical connections tends to prevail.

What’s All the Fuss About The Master and 70mm? -- Vulture
 