I watched a few minutes of "Battle of Britain" (1969?) last night. The digital processing made this look like it was shot on video tape, very creepy in fact. If you're into the film/video difference you know what I mean as to noise, gamma and light/shadow effects. Several other titles had the same effect while other channels looked normal (it is my first week with a new Samsung LED backlit flat panel). The old noise (stochastic resonance?) and feel of film was gone.
On this matter I will whole-heartedly say digital sucks.
No Scott
For an old film converted to digital and heavily DSP image processed, what comes out, and whether you like or dislike it, is solely the result of artistic production decisions.
Be sure that technically, it could have as much unsharpness, haziness, "film scratch", frame rate unsteadiness etc. as you could stand. Massive batch DSP image processing/enhancement has very powerful tools.
Stochastic signal quantization has been with pattern recognition since the beginning of artificial vision development, where all the DSP for image manipulation has sprung from.
At the front end, ADCs and digital image capture devices (flat panel detectors (FPDs) and high-density line-scan solid state detectors) are very well developed now. Detectors especially are really much better than 5 years ago and ages ahead of what they were 10-15 years ago.
Consider also the possible image alteration a hypothetical well-balanced DVD undergoes when it is transmitted from a station through cable.
Now, you see it on a LED screen. Try to watch it on a good CRT and compare.
George
I think Scott is complaining that it is too 'good', so looks unreal.
The likely explanation is that the person twiddling the knobs is a youngster who has grown up with video, so he will want to make everything look like video (or even computer graphics!). Except, of course, when he wants to make something look ancient when he will grossly overdo the 'film' effects so that 1960s looks like 1920s.
Scott,
Do you have the HDMI output from your DVD player set to 1080p? The higher the resolution the more "film-like" the picture will be.
John
It was Comcast and I forgot what resolution it locked in at. It's funny there are some cinematographers that think video is cold and unengaging and process the other way. Maybe rather than digital sucks, it's the folks turning the knobs.
I think a lot lies in the video vs film gamma, actually a very complex subject. The pro video cameras have several "film" modes so this is clearly important to the pros.
http://www.c-f-systems.com/Docs/GammaCFS-242.pdf
Most likely down to the bit rate of the transmission. All that "filmy" stuff is very hard to compress, so with lower bandwidth requirements to get as many channels on air at once to maximise revenue it's the first to go.
I know but the usual artifacts (like a gray sky having only 3 or 4 large regions) were not that obvious.
If it was via Comcast on an HD feed it would be 720p (like all cable/sat HD channels anywhere).
Shaw isn't even that, shipping mostly 1080i down the "pipe"
dave
This, I understand, is one of the most popular, if not the most popular, high-end capture tools.
ARRI Group: News
dave