Archived from groups: alt.video.digital-tv
Roderick Stewart wrote:
>
>
>
> I don't understand how you are deriving this. Interlace has been used since the
> birth of television because it can achieve the appearance of twice as many
> lines - either twice the vertical resolution or twice the vertical refresh
> rate, without requiring twice the bandwidth to transmit the signal. or in other
> words, interlace would require half the bandwidth of an equivalent
> non-interlaced signal.
Interlace cannot achieve twice the resolution. If you have a source
that actually uses that full resolution, you get flicker at the
frame rate, which is half the field rate: 25 or 30 Hz, and what the
viewer sees is mostly horrible flicker.
If there is vertical motion you get even worse "aliasing artifacts"
too.
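The arithmetic behind that flicker figure can be sketched in a few lines. This is my own back-of-the-envelope illustration, not anything from the post: the assumption is that a detail one scan line tall lives in only one of the two fields, so it repaints once per frame rather than once per field.

```python
# Single-field detail in an interlaced picture refreshes at the frame
# rate, which is half the field rate.

field_rate_europe = 50.0   # fields per second (625/50i)
field_rate_us = 60.0       # fields per second (525/60i, nominal)

flicker_europe = field_rate_europe / 2   # frames per second
flicker_us = field_rate_us / 2

print(flicker_europe)   # 25.0 Hz
print(flicker_us)       # 30.0 Hz
```

So fine vertical detail that falls in a single field flickers at 25 Hz (Europe) or 30 Hz (US), well below the rate the eye tolerates on a bright display.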
>
> 625/50i (i.e. standard European broadcast signal) has 287.5 picture lines in
> each vertical scan, but the appearance of 575 picture lines because it is
> interlaced.
Yes, it has that appearance if you have a static picture with no
vertical detail at more than half the number of lines (i.e. a Kell
factor of 0.5).
For static pictures, of course, a proper de-interlacer can fix things
up. But it can't fix things up if there is vertical motion and the
Kell factor is over 0.5.
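Putting numbers on the 625/50i case discussed above: the figures below come straight from the quoted line counts, and the "resolvable lines" formula (active lines times Kell factor) is my own simplification for illustration.

```python
# 625/50i line-count arithmetic with a Kell factor applied.

lines_per_field = 287.5        # picture lines in each vertical scan
fields_per_frame = 2
frame_lines = lines_per_field * fields_per_frame   # apparent lines

kell_factor = 0.5              # the value cited in the post
resolvable = frame_lines * kell_factor

print(frame_lines)   # 575.0 apparent picture lines
print(resolvable)    # 287.5 effectively resolvable lines
```

In other words, at a Kell factor of 0.5 the 575 apparent lines carry no more usable vertical detail than a single field.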
Also, one notes, interlaced MPEG has 1/4 the vertical color
resolution of un-MPEGed stuff, rather than 1/2.
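One way to see where a factor of 1/4 rather than 1/2 could come from: this sketch assumes (my reading, not stated in the post) that "interlaced MPEG" means 4:2:0 chroma subsampling applied to the coded frame, with the chroma lines then split between the two fields on display.

```python
# Hedged illustration of vertical chroma resolution under interlaced
# 4:2:0 coding. Assumes chroma is halved vertically per frame (4:2:0)
# and then divided between the two fields.

frame_height = 576                     # luma lines in a 625/50i frame
chroma_lines = frame_height // 2       # 4:2:0 halves chroma vertically
chroma_per_field = chroma_lines // 2   # each field carries half of those

print(chroma_lines)      # 288 chroma lines per frame
print(chroma_per_field)  # 144 chroma lines per field
```

144 chroma lines against 576 luma lines is the 1/4 ratio the post mentions, versus 1/2 for progressive 4:2:0.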
I have a progressive TV (720p) and watch mostly either real 720p
stuff, side-converted 1080i stuff, or 24p converted to 720p. When I
do watch 480i material, even upconverted by the very excellent
Fox upconverters, I am really bothered by the artifacts. The
CCIR has these "grade" things for video, rating by "irritation
factor". Interlaced basketball has the lowest possible rating,
extremely irritating, no matter what resolution the source or
what resolution the display, except for the two cases of
1080i original downconverted to 480p shown on a 480p display,
which looks fine but fuzzy, as it should, and 1080i on a 720p
display, which is still fairly irritating.
Of course, perhaps the reason Europeans in general seem not
to mind artifacts is that they tend to view exceedingly flickery
stuff (50 Hz) on rather small TVs. I surveyed the average
size of my friends' main TVs, and it is 50 inches. Do large numbers
of Europeans have 50 inch main TVs (i.e. biggest TV in the house)?
Doug McDonald