BBC considering HDTV rollout

Archived from groups: alt.video.digital-tv

Roderick Stewart wrote:
>
>
>
> I don't understand how you are deriving this. Interlace has been used since the
> birth of television because it can achieve the appearance of twice as many
> lines - either twice the vertical resolution or twice the vertical refresh
> rate, without requiring twice the bandwidth to transmit the signal. Or, in other
> words, interlace would require half the bandwidth of an equivalent
> non-interlaced signal.

Interlace cannot achieve twice the resolution. If you have a source
that actually uses that resolution, you of course get flicker at
half the frame rate - 25 or 30 Hz - and what the viewer sees is
mostly horrible flicker.

If there is vertical motion you get even worse "aliasing artifacts"
too.
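
To make that concrete, here's a tiny numpy sketch (purely illustrative,
not anybody's real signal chain) of one-line-high detail in a 50i frame:

import numpy as np

# A 576-line frame containing a single one-line-high bright feature.
frame = np.zeros((576, 720), dtype=np.uint8)
frame[100, :] = 255            # fine horizontal detail on line 100

top_field = frame[0::2, :]     # lines 0, 2, 4, ... go out in one field
bottom_field = frame[1::2, :]  # lines 1, 3, 5, ... go out in the other

print(top_field.max(), bottom_field.max())   # 255 0
# The detail lives in only one of the two fields, so on a 50i display it
# is repainted once per frame (25 Hz), not once per field (50 Hz).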


>
> 625/50i (i.e. standard European broadcast signal) has 287.5 picture lines in
> each vertical scan, but the appearance of 575 picture lines because it is
> interlaced.


Yes, it has that appearance if you have a static picture with no
vertical detail at more than half the number of lines (i.e. a Kell
factor of 0.5).

For static pictures, of course, a proper de-interlacer can fix things
up. But it can't fix things up if there is vertical motion and the
Kell factor is over 0.5.
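
By "fix things up" I mean something along the lines of a weave
de-interlacer: re-interleaving the two fields is exact for a static
scene, and line doubling (bob) is the crude fallback when it isn't.
A rough sketch of the idea, nothing more:

import numpy as np

def weave(top, bottom):
    # Re-interleave the two fields: exact for static content, combs on motion.
    frame = np.empty((top.shape[0] * 2, top.shape[1]), dtype=top.dtype)
    frame[0::2] = top
    frame[1::2] = bottom
    return frame

def bob(field):
    # Line-double a single field: no combing, but only half the vertical detail.
    return np.repeat(field, 2, axis=0)

# Static test: split a frame into fields and weave them back - lossless.
frame = np.random.randint(0, 256, (576, 720), dtype=np.uint8)
assert np.array_equal(weave(frame[0::2], frame[1::2]), frame)
doubled = bob(frame[0::2])     # 576 lines again, but vertically soft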

Also, one notes, interlaced MPEG has 1/4 the vertical color
resolution of the un-MPEGed source, rather than 1/2.
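
One way to read that arithmetic, assuming a 576-line 4:2:0 signal coded
as fields (my reading, not a spec quote):

luma_lines = 576                                 # active lines per frame
chroma_lines_420 = luma_lines // 2               # 4:2:0 frame: 288 chroma lines
chroma_lines_per_field = chroma_lines_420 // 2   # field coding: 144 per field
print(chroma_lines_per_field / luma_lines)       # 0.25 - a quarter of the luma lines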

I have a progressive TV (720p) and watch mostly either real 720p
stuff, side-converted 1080i stuff, or 24p converted to 720p. When I
do watch 480i material, even upconverted by the very excellent
Fox upconvertors, I am really bothered by the artifacts. The
CCIR has these "grade" things for video, rating by "irritation
factor". Interlaced basketball gets the lowest possible rating -
extremely irritating - no matter what the source or display
resolution, with two exceptions: 1080i original downconverted to
480p and shown on a 480p display, which looks fine but fuzzy, as it
should, and 1080i on a 720p display, which is still fairly
irritating.

Of course, perhaps the reason Europeans in general seem not
to mind artifacts is that they tend to view exceedingly flickery
stuff (50 Hz) on rather small TVs. I surveyed the average
size of my friends' main TVs, and it is 50 inches. Do large numbers
of Europeans have 50 inch main TVs (i.e. the biggest TV in the house)?

Doug McDonald
 
In article <c7e22r$ffk$1@news.ks.uiuc.edu>,
Doug McDonald <mcdonald@scs.uiuc.edu> writes:
>
> The question I ask is: do you see artifacts? And the test
> is one where sharp (as sharp as the Kell factor chosen allows,
> of course) diagonal lines move either 1/2 or 1 line vertically
> each field. To get rid of artifacts, you have to filter down to
> where the interlace system has little more resolution than half the
> progressive one ... 55 to 60%.
>
You are (appropriately for 1955) assuming a fixed filter. There are
few fixed filter functions in video today (including especially MPEG2,
whose detail reduction can actually INCREASE aliasing). Also, the
vertical filtering CAN take advantage of detail in both fields
(unlike in the olden days, when the best place to do the vertical
filtering was in the TV camera tube, and the damage was rather
extreme). A simple scheme couldn't easily vary the size of the
scanning spot, but today it is easy to do virtually.
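
A minimal sketch of the kind of adaptive behaviour I mean - not any
particular product's algorithm, just the idea of softening vertically
only where the two fields disagree:

import numpy as np

def adaptive_vertical_filter(top, bottom, threshold=16):
    # Weave the fields, then blend lines with their neighbours only where a
    # crude inter-field difference suggests motion or twitter; static detail
    # is left untouched.
    woven = np.empty((top.shape[0] * 2, top.shape[1]), dtype=np.float32)
    woven[0::2], woven[1::2] = top, bottom
    diff = np.abs(top.astype(np.float32) - bottom.astype(np.float32))
    moving = np.repeat(diff > threshold, 2, axis=0)
    padded = np.pad(woven, ((1, 1), (0, 0)), mode="edge")
    softened = (padded[:-2] + 2 * padded[1:-1] + padded[2:]) / 4   # [1 2 1] low-pass
    return np.where(moving, softened, woven)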

In reality, the loss of vertical resolution can often be mitigated to
a very large extent, with interlace flicker suppressed only where it
actually becomes apparent (at the cost of detail in those areas).
If you have that much vertical detail, you'd likely suffer severe
uglification from stairstepping anyway :). There are some cases with
true horizontal lines (rather than nearly diagonal ones), but that is
the obvious case of computer-generated material. One big
disadvantage of interlace is that the stairstepping (aliasing) jumps
around and calls attention to itself. It is true that aliasing
and interlace don't mix well. It is also true that there are cases
where interlace flicker reduction will reduce detail, but it isn't as
bad as it used to be.

I suspect that the major limitation in vertical resolution nowadays
would be more related to the necessary detail reduction for MPEG2
encoding and fitting into the available payload capability.

A good (easy to demo) example would be a very high quality NTSC
camera (in component mode) feeding a DV25 encoder that doesn't
explicitly roll off the vertical detail when too much detail is captured.
The amount of stairstepping can be extreme, yet even if the signal
were progressive it would persist, and lots of vertical filtering
would still be necessary. Designing a camera for DV25 isn't necessarily
optimal for full component (or DV50 or DigiBeta or even BetaSP),
since those other formats can deal with extreme vertical detail.

So, there are often limitations on vertical detail beyond interlace
flicker reduction. I am not trying to claim that interlace
flicker can be eliminated without detail loss, but it isn't quite as
bad as it was when the flicker reduction was done by using
a large Vidicon (or equivalent)/Image Orthicon spot size.

John
 
Roderick Stewart wrote:
> In article <c7dhg1$932$1@news.ks.uiuc.edu>, Doug McDonald wrote:
>>> Yep - but the movement of the players is much more fluid at 50i/60i
>>> than at the equivalent 25p/30p - I can't imagine watching live
>>> sport at 25p/30p. You can't compare oranges with apples. Sure
>>> 50p/60p is better - but takes significantly more transmission
>>> bandwidth than the same resolution 50i/60i system...
>>
>> No, it takes LESS bandwidth than a SAME RESOLUTION 50i/60i system ...
>> because the interlace system needs twice as many - well, maybe 1.7 or
>> 1.8 times as many - vertical lines as the noninterlaced system,
>> because the interlaced system must be heavily vertically filtered to
>> get rid of artifacts; also, if the interlaced system is lightly
>> filtered and allows artifacts, it is still hard to encode because
>> MPEG is simply poor at encoding interlace.
>
> I don't understand how you are deriving this. Interlace has been used
> since the birth of television because it can achieve the appearance
> of twice as many lines - either twice the vertical resolution or
> twice the vertical refresh rate, without requiring twice the
> bandwidth to transmit the signal. Or, in other words, interlace would
> require half the bandwidth of an equivalent non-interlaced signal.

Doug is disputing the Kell Factor value used by most broadcasters and
broadcast engineers.

It is true that an interlaced system requires a degree of pre-filtering to
reduce the level of interline twitter (HF vertical detail flickering at 25Hz
in the case of 50i) - which reduces the effective vertical resolution
somewhat compared to a similar progressive system. However, his 55-60%
figures are massively lower than those I have seen anywhere else - and,
based on my own anecdotal experience, I don't accept them as typical.

What Doug seems to be saying is that ANY 25Hz artefacts are unacceptable -
rather than there being an acceptable level at which the resolution benefits
meet a trade-off with the level of twitter.

Sure, if you want no 25Hz artefacts at all then you might as well go to a
horridly soft 288-line progressive system. However, if you accept some
low-level 25Hz twitter, reduced by a degree of vertical pre-filtering, as
we do in the interlaced world - then the resolution of the interlaced
system is higher than Doug quotes, and there is no doubt you get a sharper
picture than a progressive system at the same line and frame/field rate,
or better temporal resolution for the same line rate and vertical
resolution.
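
To put rough numbers on where we differ (the factors below are
illustrative choices, not measurements):

active_lines = 576
kell = 0.7              # the sort of figure usually quoted for a CRT display
interlace_factor = 0.75 # additional loss attributed to interlace; Doug puts
                        # the combined figure nearer 0.55-0.6

progressive_576 = active_lines * kell                    # ~403 lines at 25 fps
interlaced_576 = active_lines * kell * interlace_factor  # ~302 lines at 50 fields/s
progressive_288 = (active_lines // 2) * kell             # ~202 lines at 50 fps
print(progressive_576, interlaced_576, progressive_288)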

There is no disputing that if bandwidth and line rates are no object,
progressive at a given sampling structure is better than interlaced - but in
the real world there is a reason interlaced was universally adopted - and
continues to be adopted even at HD levels.

[snip]

Steve
 
Stephen Neal wrote:

>
> Doug is disputing the Kell Factor value used by most broadcasters and
> broadcast engineers.
>
> It is true that an interlaced system requires a degree of pre-filtering to
> reduce the level of interline twitter (HF vertical detail flickering at 25Hz
> in the case of 50i) - which reduces the effective vertical resolution
> somewhat compared to a similar progressive system. However, his 55-60%
> figures are massively lower than those I have seen anywhere else - and,
> based on my own anecdotal experience, I don't accept them as typical.

You don't need a 55-60% Kell factor to get rid of interline twitter
... you need it to get rid of motion artifacts.



>
> What Doug seems to be saying is that ANY 25Hz artefacts are unacceptable -
> rather than there being an acceptable level at which the resolution benefits
> meet a trade-off with the level of twitter.

Yes, that is what I am saying. This of course depends on the meaning
of the word "are". I am not saying that ANY interlace artifacts
at all WERE unacceptable in 1948. They ARE unacceptable in a
digital world with 65 inch TV sets.
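
Some back-of-the-envelope geometry behind the big-screen point (my own
assumed numbers: a 65 inch 16:9 set showing 576 lines, viewed from 2.5 m):

import math

diagonal_m = 65 * 0.0254
height_m = diagonal_m * 9 / math.hypot(16, 9)    # screen height for 16:9
line_pitch_m = height_m / 576                    # one SD scan line
arcmin = math.degrees(math.atan2(line_pitch_m, 2.5)) * 60
print(round(arcmin, 1))   # ~1.9 arcmin per line, about twice the ~1 arcmin
                          # acuity limit, so line-rate artifacts are easily seen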

Doug McDonald
 
Doug McDonald wrote:
> Stephen Neal wrote:
>> Doug McDonald wrote:
>>
>>> Stephen Neal wrote:
>>>
>>>
>>> No, it takes LESS bandwidth than a SAME RESOLUTION 50i/60i system
>>> ... because the interlace system needs twice as many - well, maybe
>>> 1.7 or 1.8 times as many - vertical lines as the noninterlaced
>>> system, because the interlaced system must be heavily vertically
>>> filtered to get rid of artifacts; also, if the interlaced system is
>>> lightly filtered and allows artifacts, it is still hard to encode
>>> because MPEG is simply poor at encoding interlace.
>>
>>
>> Are you sure of your figures?
>>
>
> Yes
>
>> Isn't the Kell Factor for an interlaced CRT based system normally
>> considered to be around 0.75? Which means an interlaced system has
>> approx 75% the resolution of a progressive system running at the
>> same number of vertical lines. This would equate to 1.33 times as
>> many lines required for an interlaced system to match the vertical
>> resolution of a progressive system.
>>
>> You suggest the Kell factor is nearer 55-60% - meaning an interlaced
>> system has only a-little-over-half the resolution of a progressive
>> system? This really implies there is only a tiny resolution
>> increase when comparing 576i and 288p - which is patently not the
>> case. I've not come across anywhere near such a low figure for a
>> Kell Factor anywhere else - though am happy to be pointed in the
>> direction of anything that confirms this.
>
> The Kell factors you note are, like so many numbers quoted by
> broadcast engineers, based on research that had certain
> pre-conceived notions of what was wanted. The results they got
> from their tests satisfied what they knew they could get.

So it is your view that they are wrong?

>
> The question I ask is: do you see artifacts? And the test
> is one where sharp (as sharp as the Kell factor chosen allows,
> of course) diagonal lines move either 1/2 or 1 line vertically
> each field. To get rid of artifacts, you have to filter down to
> where the interlace system has little more resolution than half the
> progressive one ... 55 to 60%.
>>

You are using a single specific situation - one with a very particular
kind of movement - to rate the performance of a real-world system that
is expected to handle many different types of picture... Isn't this a bit
ridiculous?

I am happy to accept that an interlaced system has a very reduced vertical
resolution ON CERTAIN PICTURE CONTENT - but not that this reduction is the
same across the board for all content. There is a trade-off between
acceptable levels of artefacts and the major resolution improvement for a
given line/pixel/sample rate that interlaced scanning can provide.

If line rate is not an issue then progressive will always outperform
interlaced in absolute terms - but line rate and bandwidth are always
limitations in both engineering and transmission terms.


>>> There is zero reason for interlace to exist as a taking and
>>> transmission format in a digital world.
>>>
>>
>>
>> Whilst a progressive system would be nice - I see the huge benefits
>> of progressive production now that progressive displays are becoming
>> more prevalent - I also accept that interlaced systems DO work, and
>> if it is a choice of 576/25p and 576/50i the temporal benefits of
>> 50i are greater than the resolution improvements of 25p.
>
> You are not including the artifact un-improvements.
>

If a 25p image is carried as a 50i image with some vertical filtering, then
the only artefact introduced when it is converted back to 25p is the
vertical softening. If this vertical filtering were performed in the 50i
receiver instead - effectively allowing a 25p image to be flagged as
progressive and a 50i one as interlaced - we'd have a great system.
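
A toy sketch of that receiver-side choice (illustrative only - the flag
and the crude de-interlace branch are just stand-ins):

import numpy as np

def rebuild(field_1, field_2, progressive_flag):
    if progressive_flag:   # 25p carried in 50i: weave the fields back losslessly
        frame = np.empty((field_1.shape[0] * 2, field_1.shape[1]), field_1.dtype)
        frame[0::2], frame[1::2] = field_1, field_2
        return frame
    return np.repeat(field_1, 2, axis=0)   # stand-in for a real de-interlacer

frame_25p = np.random.randint(0, 256, (576, 720), dtype=np.uint8)
assert np.array_equal(rebuild(frame_25p[0::2], frame_25p[1::2], True), frame_25p)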

Steve
 
In article <c7emnn$n9d$1@news.ks.uiuc.edu>, Doug McDonald wrote:
> Interlace cannot achieve twice the resolution. If you have a source
> that actually uses that resolution, you of course get flicker at
> half the frame rate - 25 or 30 Hz - and what the viewer sees is
> mostly horrible flicker.

I said it can achieve the *appearance* of more lines on the screen for
the same transmission bandwidth, and this usually looks better than a
visible line structure, which is why it's been used by all broadcasters
for nearly seventy years. Yes, there will be interfield twitter for
some fine horizontal detail, but in typical real pictures this is not
obvious all the time, any more than the effects of MPEG encoding, for
example.
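
The raw line-rate numbers behind that trade-off, before any compression:

lines_per_frame = 625
field_rate = 50

interlaced = lines_per_frame * field_rate / 2            # 625/50i: 15,625 lines/s
progressive_full = lines_per_frame * field_rate          # 625/50p: 31,250 lines/s
progressive_half = (lines_per_frame // 2) * field_rate   # ~312/50p: 15,600 lines/s
print(interlaced, progressive_full, progressive_half)
# 625/50i costs roughly what a 312-line progressive system would, yet paints
# 625 line positions on the screen - the "appearance" of more lines.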

Of course there will always be particular types of image that show up
particular deficiencies of the system, but we cannot afford the luxury
of a television system in which *all* the deficiencies have been
reduced beyond human perception for all viewing conditions and all
types of picture content. The limiting factor, as usual, is money,
which determines, among other things, how much bandwidth is available
to record and/or transmit the signal. We can't have a theoretically
"perfect" system in the real world, but must make the best use of
available resources to make the deficiencies imperceptible to most of
the people most of the time.

Rod.
 
Roderick Stewart wrote:
>
>
> Of course there will always be particular types of image that show up
> particular deficiencies of the system, but we cannot afford the luxury
> of a television system in which *all* the deficiencies have been
> reduced beyond human perception for all viewing conditions and all
> types of picture content.

Let's not harp on the "we", white man!

While your statement is of course quite true ... we simply cannot
afford 1080x1920@600p ... we in the US have at least two formats
that are exceedingly good and have no significant artifacts
except the "wheels going backwards" one. Both 720@60p and
1080@30p are truly excellent, artifact-free formats, best used,
perhaps, for different purposes, but both excellent.
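
For comparison, the raw active-pixel rates of the two formats:

rate_720p60 = 1280 * 720 * 60     # ~55 million pixels/s
rate_1080p30 = 1920 * 1080 * 30   # ~62 million pixels/s - a comparable raw load
print(rate_720p60, rate_1080p30)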

I have a 720p set and my local ABC station transmits 720p. The
results are excellent, even for basketball. Unfortunately they
are not really excellent for hockey, because 60Hz is just not
enough for following the puck. Hockey really needs 1080@600p,
though I suspect 1080@1200p would be better. However, this is
likely to be moot soon, since hockey is so poor on TV that it is
going to simply fade out of existence. The major sports - football,
baseball, and basketball - all look great at 720p.

Doug McDonald
 
Doug McDonald wrote:
> Stephen Neal wrote:
>>
>> It is true that an interlaced system requires a degree of
>> pre-filtering to reduce the level of interline twitter (HF vertical
>> detail flickering at 25Hz in the case of 50i) - which reduces the
>> effective vertical resolution somewhat compared to a similar
>> progressive system. However, his 55-60% figures are massively lower
>> than those I have seen anywhere else - and, based on my own
>> anecdotal experience, I don't accept them as typical.
>
> You don't need a 55-60% Kell factor to get rid of interline twitter
> ... you need it to get rid of motion artifacts.
>

I should have been clearer - there is a point in an interlaced system where
vertical motion artefacts and interline twitter merge, as the interlacing
structure mixes the temporal and spatial domains at certain
motion / detail levels. However, this is not the case for all motion or
detail - and the motion / spatial artefacts are, IMHO, much less of a
problem than the reduced spatial resolution of a progressive system running
at the same line and field rate, or the reduced temporal resolution of a
progressive system running at the same line rate and number of vertical
lines (so half the field/frame rate).

>> What Doug seems to be saying is that ANY 25Hz artefacts are
>> unacceptable - rather than there being an acceptable level at which
>> the resolution benefits meet a trade-off with the level of twitter.
>
> Yes, that is what I am saying. This of course depends on the meaning
> of the word "are". I am not saying that ANY interlace artifacts
> at all WERE unacceptable in 1948. They ARE unacceptable in a
> digital world with 65 inch TV sets.

1936 would be a better start date - that's when the BBC launched the first
regular "high definition" TV service using the Marconi/EMI 405/50i system -
which was the first regularly used interlaced system.

Sure, interlaced artefacts are not preferable - and if progressive systems
can offer the same spatial resolution with fewer artefacts (rather than
increased compression artefacts), then they should be adopted. However, I
don't believe this means the benefits of interlaced scanning when it comes
to bandwidth reduction and reduced scanning rates have to be ignored.

Steve
 
Stephen Neal wrote:
>
> Sure, interlaced artefacts are not preferable - and if progressive systems
> can offer the same spatial resolution with fewer artefacts (rather than
> increased compression artefacts), then they should be adopted. However, I
> don't believe this means the benefits of interlaced scanning when it comes
> to bandwidth reduction and reduced scanning rates have to be ignored.


You are still thinking of "scanning". TV should NOT BE SCANNED.

The images should be obtained as snapshots of the whole field at
once, and displayed that way. The vertical and horizontal
directions should be fully equivalent. This is impossible with
interlace (well, you could do horizontal interlace too, as the
Japanese once tried with analog HDTV). Of course, this is perfectly
feasible with modern digital cameras, transmission, and displays.
True-pixel displays do scan, but this can be done at a very fast
rate, or one could use (and some systems may) instantaneous frame
transfer.
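
A toy illustration of the difference - nothing to do with any real
sensor, just the geometry of sampling a moving bar line by line versus
all at once:

import numpy as np

height, width, speed = 64, 256, 2    # bar moves 2 pixels per line-time

def snapshot(t0):
    # Whole image captured at one instant: the bar stays vertical.
    img = np.zeros((height, width), dtype=np.uint8)
    img[:, t0 * speed : t0 * speed + 4] = 255
    return img

def scanned(t0):
    # Line-sequential capture: each row sees the bar a little later, so it skews.
    img = np.zeros((height, width), dtype=np.uint8)
    for row in range(height):
        x = (t0 + row) * speed
        img[row, x : x + 4] = 255
    return img

print(np.argmax(snapshot(0)[0]), np.argmax(snapshot(0)[-1]))   # 0 0   - upright
print(np.argmax(scanned(0)[0]), np.argmax(scanned(0)[-1]))     # 0 126 - skewed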

Doug McDonald
 
Greetings,

On Thu, 06 May 2004 19:54:14 -0500, Doug McDonald wrote:
> Interlace cannot achieve twice the resolution. If you have a source
> that actually uses that resolution, you of course get flicker at
> half the frame rate - 25 or 30 Hz - and what the viewer sees is
> mostly horrible flicker.

I'm sorry, but based on the sales of actual TV sets, I don't think a lot of
people actually "see" this flicker.

I for one do not see the difference between 50 Hz and 100 Hz (not under
normal conditions); and if flicker were a real problem, sales of
100 Hz sets would be much higher. They aren't.

For all practical purposes, flicker is a non-issue!


> Of course, perhaps the reason Europeans in general seem not
> to mind artifacts is that they tend to view exceedingly flickery
> stuff (50 Hz) on rather small TVs. I surveyed the average
> size of my friends' main TVs, and it is 50 inches. Do large numbers
> of Europeans have 50 inch main TVs (i.e. the biggest TV in the house)?

Well, the power supply of my TV set gave its last breath last week, so
I've been doing some "window-shopping" lately.

If you look at the sets on sale now (for the living room), most of them
are 70 cm CRTs, either 4/3 or 16/9.

Prices go from 250 to 350 euro (for 4/3), 400 to 600 euro (for a 16/9 set)
or from 650 upwards for a 100 Hz set.

Of course, there are more expensive and cheaper models too, but I
wouldn't call them "mainstream".


> Doug McDonald
Cheerio! Kr. Bonne.
--
Kristoff Bonne, Bredene, BEL
H323 VoIP: callto://krbonne.homelinux.net/ [nl] [fr] [en] [de]
 

Ivan

Kristoff Bonne <compaqnet.be@kristoff.bonne> wrote in message
news:pan.2004.05.07.15.52.58.494415@kristoff.bonne...
> Greetings,
>
> On Thu, 06 May 2004 19:54:14 -0500, Doug McDonald wrote:
> > Interlace cannot achieve twice the resolution. If you have a source
> > that actually uses that resolution, you of course get flicker at
> > half the frame rate - 25 or 30 Hz - and what the viewer sees is
> > mostly horrible flicker.
>
> I'm sorry, but based on the sales of actual TV sets, I don't think a lot of
> people actually "see" this flicker.
>
> I for one do not see the difference between 50 Hz and 100 Hz (not under
> normal conditions); and if flicker were a real problem, sales of
> 100 Hz sets would be much higher. They aren't.
>
> For all practical purposes, flicker is a non-issue!
>
>
Compared to the rest of the world, 60Hz is in the minority anyway. Even
leaving the rest of the 50Hz world out of it, I would hazard a guess that
there are more people in China alone viewing 50/625 receivers than in all
the countries that use 60/525 combined. Also, there are (at the last figure
I heard quoted) around three-quarters of a million Americans residing here
in the UK who don't appear to find it a problem.
 
In article <c7g8qi$78d$1@news.ks.uiuc.edu>, Doug McDonald wrote:
> While your statement is of course quite true ... we simply cannot
> afford 1080x1920@600p ... we in the US have at least two formats
> that are exceedingly good and have no significant artifacts
> except the "wheels going backwards" one. Both 720@60p and
> 1080@30p are truly excellent, artifact-free formats, best used,
> perhaps, for different purposes, but both excellent.

Don't you regard wheels going backwards as an important artifact then?

Rod.
 
Roderick Stewart wrote:

> In article <c7g8qi$78d$1@news.ks.uiuc.edu>, Doug McDonald wrote:
>
>>While your statement is of course quite true ... we simply cannot
>>afford 1080x1920@600p ... we in the US have at least two formats
>>that are exceedingly good and have no significant artifacts
>>except the "wheels going backwards" one. Both 720@60p and
>>1080@30p are truly excellent, artifact-free formats, best used,
>>perhaps, for different purposes, but both excellent.
>
>
> Don't you regard wheels going backwards as an important artifact then?


Of course it's a significant artifact. That's why I said what I
did. Read it.

Doug McDonald
 
"Doug McDonald" <mcdonald@scs.uiuc.edu> wrote in message
news:c7g95n$7e2$1@news.ks.uiuc.edu...
> Stephen Neal wrote:
> >
> > Sure, interlaced artefacts are not preferable - and if progressive
> > systems can offer the same spatial resolution with fewer artefacts
> > (rather than increased compression artefacts), then they should be
> > adopted. However, I don't believe this means the benefits of
> > interlaced scanning when it comes to bandwidth reduction and reduced
> > scanning rates have to be ignored.
>
>
> You are still thinking of "scanning". TV should NOT BE SCANNED.
>

Actually, I don't think of TV as "scanning" - however, I describe CRT
display in those terms because that is how it works.

I'm perfectly aware that modern display techniques (and modern capture
systems) don't employ a scanning spot system - however, direct-view CRT
systems are still in the majority - and like it or not they use scanning for
display. (And most CRT TVs still use interlaced scanning.)

> The images should be obtained as snapshots of the whole field at
> once, and displayed that way.

Yep - the whole field or frame integrated (ideally) across the whole
field/frame period (or as much as is practical) - rather than shuttered for
a shorter period which introduces more temporal artefacts.
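
Rough numbers for why a short shutter reads as "steppier" (the motion
figure is invented purely for illustration):

field_rate = 50.0    # fields per second
velocity = 500.0     # object motion in pixels per second (made-up figure)

step_between_fields = velocity / field_rate       # 10 px jump, whatever the shutter
blur_full_period = velocity * (1 / field_rate)    # 10 px smear: motion looks continuous
blur_short_shutter = velocity * (1 / 200)         # 2.5 px smear: sharper, but the
                                                  # 10 px jumps now read as strobing
print(step_between_fields, blur_full_period, blur_short_shutter)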

> The vertical and horizontal
> directions should be fully equivalent.

In angular resolution ideally.

> This is impossible with
> interlace (well, you could do horizontal interlace too, as the
> Japanese once tried with analog HDTV).

Yep - though capture and transmission/display don't have to be entirely
linked.

> Of course, this is perfectly feasible
> with modern digital cameras, transmission, and displays. True-pixel
> displays do scan, but this can be done at a very fast rate, or one
> could use (and some systems may) instantaneous frame transfer.
>

Yep - though the bandwidth-saving arguments for interlaced scanning -
vertical resolution dynamically traded for temporal resolution, etc. - still hold.

Steve