Should I use 720p or 1080i?

Guest
Archived from groups: alt.tv.tech.hdtv (More info?)

I am going to be picking up my HD TiVo today and am curious whether it's
best to use 720p or 1080i for HDTVs. 1080 sounds far better, but then
again, is 720 more like 1440, or is simply having progressive better than
interlaced? Or is it simply a matter of opinion? I figured I'd stick
with one in my menu, since I'll be using 480i on another TV, and I just
wonder which to go with. I'm not sure if it depends on what you're
watching or the TV you have, but as for the TVs, I'll be using this
with two TVs:
Sony KF-42WE610
Sony KF-50XBR800
 

curmudgeon

Why are you asking us? Are your eyes "broken"?
Believe it or not, your opinion counts more than anyone's here.

"MarkW" <markwco(removenospam)@comcast.net> wrote in message
news:6oli90d6taheg95i05pu5fvlmqbvbocm47@4ax.com...
> I am going to be picking up my HD TIVO today and am curious if it's
> best to use 720p or 1080 for HDTVs? [snip]
 
Guest

On Wed, 05 May 2004 21:01:50 GMT, MarkW <markwco(removenospam)@comcast.net> wrote:

>I am going to be picking up my HD TIVO today and am curious if it's
>best to use 720p or 1080 for HDTVs? [snip]

1080i is not better quality than 720p. There is a 2x factor between
"i" and "p" - aka interlaced and progressive (non-interlaced). Interlaced
only shows every other line on each pass, alternating between the even
and odd lines: higher resolution, but only half the picture at any given
moment. Progressive shows all the lines on every pass: lower resolution,
but the whole picture all the time. Thus 1080i uses the same bandwidth
as 540p. Just presenting the information in a different way. So in "i"
mode, 720p would be 1440i. In order of increasing quality, here is the
list of the most common modes:

480i (aka standard non-hd)
480p
1080i
720p
1080p

It looks like Samsung will have a 1080p rear projection DLP by Christmas.
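The raw (uncompressed) pixel rates behind this comparison can be sketched in a few lines; note this is raw sampling rate, not compressed broadcast bandwidth:

```python
# Rough raw pixel-rate comparison of the common ATSC scan formats.
# Rates are the nominal 60 Hz family (actual rates are 1000/1001
# fractions of these, which doesn't change the comparison).
MODES = {
    # name: (width, height, images_per_second, interlaced)
    "480i":  (704, 480, 60, True),
    "480p":  (704, 480, 60, False),
    "720p":  (1280, 720, 60, False),
    "1080i": (1920, 1080, 60, True),
    "1080p": (1920, 1080, 60, False),
}

def pixels_per_second(width, height, rate, interlaced):
    """Raw luma samples per second; an interlaced field carries half the lines."""
    lines = height // 2 if interlaced else height
    return width * lines * rate

for name, (w, h, r, i) in MODES.items():
    print(f"{name}: {pixels_per_second(w, h, r, i) / 1e6:.1f} Mpixels/s")
```

By this raw count, 1080i (62.2 Mpixels/s) does equal a hypothetical 1920x540 at 60p, yet it still moves more samples than 720p60 (55.3 Mpixels/s), which is where the corrections later in this thread come in.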
 
Guest

George Thorogood (thorogood@mailinator.com) wrote in alt.tv.tech.hdtv:
> Thus 1080i uses the same bandwidth
> as 540p.

This is misleading. 1080i uses more bandwidth than 720p.

> Just presenting the information in a different way. So in "i"
> mode, 720p would be 1440i.

Not even close. 1280x1440/60i would require twice the scan lines, which
would increase the effective, Kell-factored resolution to about the same as
1280x1080/30p.

The effective resolution for 1920x1080/60i is about the same as 1920x810/30p.

Also, not all 720p is the same. It can be 1280x720/24p, 1280x720/30p or
1280x720/60p. ABC and ESPN use the last one, which is mostly a waste of
time for showing movies, since the extra frame rate is basically wasted with
repeated frames. 1920x1080/60i does a better job with movie content because
you have more pixels and the amount of temporal information is low enough
that it is easily handled.

A 1920x1080/60p display using de-interlacing will look a lot better on a
source that started as a movie when given a 1920x1080/60i transmission
instead of a 1280x720/60p transmission.

--
Jeff Rife | "I feel the need...the need for
SPAM bait: | expeditious velocity"
AskDOJ@usdoj.gov | -- Brain
uce@ftc.gov |
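For what it's worth, the 810-line figure above can be reproduced with a one-line model, assuming an interlace factor of 0.75 (the exact factor is a perceptual estimate, commonly quoted anywhere from about 0.6 to 0.8, not a hard spec):

```python
# Back-of-the-envelope "effective vertical resolution" for interlaced
# scanning. The 0.75 interlace factor is an assumption for illustration.
INTERLACE_FACTOR = 0.75

def effective_lines(scan_lines, interlaced):
    """Perceived vertical resolution after interlace losses."""
    return scan_lines * INTERLACE_FACTOR if interlaced else scan_lines

print(effective_lines(1080, True))   # 1080i -> ~810 lines, per the post
print(effective_lines(720, False))   # 720p  -> 720 lines, no penalty
```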
 
Guest

On Wed, 05 May 2004 21:01:50 GMT, <markwco@comcast.net> wrote:
> I am going to be picking up my HD TIVO today and am curious if it's
> best to use 720p or 1080 for HDTVs? [snip]
> Sony KF-42WE610
> Sony KF-50XBR800

It depends on the native format of your TV. Very few displays will do both.
If your TV does 1080i, then set the TIVO to that. If it does 720p, then set
the TIVO to that.


John.
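A trivial sketch of that advice: look up each set's native scan format and feed it exactly that. The native-format entries below are placeholders for illustration only, not verified specs for these models; check the manuals.

```python
# Match the set-top box output to the display's native format, per the
# advice above. Entries are illustrative placeholders, NOT verified specs.
NATIVE_FORMAT = {
    "Sony KF-42WE610": "720p",    # placeholder -- check the manual
    "Sony KF-50XBR800": "1080i",  # placeholder -- check the manual
}

def best_output(tv_model, default="1080i"):
    """Return the output mode matching the TV's native format."""
    return NATIVE_FORMAT.get(tv_model, default)

print(best_output("Sony KF-42WE610"))
```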
 
Guest

In article <jeqi901sh5sjvoj261ontdqo63g7cqgapr@4ax.com>,
George Thorogood <thorogood@mailinator.com> writes:
> On Wed, 05 May 2004 21:01:50 GMT, MarkW <markwco(removenospam)@comcast.net> wrote:
>
>>I am going to be picking up my HD TIVO today and am curious if it's
>>best to use 720p or 1080 for HDTVs? [snip]
>
> 1080i is not better quality than 720p. There is a 2x factor between
> "i" and "p" - aka interlaced and progressive (non-interlaced).
>
No... You see the stationary detail in both phases of the interlace for
1080i. Both interlaced and progressive scanning are sampled, so both bear
the overhead of the sampling.

>
> Interlaced
> only shows every other line on each pass.
>
The items of detail don't change on every pass.

In essence, interlace is a tradeoff that trades-away temporal resolution
so as to provide better spatial resolution for a given frame scanning
structure.

When comparing 1080i vs. 720p, it is also important to NOT forget the
1280H pixels vs. the 1920H pixels. It is true that all of the detail
isn't always recorded/transmitted, but that is also true of all kinds
of detail given the MPEG2 encoding scheme.

It would be best (fairest) to claim that 720p and 1080i have essentially
the same vertical resolution for typical entertainment (non sports)
video material. On the other hand, 1080i has higher horizontal resolution
than 720p (in practice and in theory both.)

Given the above, except for sports (and especially for filmed material),
1080i is GENERALLY the best format. This is most true for film, where
information can be reconstructed very nicely to give ALMOST 1080p-type
(though not full) performance. 720p just doesn't do quite as well.

John
 
Guest

In article <c7c2i1$u6i$8@news.iquest.net>, toor@iquest.net says...

> No... You see the stationary detail in both phases of the interlace for
> 1080i.

And you get "judder" or "jitter."

> In essence, interlace is a tradeoff that trades-away temporal resolution
> so as to provide better spatial resolution for a given frame scanning
> structure.

Interlace is an obsolete, 70-year-old "technology" used to thin out the
information stream to make it fit through a limited-bandwidth pipeline.
Modern compression technology has eliminated the need for interlace.

> When comparing 1080i vs. 720p, it is also important to NOT forget the
> 1280H pixels vs. the 1920H pixels.

You keep repeating this falsehood for some reason, and I'll keep
repeating the correction. As it exists now, 1080i is 1440 by 1080
pixels. 1080i has the edge in horizontal resolution, but not by much.
720p has the edge in color resolution, in vertical resolution, and in
temporal resolution.

> Given the above, except for sports (esp for filmed material), 1080i is
> the GENERALLY best format. This is most true for film, where information
> can be reconstructed very nicely to give ALMOST 1080p type (not full, however)
> performance.

You've obviously never had to reconstruct an image shot interlaced in-
camera. De-interlacers don't work well at all. The difficulty involved
with trying to un-do interlacing is one of the reasons 480i looks really
bad on the new HDTVs when an attempt is made to convert it to
progressive. It's also one of the reasons the production industry is
converting to 24 frames-per-second progressive.
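To make the deinterlacing difficulty concrete, here is a minimal Python sketch of the two naive strategies, "weave" (combine two fields, which combs on motion) and "bob" (line-double one field, which halves vertical detail). Real deinterlacers blend these adaptively and still struggle, which is the poster's point.

```python
# A frame is modeled as a list of scan lines; a field is every other line.

def weave(top_field, bottom_field):
    """Interleave two fields into one frame. Fine for static images,
    produces combing artifacts when the fields were captured 1/60s apart."""
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame += [t, b]
    return frame

def bob(field):
    """Line-double a single field. No combing, but vertical detail
    is halved and edges can flicker between fields."""
    frame = []
    for row in field:
        frame += [row, row]
    return frame

top = ["T0", "T1"]      # original lines 0 and 2
bottom = ["B0", "B1"]   # original lines 1 and 3
print(weave(top, bottom))  # ['T0', 'B0', 'T1', 'B1']
print(bob(top))            # ['T0', 'T0', 'T1', 'T1']
```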
 
Guest

In article <jeqi901sh5sjvoj261ontdqo63g7cqgapr@4ax.com>,
thorogood@mailinator.com says...

> In order of increasing quality, here is the
> list of the most common modes:
>
> 480i (aka standard non-hd)
> 480p
> 1080i
> 720p
> 1080p

I like your list. In addition to considerations involving picture
quality, bandwidth considerations also enter into the equation.
Of all the formats, 1080i is the biggest bandwidth hog. 1080/24p and
1080/30p are much more efficient in terms of bandwidth requirements. As
you know, a progressive video stream is easier to compress than an
interlaced stream.
 

toto

Mark, the first response was the best suggestion. Try both on both TVs
and see which works best for you. I hooked up 4 720p digital TVs and
my source quality was much, much better set at 1080i. All the theories,
explanations, and arguments you hear will never provide an answer, but
your own eyes should. Use the same criteria as for whether you prefer
blondes or redheads to decide whether you should use a DVI connection or
component.


--
toto


 
Guest

In article <MPG.1b0457ac71a6ecc2989702@news.gwtc.net>,
Steve Rimberg <rsteve@world1.net> writes:
> In article <c7c2i1$u6i$8@news.iquest.net>, toor@iquest.net says...
>
>> No... You see the stationary detail in both phases of the interlace for
>> 1080i.
>
> And you get "judder" or "jitter."
>
Not really -- dynamic filters are cool (effectively, there is already
dynamic filtering going on in MPEG2 encoding.) When 'flicker' starts
to become possible, you increase the amount of interlace filtering.
Since the detail is aliased anyway, removing the motion artifacts is
actually almost information-neutral.

>
>> In essence, interlace is a tradeoff that trades-away temporal resolution
>> so as to provide better spatial resolution for a given frame scanning
>> structure.
>
> Interlace is an obsolete, 70 year old "technology" used to thin out the
> information stream to make it fit through a limited bandwidth pipeline.
>
Interlace exists, and it works. It might be suboptimal, but much less
suboptimal than 720p60 on filmed material where 1080i30 certainly can/does
look better. In the future, a little de-interlacing will help those
old film/tape archives continue to be valuable.

>
> Modern compression technology has eliminated the need for interlace.
>
So what? If you want the real 'ideal', then the filmed stuff should
be broadcast at 1080p24. 720p60 is silly on most material, considering
that it is filmed (or film-look.)



>> When comparing 1080i vs. 720p, it is also important to NOT forget the
>> 1280H pixels vs. the 1920H pixels.
>
> You keep repeating this falsehood for some reason, and I'll keep
> repeating the correction. As it exists now, 1080i is 1440 by 1080
> pixels.
>
You keep on forgetting the necessary rolloff (which isn't as necessary
when the sampling structure is 1920H instead of the very, very limited
1280H.) Either you don't realize, or are purposefully forgetting, that
the rolloff for 1280H usually has to start at effectively about 800 TVL
or less to avoid uglifying. Even if all of the 1920H doesn't exist,
the needed rolloff for avoiding sampling effects would typically start
being significant at the 1000-1100 TVL level.

>
> 1080i has the edge in horizontal resolution, but not by much.
>
> 720p has the edge in color resolution, in vertical resolution, and in
> temporal resolution.
>
If you look at real numbers, you'll find that for 'resolution' figures,
even in the best case, 720p is a toss-up. Temporal resolution is essentially
the ONLY advantage of 720p60. Also, if you have really ever seen 720p,
the effects of the early rolloff for the 1280H anti-aliasing make it
look like a 'fantastic 480p' instead of giving that 'window' 1080i or 1080p
effect.

>
>> Given the above, except for sports (esp for filmed material), 1080i is
>> the GENERALLY best format. This is most true for film, where information
>> can be reconstructed very nicely to give ALMOST 1080p type (not full, however)
>> performance.
>
> You've obviously never had to reconstruct an image shot interlaced in-
> camera. De-interlacers don't work well at all.
>
Plaeeze!!! 24fps material (be it 1080p24 or film) is easy to reconstruct
when it is played out on 1080i30 formats. The problem with DVDs is that
the proper flags aren't always used.

If you start with 60fps material, then 720p will work better for motion.
If you start with 24fps material (most scripted stuff), 1080i or 1080p
(depending upon display) is a better match. 720p is silly for 24fps
material, with no advantages.

If you start with 'video look', then it is best to match the recording
standards, where 1080i is probably the best all around except for motion,
and does give the best video look. For sports or scientific work (where
freeze frame is a primarily used feature), then 720p can be useful.

One more note: for broadcast HDTV, 1080i is a tight fit. High motion
and 1080i is sometimes a disaster, but not because of 1080i encoding
itself on the ATSC channel. It is because of the all-too-common
subchannels that force the 1080i to fit in 15 Mbps. That is very
marginal for 1080i. This gives a double bonus for sports on 720p60,
which tends to fit into the ATSC channel, even with a subchannel
or two.

John
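A rough sanity check on those figures (raw rate here assumes 8-bit 4:2:0 video, with 19.39 Mbps as the full ATSC payload rate and 15 Mbps as the subchannel scenario from the post):

```python
# Compression ratio needed to fit 1080i into an ATSC channel.
# 1920 x 540 luma samples per field, 60 fields/s, 8 bits/sample,
# and a 1.5x multiplier for the 4:2:0 chroma planes.
raw_bps = 1920 * 540 * 60 * 8 * 1.5

for label, budget_bps in [("full channel", 19.39e6), ("with subchannels", 15e6)]:
    print(f"{label}: about {raw_bps / budget_bps:.0f}:1 compression needed")
```

Squeezing roughly 50:1 out of MPEG2 at 15 Mbps, versus roughly 38:1 with the whole channel, is the sense in which high-motion 1080i gets "marginal".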
 
Guest

In article <MPG.1b04599fca3ca7b2989703@news.gwtc.net>,
Ron Malvern <rmlvrn@nospam.com> writes:
> In article <jeqi901sh5sjvoj261ontdqo63g7cqgapr@4ax.com>,
> thorogood@mailinator.com says...
>
>> In order of increasing quality, here is the
>> list of the most common modes:
>>
>> 480i (aka standard non-hd)
>> 480p
>> 1080i
>> 720p
>> 1080p
>
> I like your list. In addition to considerations involving picture
> quality, bandwidth considerations also enter into the equation.
> Of all the formats, 1080i is the biggest bandwidth hog. 1080/24p and
> 1080/30p are much more efficient in terms of bandwidth requirements.
>
It bothers me that the TV stations/networks don't use 1080p24 more often
for their playout. However, I suspect that mode-change issues
caused them to decide to use 1080i30.

Using 1080p24 would keep the 720p60 zealots from worrying about the
temporal resolution that JUST DOESN'T exist in most scripted material.
Using 1080p24 would also allow for easier (superficially higher quality)
implementation of flicker-free 1080p72 display. And the limited resolution
issues of 720p60 would be unimportant.

John
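The 24-to-72 mapping being suggested is just 3:3 pulldown, i.e. each film frame shown three times, with no judder:

```python
# 3:3 pulldown: repeat each 24 fps film frame three times to drive a
# 72 Hz flicker-free display, with every frame held for equal time
# (unlike the uneven 3:2 pulldown used for 60 Hz displays).
def pulldown_3_3(frames):
    return [f for f in frames for _ in range(3)]

second_of_film = list(range(24))       # one second of 24 fps frames
displayed = pulldown_3_3(second_of_film)
print(len(displayed))                  # 72 display refreshes per second
```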
 
Guest

In article <c7eeo5$1kae$2@news.iquest.net>, toor@iquest.net says...

> It bothers me that the TV stations/networks don't use 1080p24 more often
> for their playout.

It bothers me as well. 1080/24p is the best of the HDTV formats now
available, not only because of its resolution advantages, but also
because it has lesser bandwidth requirements. It is also more elegantly
manipulable for special effects during the post-production process. And
finally, it is more easily converted to the other formats in use around the
world (Europe's proposed 1080/25p and 1080/50i, along with existing PAL
50i).
Some cable networks are considering 1080/24p as a transmission format:
A&E is one such network, for its History Channel and other offerings. I
would be immensely grateful if they take that step forward. I've been
encouraging them to do so. They are one of our best customers.

> However, I suspect that the mode-change issues had
> caused them to decide to use 1080i30.

Most broadcasters ended up using 1080i because their most trusted
supplier, Sony, wouldn't sell them anything else. Sony was grossly
behind the technology curve when these decisions were made, but the
company can be quite stubborn about these things. Sony apparently had
an R&D investment in 1080i that it had to amortize.

> Using 1080p24 would keep the 720p60 zealots from worrying about the
> temporal resolution that JUST DOESN'T exist in most scripted material.

It does if the camera original was shot at 60p. But you are right when
you say there is no advantage when the camera original is 24p film.

> Using 1080p24 would also allow for easier (superficially higher quality)
> implementation of flicker free 1080p72 display.

Yes it would, and I'm looking forward to some experimentation along that
line with displays. I haven't seen any yet. But from my own experience
with computers, 72 frames per second seems to be a display rate that
satisfies a greater number of viewers than does 60.
 
Guest

Ron Malvern (rmlvrn@nospam.com) wrote in alt.tv.tech.hdtv:
> In article <c7eeo5$1kae$2@news.iquest.net>, toor@iquest.net says...
>
> > It bothers me that the TV stations/networks don't use 1080p24 more often
> > for their playout.
>
> It bothers me as well. 1080/24p is the best of the HDTV formats now
> available, not only because of its resolution advantages, but also
> because it has lesser bandwidth requirements.

Well, for film-source material anyway. I don't think anybody would recommend
anything less than 30p for sports or other live action, and then you get
into the 1080/60i vs. 720/60p issues, where 60p still only helps on a few
select shots, while the extra pixels on 1920x1080 tend to help all the time.

--
Jeff Rife |
SPAM bait: | http://www.nabs.net/Cartoons/Dilbert/TokenRing.gif
AskDOJ@usdoj.gov |
uce@ftc.gov |
 
Guest

"Steve Rimberg" <rsteve@world1.net> wrote in message
news:MPG.1b0457ac71a6ecc2989702@news.gwtc.net...
> You keep repeating this falsehood for some reason, and I'll keep
> repeating the correction. As it exists now, 1080i is 1440 by 1080
> pixels. 1080i has the edge in horizontal resolution, but not by much.
> 720p has the edge in color resolution, in vertical resolution, and in
> temporal resolution.

Why do you say this? The 16:9 standards are of course 1920x1080 and
1280x720. Are you suggesting that most current HDTV screens cannot, in
practice, achieve more than 1440 horizontal pixels of resolution? Even if
true: (a) technology will presumably improve this over time; and (b) the
full 1920-pixel horizontal resolution is still useful when zooming.
 
Guest

"John S. Dyson" <toor@iquest.net> wrote in message
news:c7eeo5$1kae$2@news.iquest.net...
>
> It bothers me that the TV stations/networks don't use 1080p24 more often
> for their playout. However, I suspect that the mode-change issues had
> caused them to decide to use 1080i30.
>
> Using 1080p24 would keep the 720p60 zealots from worrying about the
> temporal resolution that JUST DOESN'T exist in most scripted material.
> Using 1080p24 would also allow for easier (superficially higher quality)
> implementation of flicker free 1080p72 display. And, the limited
> resolution issues of 720p60 would be unimportant.

Totally agreed. 1080p24 would look fantastic for movies, providing the best
possible quality and efficiency of any HD standard, without having to worry
about end-user de-interlacing (which in practice may be fairly difficult to
do correctly even if the original source is film, unless HDTV handles this
considerably better than DVD does). The idea of a 72Hz monitor is also
hugely appealing. Maybe after more 1080p monitors become available, this
will be considered more seriously.
 
Guest

"Jeff Rife" <wevsr@nabs.net> wrote in message
news:MPG.1b04cb1493300d3298b3fd@news.nabs.net...
> Ron Malvern (rmlvrn@nospam.com) wrote in alt.tv.tech.hdtv:
> > In article <c7eeo5$1kae$2@news.iquest.net>, toor@iquest.net says...
> >
> > > It bothers me that the TV stations/networks don't use 1080p24 more often
> > > for their playout.
> >
> > It bothers me as well. 1080/24p is the best of the HDTV formats now
> > available, not only because of its resolution advantages, but also
> > because it has lesser bandwidth requirements.
>
> Well, for film-source material anyway. I don't think anybody would recommend
> anything less than 30p for sports or other live action, and then you get
> into the 1080/60i vs. 720/60p issues, where 60p still only helps on a few
> select shots, while the extra pixels on 1920x1080 tend to help all the time.

Agreed. In fact, I wouldn't want anything less than the 48-60fps range
(whether progressive or interlaced) for sports or other dynamic motion. Even
30fps is on the low end of framerates. While interlace has its problems,
there is a substantial benefit in at least half the image changing every
1/60th of a second, which keeps it from looking too choppy.
 
Guest

"toto" <toto.15uyby@news.satelliteguys.us> wrote in message
news:toto.15uyby@news.satelliteguys.us...
>
> Mark, the first response was the best suggestion. Try both on both tvs
> and see which works best for you. I hooked up 4 720P digital tvs and
> my source quality was much much better set at 1080i. All the theories
> explanations and arguments you hear will never provide an answer, but
> your own eyes should. Use the same criteria on whether you prefer
> blondes or red heads whether you should use DVI connection or
> component.

There is one reason that 1080i may look better on even a 480p set (or 720p)
than the "native" 480p signal on those sets, and that is color resolution.
Since HDTV signals encode color at half the resolution of brightness (luma)
in each direction, if you display a 1080 signal on a lower-resolution
screen, there will be an improvement in color resolution on those screens
compared to a lower-resolution signal on the same monitor. The luma will be
resolution-limited by the display device, but the chroma can be displayed at
nearly its full resolution on even a 480-line display.

Usually scaling a high-res signal down isn't too much of a problem (with
appropriate filtering), but scaling a lower-resolution signal up often leads
to obvious problems.
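A quick sketch of the chroma arithmetic behind this (assuming 4:2:0 subsampling, i.e. color at half the luma resolution in each direction):

```python
# 4:2:0 chroma planes are half the luma resolution in each direction.
def chroma_plane(luma_w, luma_h):
    """Dimensions of each chroma plane for a 4:2:0 signal."""
    return luma_w // 2, luma_h // 2

# The chroma of a 1080-line signal still exceeds what a 480-line
# display can show, so color detail survives the downscale intact.
print(chroma_plane(1920, 1080))  # (960, 540) -- more than 480 lines
print(chroma_plane(704, 480))    # (352, 240) -- SD chroma is far coarser
```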
 

Poldy

In article <MPG.1b04cb1493300d3298b3fd@news.nabs.net>,
Jeff Rife <wevsr@nabs.net> wrote:

> Ron Malvern (rmlvrn@nospam.com) wrote in alt.tv.tech.hdtv:
> > In article <c7eeo5$1kae$2@news.iquest.net>, toor@iquest.net says...
> >
> > > It bothers me that the TV stations/networks don't use 1080p24 more often
> > > for their playout.
> >
> > It bothers me as well. 1080/24p is the best of the HDTV formats now
> > available, not only because of its resolution advantages, but also
> > because it has lesser bandwidth requirements.
>
> Well, for film-source material anyway. I don't think anybody would recommend
> anything less than 30p for sports or other live action, and then you get
> into the 1080/60i vs. 720/60p issues, where 60p still only helps on a few
> select shots, while the extra pixels on 1920x1080 tend to help all the time.

Wait, are there any sets capable of making use of 1080p?

Also, the issue seems to be that a lot of local stations are not using
the full bandwidth for various reasons, the most prominent of them being
to multicast with secondary channels.
 
Guest

poldy (poldy@kfu.com) wrote in alt.tv.tech.hdtv:
> > Well, for film-source material anyway. I don't think anybody would recommend
> > anything less than 30p for sports or other live action, and then you get
> > into the 1080/60i vs. 720/60p issues, where 60p still only helps on a few
> > select shots, while the extra pixels on 1920x1080 tend to help all the time.
>
> Wait, are there any sets capable of making use of 1080p?

My computer monitor does it fine right now, and by the end of the year
there will be 1920x1080 DLP sets that will do progressive scan.

> Also, the issue seems to be that a lot of local stations are not using
> the full bandwidth for various reasons, the most prominent of them being
> to multicast with secondary channels.

I don't know about "a lot", but of the 5 local stations that do HD (ABC,
CBS, NBC, PBS, WB), only PBS and ABC multicast. Since ABC does 720p,
the 3Mbps weather radar doesn't do much to the quality of the HD. PBS
does suffer a bit.

Otherwise, the stations are using full bitrate (at least 18Mbps) for the
single stream. Even the local Fox does this with their 480p (which gives
a very nice picture, although not HD).

I have heard that some stations do stupid multicasting in that they send
both the HD and a 480i signal of the *same* programming. Since every
ATSC receiver can receive and decode the HD version, it's just a stupid
waste of bandwidth.

--
Jeff Rife | "She just dropped by to remind me that my life
SPAM bait: | is an endless purgatory, interrupted by profound
AskDOJ@usdoj.gov | moments of misery."
uce@ftc.gov | -- Richard Karinsky, "Caroline in the City"
 

Poldy

In article <MPG.1b05a2c1e3a5d46198b400@news.nabs.net>,
Jeff Rife <wevsr@nabs.net> wrote:

> > Wait, are there any sets capable of making use of 1080p?
>
> My computer monitor does it fine right now, and by the end of the year
> there will be 1920x1080 DLP sets that will do progressive scan.

So that would be a very small segment of the installed base which would
have such sets.

How would 1080p downconverted to 1080i look?