Should I use 720p or 1080i?

Archived from groups: alt.tv.tech.hdtv

poldy (poldy@kfu.com) wrote in alt.tv.tech.hdtv:
> How would 1080p downconverted to 1080i look?

It depends on which flavor of 1080p. 1920x1080/24p sourced from film
material would look pretty much the same on either a 1080/60p or 1080/60i
display. The 1080/60p display would be slightly better, but not much,
because (again) there just isn't enough temporal information in 24p to
come close to stressing 60i. The same would be true of just about any
flavor of 1080p sourced from 24p film.

Native 1920x1080/30p would again look about the same on 1080/60p or
1080/60i displays, with the advantage (a bit more than the 24p sourced
material) to the 60p display.

If we ever get a 1920x1080/60p source (it's not a legal ATSC mode, so it's
unlikely to show up in *any* transmission), then that's the only one that
would look much worse on a 1080i display.

--
Jeff Rife | "Five thousand dollars, huh? I'll bet we could
SPAM bait: | afford that if we pooled our money together...
AskDOJ@usdoj.gov | bought a gun...robbed a bank...."
uce@ftc.gov | -- Drew Carey
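
A quick sketch of the cadence behind Jeff's point (illustrative Python, not anything from the thread): classic 3:2 pulldown maps each pair of 24p film frames onto five 60i fields, so every field still comes from exactly one film frame.

def pulldown_32(frames):
    """Map 24p frames to a 60i field sequence with the 3:2 cadence."""
    fields = []
    for i, frame in enumerate(frames):
        count = 3 if i % 2 == 0 else 2        # 3 fields, then 2, repeating
        for _ in range(count):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

# 4 film frames -> 10 fields, i.e. 24 frames/s -> 60 fields/s. Every field
# is drawn from a single film frame, which is why a 24p source can be
# reconstructed almost perfectly from a 1080i30 transport.
for frame, parity in pulldown_32(["A", "B", "C", "D"]):
    print(frame, parity)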
 
Archived from groups: alt.tv.tech.hdtv

This has been a very interesting thread so far.
Could somebody be kind enough to explain a little bit more about the
"rolloff" and the factors contributing to it?

"John S. Dyson" <toor@iquest.net> wrote in message
news:c7eej4$1kae$1@news.iquest.net...
> In article <MPG.1b0457ac71a6ecc2989702@news.gwtc.net>,
> Steve Rimberg <rsteve@world1.net> writes:
> > In article <c7c2i1$u6i$8@news.iquest.net>, toor@iquest.net says...
> >
> >> No... You see the stationary detail in both phases of the interlace for
> >> 1080i.
> >
> > And you get "judder" or "jitter."
> >
> Not really -- dynamic filters are cool (effectively, there is already
> dynamic filtering going on in MPEG2 encoding.) When the 'flicker'
> starts being possible, then you increase the amount of interlace
> filtering. Since the detail is aliased anyway, removing the motion
> artifacts is actually almost information neutral.
>
> >
> >> In essence, interlace is a tradeoff that trades-away temporal resolution
> >> so as to provide better spatial resolution for a given frame scanning
> >> structure.
> >
> > Interlace is an obsolete, 70-year-old "technology" used to thin out the
> > information stream to make it fit through a limited bandwidth pipeline.
> >
> Interlace exists, and it works. It might be suboptimal, but much less
> suboptimal than 720p60 on filmed material where 1080i30 certainly can/does
> look better. In the future, a little de-interlacing will help those
> old film/tape archives continue to be valuable.
>
> >
> > Modern compression technology has eliminated the need for interlace.
> >
> So what? If you want the real 'ideal', then the filmed stuff should
> be broadcast at 1080p24. 720p60 is silly on most material, considering
> that it is filmed (or film-look.)
>
>
>
> >> When comparing 1080i vs. 720p, it is also important to NOT forget the
> >> 1280H pixels vs. the 1920H pixels.
> >
> > You keep repeating this falsehood for some reason, and I'll keep
> > repeating the correction. As it exists now, 1080i is 1440 by 1080
> > pixels.
> >
> You keep on forgetting the necessary rolloff (which isn't as necessary
> when the sampling structure is 1920H instead of the very very limited
> 1280H.) Either you don't realize, or are purposefully forgetting that
> the rolloff for 1280H usually has to start at about effectively 800 or
> less TVL to avoid uglifying. Even if all of the 1920H doesn't exist,
> the needed rolloff for avoiding sampling effects would typically start
> being significant at the 1000-1100TVL level.
>
> >
> > 1080i has the edge in horizontal resolution, but not by much.
> >
> > 720p has the edge in color resolution, in vertical resolution, and in
> > temporal resolution.
> >
> If you look at real numbers, you'll find that for 'resolution' figures,
> even in the best case, 720p is a toss-up. Temporal resolution is essentially
> the ONLY advantage of 720p60. Also, if you really have ever seen 720p,
> the effects of the early rolloff for the 1280H anti-aliasing make it
> look like a 'fantastic 480p' instead of that 'window' 1080i or 1080p
> effect.
>
> >
> >> Given the above, except for sports (esp for filmed material), 1080i is
> >> the GENERALLY best format. This is most true for film, where information
> >> can be reconstructed very nicely to give ALMOST 1080p type (not full, however)
> >> performance.
> >
> > You've obviously never had to reconstruct an image shot interlaced in-
> > camera. De-interlacers don't work well at all.
> >
> Plaeeze!!! 24fps material (be it 1080p24 or film) is easy to reconstruct
> when it is played out on 1080i30 formats. The problem with DVDs is that
> the proper flags aren't always used.
>
> If you start with 60fps material, then 720p will work better for motion.
> If you start with 24fps material (most scripted stuff), 1080i or 1080p
> (depending upon display) is a better match. 720p is silly for 24fps
> material, with no advantages.
>
> If you start with 'video look', then it is best to match the recording
> standards, where 1080i is probably the best all around except for motion,
> and does give the best video look. For sports or scientific work (where
> freeze frame is a primarily used feature), then 720p can be useful.
>
> One more note: for broadcast HDTV, 1080i is a tight fit. High motion
> and 1080i is a disaster sometimes, but not because of 1080i encoding
> itself on the ATSC channel. It is because of the all-too-common
> subchannels that force the 1080i to fit in 15 Mbps.... That is very
> marginal for 1080i. This gives a double bonus for sports on 720p60,
> where it tends to fit into the ATSC channel, even with a subchannel
> or two.
>
> John
>
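
A back-of-the-envelope check on the rolloff figures above (illustrative Python; it assumes the usual convention of counting horizontal resolution in TV lines per picture height, so a 16:9 raster scales pixel counts by 9/16):

# TVL-per-picture-height ceilings implied by the two sampling structures.
for h_pixels in (1280, 1920):
    nyquist_tvl = h_pixels * 9 / 16          # 16:9 raster: scale by 9/16
    print(f"{h_pixels}H: theoretical ceiling ~{nyquist_tvl:.0f} TVL")

# 1280H -> ~720 TVL and 1920H -> ~1080 TVL. A realizable anti-alias filter
# cannot brick-wall at the ceiling, so the response has to start falling
# well below it -- consistent with the ~800 TVL (1280H) and ~1000-1100 TVL
# (1920H) starting points given above.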
 
Archived from groups: alt.tv.tech.hdtv

I have a front projection CRT display that can display both 1080i and 720p.
It is connected to an HD cable box which outputs both 1080i and 720p
depending on the original source format. (Thus there is no conversion
happening at my end.) Here are my observations:
1. 720p may require less bandwidth due to better MPEG compression of a
progressive signal; however, it requires a display capable of synching to a
MUCH higher frequency than 1080i (rough numbers after this list).
2. All of the material that I have viewed looks better at 1080i than at
720p.
3. Both 1080i and 720p blow 480p (progressive scan DVD quality) away.
4. 1080i looks better with HD video sources than with film sources. (This
may be due to the fact that my equipment doesn't compensate well for the 3/2
pull-down artifacts you get when you convert a 24fps film to a 30fps
signal.)
5. I watched all of the recent major sporting events (Super Bowl, the NCAA
tournament, etc) in HD and have zero complaints about picture quality.
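
The rough numbers behind observation 1 (illustrative Python; the line totals are the standard SMPTE frame structures): a CRT's horizontal scan rate is total lines per frame times frames per second.

formats = {
    "480p60":  (525, 60),
    "1080i30": (1125, 30),   # interlaced: a full frame 30 times a second
    "720p60":  (750, 60),    # progressive: a full frame 60 times a second
}
for name, (total_lines, frame_rate) in formats.items():
    print(f"{name}: {total_lines * frame_rate / 1000:.2f} kHz line rate")

# 1080i30 -> 33.75 kHz, but 720p60 -> 45.00 kHz: the 720p deflection
# circuitry must run about a third faster, hence the stiffer sync demand.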


"MarkW" <markwco(removenospam)@comcast.net> wrote in message
news:6oli90d6taheg95i05pu5fvlmqbvbocm47@4ax.com...
> I am going to be picking up my HD TIVO today and am curious if it's
> best to use 720p or 1080 for HDTVs? 1080 sounds far better but then
> again is 720 more like 1440, or is simply having progressive better than
> interlaced? Or is it simply a matter of opinion? I figured I'd stick
> with one in my menu since I'll be using 480i on another TV and just
> wonder which to go with. I'm not sure if it depends on what you're
> watching or the TV you have, but as for the TVs, I'll be using this
> with two TVs:
> Sony KF-42WE610
> Sony KF-50XBR800
 
Archived from groups: alt.tv.tech.hdtv

In article <8X7nc.110201$207.1414989@wagner.videotron.net>,
"jean dumont" <jean.dumont@videotron.ca> writes:
> This has been a very interesting thread so far.
> Could somebody be kind enough to explain a little bit more about the
> "rolloff" and the factors contributing to it?
>
Okay, I am pure 'techie', but I'll REALLY try to explain.

VERY NONTECHNICAL:
In a video sense, high frequency rolloff is a 'decrease' of detail,
or sharpness for sharp edges. When the edges are very sharp, the
rolloff will tend to spread out the edge, and make it look a little
wider. When details are spread out, then the even smaller details
are smoothed into nothingness.

Slightly more technical:
There are different ways of doing 'rolloff' necessary for different
purposes. One (straightforward) way of doing the 'rolloff' is to
electronically smooth the signal with a linear phase filter (one with
constant time delay vs. frequency, otherwise the spreading will be
more severe than need be.) Other ways to process the signal is to
keep the fine details, but to intensively 'chop off' the strength
of the high frequency (sharp edges) and spread only the strongest
transistions. This can help to preserve SOME of the fine detail,
but also help to shape the frequency spectrum.

Even more technical:
MPEG2 encoding and even most digitized video requires pre-filtering
before the digitization process, because if too much high frequency
detail is presented, then large area beat products appear (like
moire effects.) At the smaller scale, these effects look like
stairstepping. MPEG encoding by itself, even if presented with
properly bandlimited video, can actually produce these aliasing
effects when it has to chop too much detail from the signal (to
fit within proper bandwidth.) So, MPEG encoding can actually produce
ALIASING!!!

Anyway, this is more than you asked for, and probably not enough of
what you really need. However, I am a pure techie, with the tech
mentality for the last 40yrs (and I am slightly older than that.)
I am unable to think in 'non technical' terms without duress :).

John
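
A toy demonstration of the aliasing John describes (illustrative Python using numpy; a one-dimensional stand-in for a scanline, not any real broadcast chain):

import numpy as np

fine = np.arange(4096)
scanline = np.sin(2 * np.pi * 0.45 * fine)    # detail too fine for the target grid

def smooth(x, taps=31):
    kernel = np.hanning(taps)
    kernel /= kernel.sum()                    # gentle linear-phase lowpass
    return np.convolve(x, kernel, mode="same")

raw = scanline[::8]                           # sample with no pre-filter
prefiltered = smooth(scanline)[::8]           # 'rolloff' first, then sample

print("RMS without pre-filter:", np.sqrt(np.mean(raw ** 2)))
print("RMS with pre-filter:   ", np.sqrt(np.mean(prefiltered ** 2)))
# Without the filter the detail survives at nearly full strength but at a
# FALSE low frequency (the moire/beat effect). With the filter the too-fine
# detail is simply smoothed away -- the 'softening' tradeoff.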
 
Archived from groups: alt.tv.tech.hdtv

Thanks for taking the time to explain, John.

So, if I understand you correctly, rolloff is used to prevent artifacts that
would occur during the MPEG2 compression process, or are there other purposes
for it?

Would the MPEG-produced aliasing account for the rainbow effect that we see
sometimes with fine details (ties, jackets etc...) over digital cable?
I always assumed this was a comb filter issue (thinking the signal might
have been compressed down to composite along the distribution line).

"John S. Dyson" <toor@iquest.net> wrote in message
news:c7jbi0$3bn$2@news.iquest.net...
> [snip]
 
Archived from groups: alt.tv.tech.hdtv

"Brian K. White" <nospam@foxfire74.com> wrote in message news:<Vr9nc.2579$GL4.94@fe2.columbus.rr.com>...
> [snip]


I'm just going to jump in here and say something.
I think the only format that makes sense is 1920x1080(24p).
Now all I want to see are 16:9 CRT VGA screens. It shouldn't be hard
at all.
I don't know why they aren't around because you know people get all
excited when they see that 16:9 ratio. And it could play
1920x1080(24p). It wouldn't cost that much either. Less than $300.
 
Archived from groups: alt.tv.tech.hdtv

In article <e5bnc.90531$j11.1722178@weber.videotron.net>,
"jean dumont" <jean.dumont@videotron.ca> writes:
> Thanks for taking the time to explain, John.
>
> So, if I understand you correctly, rolloff is used to prevent artifacts that
> would occur during the MPEG2 compression process, or are there other purposes
> for it?
>
The 'rolloff' or 'anti-aliasing' is helpful to avoid artifacts during
the MPEG2 compression process and EVEN the digitization process. If you
sample an analog signal that has too much detail, the result will contain
lots of weird, sometimes incomprehensible effects. When there is only
slightly too much detail for the sampling rate, the effect will look
like moire, extremely harsh details, perhaps mild banding. When there
is far too much detail, the large scale artifacts and banding, along
with extreme moire can appear.

>
> Would the MPEG-produced aliasing account for the rainbow effect that we see
> sometimes with fine details (ties, jackets etc...) over digital cable?
>
That 'rainbow' effect with luma interfering with chroma is due to
the fact that composite video was in the signal path. The composite
to component (or s-video) converters often leave the interfering
luma to mix with the chroma, and that does produce color flashing.

However, in a very technical sense, that effect is a cousin to the
moire or other effects from undersampling. (Given an ideal composite
decoder, when luma detail actually does spill into the chroma spectrum,
that would be somewhat similar in some regards to other sampling
effects.)

>
> I always assumed this was a comb filter issue (thinking the signal might
> have been compressed down to composite along the distribution line).
>
You were likely correct regarding that issue.

Raw MPEG2 (or undersampling) artifacts tend not to look like composite
decoding artifacts. You'll not see chroma magically appearing based
upon luma signals in MPEG2. One possible artifact might include too
much luma or chroma causing artifacts in the other domain -- because
of too little payload capability (squeezing too much data into too
small a pipeline causes the MPEG2 encoder to make quality tradeoffs.)

John
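
A toy model of that cross-color mechanism (illustrative Python with NTSC-ish numbers; the naive band-split below is far cruder than any real comb filter):

import numpy as np

FSC = 3.579545e6             # NTSC chroma subcarrier (Hz)
FS = 4 * FSC                 # sample rate: the common 4x-subcarrier choice

t = np.arange(8192) / FS
# Luma-only content: fine stripes (a patterned tie or jacket) whose
# frequency happens to land near the subcarrier.
luma = 0.5 + 0.3 * np.sin(2 * np.pi * 3.5e6 * t)

# Naive decoder: everything in a band around the subcarrier is 'chroma'.
spectrum = np.fft.rfft(luma)
freqs = np.fft.rfftfreq(luma.size, d=1 / FS)
chroma_band = (freqs > FSC - 0.6e6) & (freqs < FSC + 0.6e6)

ac_energy = np.sum(np.abs(spectrum[1:]) ** 2)         # skip the DC term
misread = np.sum(np.abs(spectrum[chroma_band]) ** 2)
print(f"AC energy misread as chroma: {misread / ac_energy:.0%}")
# Nearly all of the stripe energy lands in the chroma band, so the decoder
# paints false shimmering color over the luma detail -- the rainbow effect.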
 
Archived from groups: alt.tv.tech.hdtv

Thanks for the explanations, John.

I presume that if the sampling rate were high enough, no rolloff would
be necessary.
If this is true, then the bottom line is that 19 Mbits/s is not a high enough
bitrate to allow for the full resolution of 1920x1080i or even 1280x720p?
Why not allocate enough bandwidth to do without rolloff?



"John S. Dyson" <toor@iquest.net> wrote in message
news:c7jic8$56b$2@news.iquest.net...
> [snip]

Poldy
Archived from groups: alt.tv.tech.hdtv

In article <MPG.1b063deaa821c11e98b405@news.nabs.net>,
Jeff Rife <wevsr@nabs.net> wrote:

> If we ever get a 1920x1080/60p source (it's not a legal ATSC mode, so it's
> unlikely to show up in *any* transmission), then that's the only one that
> would look much worse on a 1080i display.

There are zero prospects for this as a broadcast format, right?

Unless they re-organized the spectrum, which everybody wants to use for
other services. So not likely that they'd allocate more bandwidth for
TV.

And neither HD DVD nor Blu-ray has indicated which resolutions would be
supported in their formats. If they can produce formats and content
that reproduce all the 1080p formats, you would hope they'd do it,
rather than settling for just the ATSC formats which are designed for a
more bandwidth-limited medium.
 
Archived from groups: alt.tv.tech.hdtv

poldy (poldy@kfu.com) wrote in alt.tv.tech.hdtv:
> > If we ever get a 1920x1080/60p source (it's not a legal ATSC mode, so it's
> > unlikely to show up in *any* transmission), then that's the only one that
> > would look much worse on a 1080i display.
>
> There are zero prospects for this as a broadcast format, right?

For OTA, in the US, for the foreseeable future, yes, the chance is zero.

It could be seen via cable or DBS, but unless people build displays that
accept it as input, there's no reason to transmit it.

> Unless they re-organized the spectrum, which everybody wants to use for
> other services. So not likely that they'd allocate more bandwidth for
> TV.

*Technically*, a change in the ATSC standard could allow 1080/60p in the
current spectrum. But, 1080/60p would have some real problems at 19Mbps...
sort of like what DVD would look like if the absolute maximum bitrate was
5Mbps.

So, it ain't gonna happen.

--
Jeff Rife |
SPAM bait: | http://www.netfunny.com/rhf/jokes/99/Apr/columbine.html
AskDOJ@usdoj.gov |
uce@ftc.gov |
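
The arithmetic behind Jeff's analogy, as a quick sketch (illustrative Python; bits-per-pixel is a crude proxy, since real MPEG2 efficiency varies with content):

# Bits per pixel at the stated bitrates (width, height, frames/s, bits/s).
cases = [
    ("DVD 480i at a typical 5 Mbps",    720, 480, 30, 5e6),
    ("ATSC 1080i30 at 19 Mbps",        1920, 1080, 30, 19e6),
    ("hypothetical 1080p60 at 19 Mbps", 1920, 1080, 60, 19e6),
]
for name, w, h, fps, bitrate in cases:
    print(f"{name}: {bitrate / (w * h * fps):.2f} bits/pixel")

# 1080p60 squeezed into the same 19 Mbps channel gets roughly a third of
# DVD's typical per-pixel budget -- Jeff's point in numbers.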
 
Archived from groups: alt.tv.tech.hdtv

In article <poldy-934BFA.14265508052004@netnews.comcast.net>,
poldy <poldy@kfu.com> writes:
> In article <MPG.1b063deaa821c11e98b405@news.nabs.net>,
> Jeff Rife <wevsr@nabs.net> wrote:
>
>> If we ever get a 1920x1080/60p source (it's not a legal ATSC mode, so it's
>> unlikely to show up in *any* transmission), then that's the only one that
>> would look much worse on a 1080i display.
>
> There are zero prospects for this as a broadcast format, right?
>
> Unless they re-organized the spectrum, which everybody wants to use for
> other services. So not likely that they'd allocate more bandwidth for
> TV.
>
Most likely, we won't see 1080p60 as a broadcast format because
of the reasons that you state, and the diminishing beneficial
returns except on very large screens. For scripted material,
it is most likely originally 24p anyway, so 30i transport with
24p source material can allow for fairly good reconstruction
to the 24p quality.

For sports material, it would be nice to have 1080p60, but the
real benefits are likely minimal over and above 720p60.

In reality, the biggest (by far) problem with 1080i30 in the
US ATSC system is for sports where there is too much movement
and the available signal payload capability is insufficient.
The interlace itself isn't the culprit. If there were perhaps
25 Mbps instead of only (most likely) 15-16 Mbps, then the
uglification of the sports when in 1080i30 wouldn't be so
problematic.

SOOO, 720p60 makes a really good tradeoff: there is less
spatial detail visible (especially for live broadcasts), but the
payload needs are much lower given numerous technical factors,
and 15 Mbps does a good job for 720p60.

Where 1080i30 dies horribly is in NFL broadcasts where there
is a lot of moving crowd in the background while the camera
is moving. This artifacting is NOT due to interlace per se,
but interlace MIGHT contribute to it. The artifacting is
due to inadequate payload capability.

John
 
Archived from groups: alt.tv.tech.hdtv

In article <dCfnc.101577$j11.1917930@weber.videotron.net>,
"jean dumont" <jean.dumont@videotron.ca> writes:
> Thanks for the explanations, John.
>
> I presume that if the sampling rate were high enough, no rolloff would
> be necessary.
>
Well - the problem is that for various reasons, the sampling rate
actually implies the sampling structure. The 1920H pixels and
1080V pixels define the sampling structure, and the entire structure
is updated every 1/30 or 1/60 of a second (or in Europe, 1/25 or 1/50.)
(Just thought of using the term 'softening' for 'rolloff.' Maybe
that will help :)).

So, the 1920H pixel count describes one of the parameters of the sampling
structure, and if there is detail that overruns that structure
(and actually, mathematically that happens more often than one
might think), then the artifacts occur.

The frequency spectrum of the signal can wrap around the sampling
frequency much more easily than one might think. So, the pre-filtering
before digitization and encoding is quite important. Actually,
some equipment can ASSUME that the input spectrum won't wrap,
but a general purpose digitization device will have to do some
filtering.

(My numbers below are necessarily very approximate, but do give
a sense of the disadvantage in detail that a 1280H sampling
structure gives. Even if there isn't a lot of detail at the 1920H
level, and the detail has diminished significantly by the 1440H
range, the necessary rolloff doesn't have to be significant
at important frequencies like the 1280H structure would require.)

It probably doesn't take a large amount of filtering for video
to mitigate the artifacting. Audio is probably much more critical.
However, video filters are necessarily 'gentle' (they have slow
changes in response versus frequency), and so to make sure that
the video signal is -20dB at the sampling frequency, then the
rolloff might have to be significant at 1/2 or lower frequency. (e.g.
to make sure that video response is -20dB at 1280H to help mitigate
the aliasing effects, then it might be necessary to start rolling
off at relatively low spatial frequencies (maybe 500H equivalent),
and perhaps 6 or 10dB down at 800-900H equivalent.) The needed
rolloff starts proportionally higher in frequency for the 1920H structure,
and requires significantly less 'softening' of the signal.


>
> If this is true, then the bottom line is that 19 Mbits/s is not a high enough
> bitrate to allow for the full resolution of 1920x1080i or even 1280x720p?
> Why not allocate enough bandwidth to do without rolloff?
>
That 19 Mbps is related to the ENCODED bandwidth, and only indirectly related
to the sampling structure. The reason why MPEG2 has trouble fitting
into 19 Mbps (especially trouble at 15 Mbps or so) is that MPEG2 can only
remove so much spatial and temporal detail (and coefficients) before the
image quality suffers.

John
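
A sketch of the 'gentle filter' constraint John keeps returning to (illustrative Python; the kernel and H-equivalents are stand-ins, modeled on a 4x-oversampled grid so frequencies near the 1280H sampling structure can be evaluated):

import numpy as np

OVERSAMPLE = 4                         # fine grid = 4x the 1280H structure
kernel = np.hanning(9)                 # short, therefore gentle, linear phase
kernel /= kernel.sum()                 # unity gain at DC

def response_db(h_equiv, h_structure=1280):
    # h_structure corresponds to the target sampling frequency, which is
    # 1/OVERSAMPLE cycles per fine-grid sample.
    f = (h_equiv / h_structure) / OVERSAMPLE
    w = np.exp(-2j * np.pi * f * np.arange(kernel.size))
    return 20 * np.log10(abs(np.sum(kernel * w)))

for h in (500, 800, 1100):
    print(f"{h:>4}H equivalent: {response_db(h):6.1f} dB")

# Roughly -4 dB at 500H, -10 dB at 800H, and past -20 dB by 1100H: a filter
# gentle enough to avoid ringing is already softening mid-band detail in
# order to reach strong attenuation near the 1280H structure. Rescaled to
# a 1920H structure, the same curve sits 1.5x further out, so visible
# detail keeps more of its strength.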
 
Archived from groups: alt.tv.tech.hdtv

Jumpin' Murphy! Thank you, group! Finally an intelligent post with useful
information and no flaming!

Jodster
<okner@newshosting.com> wrote in message
news:19c13a19.0405081903.4bd8b392@posting.google.com...
> "Brian K. White" <nospam@foxfire74.com> wrote in message
news:<Vr9nc.2579$GL4.94@fe2.columbus.rr.com>...
> > I have a front projection CRT display that can display both 1080i and
720p.
> > It is connected to a HD cable box which outputs both 1080i and 720p
> > depending on the original source format. (Thus there is no conversion
> > happening at my end.) Here are my observations:
> > 1. 720p may required less bandwidth due to better MPEG compression of a
> > progressive signal, however, it requires a display capable of synching
to a
> > MUCH higher frequency than 1080i.
> > 2. All of the material that I have viewed looks better at 1080i than at
> > 720p.
> > 3. Both 1080i and 720p blow 480p (progressive scan DVD quality) away.
> > 4. 1080i looks better with HD video sources than with film sources.
(This
> > may be due to the fact that my equipment doesn't compensate well for the
3/2
> > pull-down artifacts you get when you convert a 24fps film to a 30fps
> > signal.)
> > 5. I watched all of the recent major sporting events (Super Bowl, the
NCAA
> > tournament, etc) in HD and have zero complaints about picture quality.
> >
> >
> > "MarkW" <markwco(removenospam)@comcast.net> wrote in message
> > news:6oli90d6taheg95i05pu5fvlmqbvbocm47@4ax.com...
> > > I am going to be picking up my HD TIVO today and am curious if it's
> > > best to use 720p or 1080 for HDTVs? 1080 sounds far better but then
> > > again is 720 more like 1440 or is simply have progressive better than
> > > interlaced? Or is it simply a matter of opinion? I figured I'd stick
> > > with one in my menu since I'll be using 480i on another TV and just
> > > wonder which to go with. I'm not sure if it depends on what you're
> > > watching or the TV you have but as for the TV's, I'll be using this
> > > with two TV's
> > > Sony KF-42WE610
> > > Sony KF-50XBR800
>
>
> I'm just going to jump in here and say something.
> I think the only format that makes sense is 1920x1080(24p).
> Now all I want to see are 16:9 crt vga screens. It shouldn't be hard
> at all.
> I don't know why they aren't around because you know people get all
> excited when they see that 16:9 ratio. And it could play
> 1920x1080(24p). It wouldn't cost that much either. Less than $300.