Archived from groups: alt.tv.tech.hdtv
Frank Provasek wrote:
> You have been wasting my time for a week, denying that
> conversion looks bad because
> of the frame rate problems, because "if that were the only
> source of the problem, then everything would
> become crystal clear in stationary scenes." Which, if you look at
> a raw output, IT DOES,
You seem to be trying to claim that I denied that motion blur from the
frame rate change was a major problem, which I did not. The issue is
that you denied the existence of any problem OTHER than motion blur,
saying it was the sole issue.
> and since a picture that resharpens
> and blurs when motion stops and starts is MORE annoying, a
> gaussian blur is applied to the motionless
> scenes to SUBJECTIVELY even out the "look." Then you argue
> about THAT, but later say
> " The artificial blur applied to pal/ntsc conversion is
> INDEPENDENT of motion blur!"
>
> Yes ..that is exactly what I said...it's an artificial GAUSSIAN
> blur to SUBJECTIVELY match the motion blur.
No, that's not what you said; you said that the gaussian blur is put in
to MATCH the motion blur. You now say that the blur is meant to
"subjectively" resemble the motion blur, which to me looks like backing
off quite a bit from your original wording. This is much more
plausible... but don't claim that's what either of us previously said.
Your use of the word "independent" seems to be in a much narrower sense
than mine... though it gets closer once you add "subjective". And
since you already made one response to my objection to the idea that
they match, I wonder why you didn't clarify this last time, if a rough
subjective resemblance was all you meant to assert.
All of which is a distraction from the actual point of discussion,
which is whether there is a blurring issue OTHER THAN motion blur. My
assertion was that there's a separate sharpness issue caused by the
interpolative resolution change, which requires some blurring to cover
it up, at least in cases where the original image isn't antialiased to
perfection (which it generally is not, especially if the source comes
directly from a video camera). You've left that question out entirely.
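To make concrete what I mean by interpolative softening, here's a quick
sketch of my own (plain Python, not anything from actual conversion
hardware, which would use fancier filters): resample a hard edge onto a
line grid whose sample points fall between the original ones, and some
output samples land partway between black and white. That intermediate
value IS the softening, and it appears whether or not anything is moving.

```python
def resample_linear(samples, new_len):
    """Resample a 1-D signal to new_len points by linear interpolation,
    as in a (simplified) scan-line count conversion."""
    n = len(samples)
    out = []
    for i in range(new_len):
        # Position of output sample i in the input's coordinate system.
        pos = i * (n - 1) / (new_len - 1)
        lo = int(pos)
        frac = pos - lo
        hi = min(lo + 1, n - 1)
        out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
    return out

# A hard edge across scan lines: three black lines, then three white.
edge = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
resampled = resample_linear(edge, 7)
# Some resampled values now sit strictly between 0 and 1:
# the edge has been smeared across lines, i.e. softened.
```

If the source were perfectly band-limited (antialiased), the resampled
edge would already look like this and nothing would be lost; with typical
camera output, the interpolation itself introduces the softening.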
> Definition of Gaussian Blur
>
> http://www.maths.abdn.ac.uk/~igc/tch/mx4002/notes/node99.html
I don't think the exact flavor of static blur is a very relevant issue.
(And you don't need to explain how it's defined, I know that stuff.)
Suffice it to say that any resemblance to motion blur is subjective
indeed.
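For anyone following along, the structural difference is easy to see in
the kernels themselves. A quick sketch of my own (simplified 1-D
versions, not anyone's actual converter code): a gaussian kernel is
bell-shaped and symmetric in every direction, while motion blur is
essentially a flat box smear along the motion direction. They can be
tuned to a similar apparent softness, but they are different shapes.

```python
import math

def gaussian_kernel(sigma, radius):
    """1-D gaussian kernel, normalized to sum to 1: bell-shaped,
    heaviest at the center, symmetric."""
    k = [math.exp(-(x * x) / (2 * sigma * sigma))
         for x in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def motion_kernel(length):
    """1-D box kernel: a uniform smear along the motion direction,
    which is roughly what a constant-velocity motion blur does."""
    return [1.0 / length] * length

g = gaussian_kernel(1.0, 3)  # e.g. [0.004, 0.054, 0.242, 0.399, ...]
m = motion_kernel(5)         # [0.2, 0.2, 0.2, 0.2, 0.2]
```

So a static gaussian blur can only ever resemble motion blur
subjectively; the kernels are not the same even in one dimension.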
> Then you argue that motion pictures blur more than video due to
> the long exposure time. And claim
> "the blur of one frame very commonly ends near where the blur of
> the next frame starts"
>
> IMPOSSIBLE (on a standard film camera, which typically uses a 170
> or 180 degree shutter).
Well, all I can tell you is what I see on the finished film, which is
that the blurs often appear more continuous than not when viewed frame
by frame. And that even mild motion results in huge amounts of
blurring in a typical film scene. (It doesn't look too bad because the
eye naturally tends to blur motion similarly, at least in dimmer
light.) And if that blur is missing, the result looks bizarre, as in
"28 Days Later".
If you know about film camera shutters, perhaps you can tell me: I know
that early cameras used a simple chopper wheel as a shutter, so the
"degrees" of shutter was literal... and I don't think these shutters
were in the optical centers of lenses as is the case in, like, a
Hasselblad. So with that design, the length of PARTIAL exposure of the
film exceeded the nominal degrees of the shutter, because it took time
for the edge of the blade to cross the lens aperture. Might that still
be the case with modern cameras? I wouldn't have assumed so, because
it seems crude, but what I see on film suggests that they may still be
using that kind of exposure, perhaps because it improves the subjective
quality of motion blur by having the ends fade in and out instead of
switching on and off abruptly (which is indeed how it looks to me) --
much like the way that gaussian blur looks better than defocus blur.
If this is the case with modern cameras, it would explain why I see the
blur extending further than the nominal degrees of shutter. Do you
know if that's the case?
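To put numbers on the shutter-angle point (my own back-of-envelope
sketch, in Python): a rotary shutter of N degrees exposes the film for
N/360 of the frame interval, so a 180 degree shutter at 24 fps gives a
1/48 s exposure and the blur streak nominally covers only half the path
between frames. That's why continuous-looking blur is surprising with a
170-180 degree shutter, and why the partial exposure at the blade edges
I described above could matter: it would smear the streak beyond its
nominal extent.

```python
def exposure_time(fps, shutter_degrees):
    """Exposure per frame for a rotary shutter of the given angle."""
    return (shutter_degrees / 360.0) * (1.0 / fps)

def blur_coverage(shutter_degrees):
    """Fraction of the inter-frame motion path nominally smeared onto
    each frame (ignoring partial exposure at the blade edges)."""
    return shutter_degrees / 360.0

t = exposure_time(24, 180)  # 1/48 s, about 0.0208 s
c = blur_coverage(180)      # 0.5: the streak ends halfway to the next frame
```

Only a shutter approaching 360 degrees would make one frame's blur run
into the next frame's, which no standard film camera has; some edge
fade-in/fade-out is the only mechanism I can think of that fits what I
see on film.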