Archived from groups: alt.tv.tech.hdtv
Chris Thomas <cthomas@mminternet.com> wrote in
news:MPG.1b73239cecbac78b98984c@news.mminternet.com:
> In article <7bidnTzYVJUE5pTcRVn-uw@comcast.com>,
> david_please_dont_email_me@i_hate_spam.com says...
>> Tim923 wrote:
>> > Was 30fps/60hz chosen for television since it could convert film
>> > (24fps) well to TV using (3/60+2/60)/2 = 1/24, or was that just a
>> > clever formula discovered after the fact.
>>
>> Our electricity in the US is 60Hz AC.
>
> Back when TV was being developed, there was a lot of concern that
> 60Hz hum pickup from many sources (radiation, leaky capacitors,
> poorly shielded transformers, et al.) would a) interfere with the
> ability of TV sets to maintain vertical sync, and b) cause horizontal
> bars that rolled up or down the screen, if the vertical sync were not
> exactly the same as the power line (60Hz in the US, 50Hz in Europe).
>
> That concern evaporated, and with the introduction of color (NTSC) in
> the 50's, the vertical sync frequency was changed to approx. 59.94
> Hz. (This weird choice comes from dividing down the color subcarrier
> frequency, again to minimize a different type of interference.)
>
> The 3:2 pulldown is a result of the 60Hz choice, not the other way
> round.
Actually, I had a TV on the bench for two weeks back in 1967 that I could
NOT figure out. Turned out it was a bad filter cap in the power supply
that was JUST bad enough to cause the TV to sync to the power line
instead of the actual signals. Found it the minute I got frustrated
enough to stick a scope probe into it!
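For anyone who wants to check the numbers from the quoted post, here is a quick sketch of the arithmetic. It assumes the standard NTSC color subcarrier of exactly 315/88 MHz, 227.5 subcarrier cycles per line, and 262.5 lines per field; the pulldown line just restates the 3/60 + 2/60 pattern from the original question.

```python
# Sanity-check the NTSC numbers discussed above.

# NTSC color subcarrier is defined as exactly 315/88 MHz (~3.579545 MHz).
subcarrier = 315e6 / 88

# Line rate: 227.5 subcarrier cycles per scan line -> ~15,734.27 Hz.
line_rate = subcarrier * 2 / 455

# Field rate: 262.5 lines per field -> the "weird" ~59.94 Hz.
field_rate = line_rate / 262.5
print(round(field_rate, 4))  # 59.9401

# 3:2 pulldown: film frames alternately occupy 3 fields and 2 fields,
# so 24 film frames fill 24 * (3 + 2) / 2 = 60 video fields per second.
fields_per_second = 24 * (3 + 2) / 2
print(fields_per_second)  # 60.0
```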
--
Dave Oldridge+
ICQ 1800667
A false witness is worse than no witness at all.