Flicker of computer monitors seen on TV

Nov 4, 2009
Why is it that when computer monitors are shown on TV, like on a news channel, you can see the monitors flicker, but when we're in front of an actual computer we don't perceive any flicker? How does the TV change this perception? I understand that the flicker rate of monitors is higher than the critical flicker fusion frequency of about 60 Hz in humans, but how does the TV's own flicker rate change that so that we do detect the computer's flickering?

Optometry student
 

frozenlead

Distinguished
It's not really about the human eye - it's about camera technology.
Like you said, most older CRTs have a default refresh rate of 60 Hz, or 60 refreshes per second.
A camera, on the other hand, works by capturing a rapid sequence of still frames.

Most cameras, though, do not record 60 images per second - they're usually around 30 or 24. Since the screen's refresh rate is higher than the camera's frame rate, what you see on television is essentially a slowed-down, aliased capture of the screen refreshing itself, which shows up as the rolling dark bands.
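If you want to see the mismatch in numbers, here's a rough sketch (Python, using a simple cosine brightness model - just an assumption for illustration, not real phosphor physics). It samples a 60 Hz flicker once per camera frame and shows the slow beat the camera ends up recording:

```python
import math

REFRESH_HZ = 60.0   # CRT refresh rate
CAMERA_FPS = 24.0   # camera frame rate; try 30.0 as well

def screen_brightness(t):
    """Screen brightness at time t, modeled as a raised cosine at 60 Hz."""
    return 0.5 * (1.0 + math.cos(2.0 * math.pi * REFRESH_HZ * t))

# Sample the screen once per camera frame for two seconds.
frames = int(2 * CAMERA_FPS)
samples = [screen_brightness(n / CAMERA_FPS) for n in range(frames)]

# The sampled brightness varies slowly: the camera records a "beat" at the
# distance from 60 Hz to the nearest multiple of the frame rate
# (|60 - 2*24| = 12 Hz here; |60 - 2*30| = 0 Hz for a 30 fps camera).
beat_hz = abs(REFRESH_HZ - round(REFRESH_HZ / CAMERA_FPS) * CAMERA_FPS)
print(f"Apparent flicker recorded at {CAMERA_FPS:.0f} fps: ~{beat_hz:.0f} Hz")
for n, s in enumerate(samples[:8]):
    print(f"frame {n}: brightness {s:.2f}")
```

(A real NTSC camera at ~29.97 fps lands close to, but not exactly on, a multiple of 60 Hz, which is why the bands usually crawl slowly across the screen instead of standing still.)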

If the camera's shutter speed matched the monitor's refresh rate - a 1/60 s exposure, capturing exactly one full refresh per frame - the bands would disappear.
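Here's the same toy model (again, just an assumed cosine brightness curve) used to check that claim: integrate the light hitting the sensor over one exposure, starting at different moments. A 1/60 s exposure collects the same total light no matter when the shutter opens, while a shorter exposure collects a different amount each frame, which is what shows up as flicker and banding:

```python
import math

REFRESH_HZ = 60.0

def light_collected(start, exposure, steps=1000):
    """Total light gathered during the interval [start, start + exposure]."""
    dt = exposure / steps
    return sum(0.5 * (1.0 + math.cos(2.0 * math.pi * REFRESH_HZ * (start + k * dt))) * dt
               for k in range(steps))

for exposure in (1.0 / 60.0, 1.0 / 250.0):
    # Open the shutter at several different points within a refresh cycle.
    readings = [light_collected(start_ms / 1000.0, exposure) for start_ms in range(0, 17, 4)]
    spread = max(readings) - min(readings)
    print(f"exposure 1/{round(1 / exposure)} s -> frame-to-frame brightness spread: {spread:.4f}")
```

The 1/60 s exposure spans exactly one full refresh, so every frame comes out the same; the 1/250 s exposure catches a different slice of the refresh each time, so the frames vary.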

Now, this is only visible with CRT monitors - LCD pixels are a lot slower when changing colors and hold their state between refreshes, which hides the effect from the camera.

Edit: We don't perceive the flicker sitting in front of a monitor because the 60 Hz refresh sits right around the eye's flicker fusion threshold, so successive refreshes blend into a steady image.
 
