> >> I wonder why films can get away with 48 Hz, but CRTs are annoying at 
> >> 70 Hz?
> > 
> > I think it has something to do with motion blur.  A CCD needs very
> > little exposure time per frame, while film, I think, is exposed for
> > a longer period, so you get more motion blur with film than with a
> > CCD.  There may also be something to the way the TV scans the image
> > line by line, while the film projector flashes the whole frame at
> > once.  Note that I could be totally full of crap here, so you may
> > want to look elsewhere.  There are people who know the answer to
> > this question.
> > 
> A pro movie camera has a shutter speed adjustment.  It is not totally 
> independent of the frame rate: the rotary shutter's opening angle sets 
> the exposure as a fraction of the frame period, and the standard 
> 180-degree shutter gives 1/48 s at 24 fps.  Narrow the angle and you 
> might shoot action shots at ~1/200 of a second, which reduces the 
> blur considerably.  This does have problems -- we have all seen the 
> wagon wheels turning backwards in westerns. :-)
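The shutter arithmetic above can be sketched in a few lines. This is a minimal illustration, assuming a rotary shutter where exposure time is (shutter angle / 360) of the frame period; the function name and angle values are my own, not from the thread:

```python
def exposure_time(fps: float, shutter_angle_deg: float) -> float:
    """Exposure per frame in seconds for a rotary shutter.

    Assumption: exposure = (shutter_angle / 360) * (1 / fps).
    """
    return (shutter_angle_deg / 360.0) / fps

# Standard 180-degree shutter at 24 fps: 1/48 s (noticeable motion blur).
print(1 / exposure_time(24, 180))    # → 48.0

# A much narrower ~43-degree shutter at 24 fps gives roughly the
# ~1/200 s action-shot exposure mentioned above.
print(1 / exposure_time(24, 43.2))   # → 200.0
```

So the shutter speed is tied to the frame rate, but narrowing the shutter angle trades motion blur for a crisper (and more strobe-prone) image.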
> 
> So, I think that it is the nature of a scanned CRT that is the issue.  I 
> don't consciously see flicker at 60 Hz, but it bothers me.  70 Hz is OK 
> for me, but I understand that others still see flicker there.  This is 
> odd, since the CRT phosphor has some persistence and a movie screen 
> has none.

I have a 70 Hz CRT (progressive, I assume).  I don't consciously see
flicker, but I get eyestrain and headaches.  30 Hz interlaced TV is fine,
and films are fine.
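One hedged guess at why 30 Hz interlaced TV can look fine next to a 70 Hz CRT: an interlaced display draws the odd and even scan lines in separate passes, so the screen is refreshed twice per frame. A tiny sketch of that field-rate arithmetic (the function name is mine, and it assumes the usual two fields per frame):

```python
def refreshes_per_second(frame_rate: float, interlaced: bool) -> float:
    """Refresh events per second the eye sees.

    Assumption: an interlaced display shows two fields per frame.
    """
    return frame_rate * 2 if interlaced else frame_rate

# 30 fps interlaced TV still lights the screen 60 times a second...
print(refreshes_per_second(30, interlaced=True))    # → 60

# ...while a 70 Hz progressive CRT refreshes 70 times a second.
print(refreshes_per_second(70, interlaced=False))   # → 70
```

So the interlaced set's effective flicker rate is in the same ballpark as the CRT's, which at least makes the comparison less surprising, even if it doesn't explain the eyestrain.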

I've been looking forward to getting a 1920x1200 LCD and zero flicker,
once they fix the dead pixel problem, and ghosting, and the prices
come down some more.  But if they fix ghosting by adding flicker...
_______________________________________________
Open-graphics mailing list
[email protected]
http://lists.duskglow.com/mailman/listinfo/open-graphics
List service provided by Duskglow Consulting, LLC (www.duskglow.com)
