Robert (Jamie) Munro wrote:
> There was a cinema standard called Showscan that ran at 60fps instead of 24fps for similar reasons. And IMAX do a thing called IMAX HD that runs at 48fps. These systems both require a lot of lighting and a lot of film stock to shoot, so I don't think they are likely to be popular, except in special cases like theme-park ride films.
Well, that's film for you. :-) Virtually all digital video recording systems use lossy compression though, so if higher frame rates compress better (as we suspect), your costs don't increase linearly with frame rate.
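A back-of-envelope sketch of why that is: the closer together the frames, the smaller the motion between them, so the smaller the inter-frame residual a codec has to spend bits on. This toy model uses a 1-D random-walk "scene" panning at an assumed speed and plain frame differencing (no motion compensation, which would shrink the residual further); the numbers are illustrative, not from any real codec.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D "scene": a smooth texture (random walk) panning at a
# fixed speed. Real scenes are 2-D, but the scaling argument is the same.
scene = np.cumsum(rng.standard_normal(10_000))
pan_speed = 240  # pixels of motion per second (assumed for illustration)

def mean_frame_residual(fps, window=5_000):
    """Mean absolute difference between consecutive frames: a crude
    stand-in for the inter-frame residual a codec has to encode."""
    shift = round(pan_speed / fps)  # pixels moved between frames
    return np.abs(scene[:window] - scene[shift:shift + window]).mean()

for fps in (24, 48, 60, 120):
    per_frame = mean_frame_residual(fps)
    per_second = per_frame * fps  # total residual to code each second
    print(f"{fps:3d} fps: {per_frame:6.2f} per frame, {per_second:7.1f} per second")
```

Doubling the frame rate roughly halves the motion per frame, so the per-frame residual falls and the per-second total grows much more slowly than the frame rate does.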
> I wonder if highly shuttered video produces better results on TVs that do motion-compensated 100Hz stuff. E.g. if you delivered them 25p but with the shutter open for 10ms rather than 40ms, they would be able to make a much better job of the motion compensation, producing something very close to true 100Hz video, but with no need for extra bandwidth or changes to the transmission chain over what we have already. Should broadcasters consider shooting with this kind of TV in mind?
That's when you'd need the extra light, because you're throwing away most of it by shuttering. But yes, the fact that motion-interpolating displays prefer highly shuttered video is something that broadcasters could start bearing in mind if those displays gain a large market share. Personally I'd be disappointed though - you're effectively using the low frame rate as a form of lossy compression, and there are far more elegant methods of compressing video than throwing frames away and then trying to regenerate them in the display.
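The trade-off can be put in rough numbers. Assuming, purely for illustration, an object panning at 500 pixels/second shot at 25p:

```python
# Hypothetical figures: an object panning at 500 pixels/second,
# captured at 25 frames per second.
pan_speed = 500  # px/s (assumed for illustration)
frame_rate = 25  # frames per second

for shutter_ms in (40, 10):
    blur_px = pan_speed * shutter_ms / 1000      # smear within one exposure
    duty_cycle = shutter_ms * frame_rate / 1000  # fraction of the light kept
    print(f"{shutter_ms} ms shutter: {blur_px:.0f} px of motion blur, "
          f"{duty_cycle:.0%} of the light reaches the sensor")
```

Cutting the shutter from 40ms to 10ms reduces the smear from 20px to 5px, giving the interpolator much crisper frames to match, but it also throws away three-quarters of the light (a two-stop penalty), which is where the extra lighting requirement comes from.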
Also, have you considered how annoyed the directors who want a highly-shuttered look will be? :-) (Think "Top Gear", etc.)
> Another thought I had was: what about capturing motion separately from the picture, at a lower spatial but higher temporal resolution? Perhaps using strobed infra-red illumination to generate something like MPEG P and B frames, and a full-colour camera to generate I frames at a low frame rate.
Not sure that strobed IR would be the way to do it (limited range outdoors or under tungsten studio lighting, and limited correlation with the visible-light images, because materials' IR reflectivities differ) but yes, there are all sorts of interesting ways to sample (or resample) the various aspects of the video signal differently once you get your thinking away from fixed frame rates. Chroma sub-sampling is a good analogy, and Bayer-patterning could be regarded as an interesting way to sub-sample chroma within the camera's sensor. I think you always need to bear in mind that you're effectively implementing a compression scheme, though, and question whether or not it's going to be an effective one.
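To make the chroma sub-sampling analogy concrete, here's a minimal sketch of 4:2:0 sub-sampling: keep every luma sample but average each chroma plane over 2x2 blocks, halving the total sample count before any entropy coding happens. The 8x8 frame and the 2x2 box average (standing in for a proper decimation filter) are toy choices for illustration.

```python
import numpy as np

# A hypothetical 8x8 frame in Y'CbCr, full resolution for every channel.
h, w = 8, 8
rng = np.random.default_rng(1)
y  = rng.random((h, w))
cb = rng.random((h, w))
cr = rng.random((h, w))

def subsample_420(c):
    """4:2:0 chroma sub-sampling: average each 2x2 block into one sample."""
    return c.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

cb420, cr420 = subsample_420(cb), subsample_420(cr)

full = y.size + cb.size + cr.size
sub  = y.size + cb420.size + cr420.size
print(f"samples: {full} -> {sub} ({sub / full:.0%} of the original)")
```

The eye's lower acuity for colour makes this a very effective compression scheme; the open question in the thread is whether temporal sub-sampling schemes can be made similarly well matched to how we perceive motion.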
I blog about this kind of video fundamentals stuff occasionally, if anyone's (still) interested - http://elvum.net.
S

--
Sent via the backstage.bbc.co.uk discussion group. To unsubscribe, please visit http://backstage.bbc.co.uk/archives/2005/01/mailing_list.html. Unofficial list archive: http://www.mail-archive.com/[email protected]/

