I'm not sure if this is better asked here or on cocoa-dev, but here
goes.
I have subclassed QCView so that I can do some extra drawing on top of
the composition and also control the current render time. My
subclass's render method looks like this:
- (BOOL)renderAtTime:(NSTimeInterval)time arguments:(NSDictionary *)arguments
{
    // Wrap the raw time to one cycle of the effect...
    time = fmod(time, _effectDurationInSeconds);
    // ...then normalize it to a completion fraction in 0.0 ... 1.0.
    time /= _effectDurationInSeconds;

    BOOL result = [super renderAtTime:time arguments:arguments];
    if (result)
    {
        // do further drawing here...
    }
    return result;
}
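For context, the view is set up in the usual way; loadCompositionFromFile:
and startRendering are stock QCView API, while setEffectDuration: is just a
hypothetical accessor for the _effectDurationInSeconds ivar (class and
variable names here are made up):

MyQCView *qcView = [[MyQCView alloc] initWithFrame:someFrame];
[qcView loadCompositionFromFile:compositionPath];
[qcView setEffectDuration:2.0];  // hypothetical accessor for _effectDurationInSeconds
[qcView startRendering];         // QCView then calls -renderAtTime:arguments: each frame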
I'm working with compositions that expect patch time values in the
range of 0.0 to 1.0 (think fraction done). The actual time it takes
for the composition to run one cycle might be 1 second or 1 minute,
which is something the user controls.
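For example, with _effectDurationInSeconds = 4.0, a raw render time of 5.5
seconds becomes fmod(5.5, 4.0) / 4.0 = 1.5 / 4.0 = 0.375, i.e. 37.5% of the
way through the effect's second cycle.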
As a test I modified a composition to print the patch time. On a
10.5.8 system the patch time displayed is always in the range of 0.0
to 1.0; on a 10.6.3 system the patch time keeps incrementing past 1.0.
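(A host-side way to cross-check the same value, assuming the composition
also publishes its patch time through an output port; the port name
"patchTime" below is made up:

// in the override, just after the call to super:
NSNumber *patchTime = [self valueForOutputKey:@"patchTime"];  // stock QCView API
NSLog(@"passed %f to super, composition saw %@", time, patchTime);
)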
If I set a breakpoint in the code above, the time value being passed
to [super renderAtTime:arguments:] is indeed pinned to the 0.0 to 1.0
range, even though the rendered frames show larger values. So I'm now
unsure why the composition is apparently seeing something different.
Any ideas about where to look?
steve