First off, apologies if my terminology is all wrong; I'm only a couple of days 
into exploring Quartz Composer.

I've created a simple composition which has a published image output. In my 
Cocoa project I've managed to display the image properly using a 
QCCompositionLayer, but ideally I'd like to be able to grab the image on 
demand without rendering it to screen. I therefore started looking into 
QCRenderer but am now struggling to get things working.
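
For reference, the on-screen version is essentially just this (paraphrased from 
memory; myView is simply the custom view in my window):

  // Display the composition on screen via a layer-hosting view
  QCCompositionLayer* layer = [QCCompositionLayer compositionLayerWithFile:
      [[NSBundle mainBundle] pathForResource:@"ScreenImages" ofType:@"qtz"]];
  [myView setLayer:layer];
  [myView setWantsLayer:YES];

That part works fine; it's grabbing the image off-screen that I'm stuck on.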

My initial attempt was as follows:

NSString* compName = @"ScreenImages";
NSString* compositionPath = [[NSBundle mainBundle] pathForResource:compName ofType:@"qtz"];
@try
{
  CGColorSpaceRef colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);
  QCRenderer *renderer = [[QCRenderer alloc] initWithComposition:composition colorSpace:colorSpace];

  [renderer renderAtTime:0.0 arguments:nil];

  NSLog(@"%@", [renderer outputKeys]);
  NSImage *image = [renderer valueForOutputKey:@"Screenshot"];
  if (image) {
    BOOL success = [[image TIFFRepresentationUsingCompression:NSTIFFCompressionLZW factor:1.0] writeToFile:@"/Users/Simon/Desktop/ImageTest.tiff" atomically:YES];
    if (success == NO) {
      NSLog(@"Failed");
    }
  }
  [renderer release];
  CGColorSpaceRelease(colorSpace);
}
@catch(id exception)
{
  NSLog(@"Error running Quartz Composition '%s': %@", compName, exception);
}

This keels over when I try to create the renderer with:
Error running Quartz Composition '‡CÅpˇ': -[QCRenderer initWithComposition:colorSpace:]: Inconsistent state
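
Looking at my code again, I notice I never actually load anything from 
compositionPath before creating the renderer, so I'm guessing 
initWithComposition:colorSpace: wants a proper QCComposition object, perhaps 
something along these lines first (just a guess on my part, I haven't verified 
it):

  // Guess: build a QCComposition from the file and hand that to the renderer
  QCComposition* composition = [QCComposition compositionWithFile:compositionPath];
  QCRenderer* renderer = [[QCRenderer alloc] initWithComposition:composition colorSpace:colorSpace];

Is that the right approach, or am I off track?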


My second attempt is based on the code at http://developer.casgrain.com/?p=4. 
This never managed to retrieve an NSImage from the composition:

  CGImageRef resultImage = NULL;

  NSString* compName = @"ScreenImages";
  NSString* compositionPath = [[NSBundle mainBundle] pathForResource:compName ofType:@"qtz"];

  NSOpenGLPixelFormatAttribute attributes[] = {NSOpenGLPFAAccelerated, NSOpenGLPFANoRecovery, (NSOpenGLPixelFormatAttribute)0};
  NSOpenGLPixelFormat* format = [[NSOpenGLPixelFormat alloc] initWithAttributes:attributes];
  NSOpenGLContext* context = [[NSOpenGLContext alloc] initWithFormat:format shareContext:nil];

  @try
  {
    QCRenderer* renderer = [[QCRenderer alloc] initWithOpenGLContext:context pixelFormat:format file:compositionPath];
    [renderer renderAtTime:0.0 arguments:nil];
    NSImage* image = [renderer valueForOutputKey:@"Screenshot"];
    if (image) {
      BOOL success = [[image TIFFRepresentationUsingCompression:NSTIFFCompressionLZW factor:1.0] writeToFile:@"/Users/Simon/Desktop/ImageTest.tiff" atomically:YES];
      if (success == NO) {
        NSLog(@"Failed");
      }
    }
    [renderer release];
  }
  @catch(id exception)
  {
    NSLog(@"Error running Quartz Composition '%s': %@", compName, exception);
  }

  // Done, clean up
  [context release];
  [format release];
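
One thing I'm wondering about: should I be using valueForOutputKey:ofType: 
rather than plain valueForOutputKey: so that I can ask for a specific class 
explicitly? Something like this, perhaps (untested, and it assumes my output 
port really is published as "Screenshot"):

  // Guess: explicitly request an NSImage for the published output port
  NSImage* image = [renderer valueForOutputKey:@"Screenshot" ofType:@"NSImage"];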

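I also wondered whether the problem is simply that the context has no drawable 
attached for the renderer to draw into, so perhaps I need to give it an 
off-screen pixel buffer before creating the QCRenderer, along these lines 
(again just a guess, not verified; the 640x480 size is arbitrary and I realise 
I'd probably also need NSOpenGLPFAPixelBuffer in the pixel format attributes):

  // Guess: attach an off-screen pixel buffer so the context has something to render into
  NSOpenGLPixelBuffer* pixelBuffer = [[NSOpenGLPixelBuffer alloc]
      initWithTextureTarget:GL_TEXTURE_RECTANGLE_EXT
      textureInternalFormat:GL_RGBA
      textureMaxMipMapLevel:0
      pixelsWide:640
      pixelsHigh:480];
  [context setPixelBuffer:pixelBuffer
              cubeMapFace:0
              mipMapLevel:0
     currentVirtualScreen:[context currentVirtualScreen]];
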

So, does anyone have any idea what I'm doing wrong, or can anyone give me some 
advice about retrieving an image from a composition using a QCRenderer?

Here's hoping and thanks for reading.

Simon Wolf