Hey everyone,

I'm writing out 16-bit half, Zip (scanline) compressed deep EXRs (2.0) from Nuke.
When I try loading them back with a DeepRead, I get one of the following error
messages if the file is larger than 2048 MB:

DeepRead1: Invalid argument.
DeepRead1: Error reading sample count data from file "...". Unexpected end of file.
DeepRead1: Domain Error.
DeepRead1: No such file or directory.

Has anyone had any similar problems? 
My setup is the following:

set cut_paste_input [stack 0]
version 7.0 v4
push $cut_paste_input
Noise {
 zoffset {{curve x1 0 x1000 5}}
 center {960 540}
 name Noise1
 selected true
 xpos -348
 ypos -117
}
DeepFromFrames {
 samples 221
 range_last 1000
 alpha_mode additive
 zmax 100
 name DeepFromFrames1
 selected true
 xpos -348
 ypos -85
}
DeepWrite {
 file /noiseFog_v001_patrickh.%04d.exr
 file_type exr
 name DeepWrite1
 selected true
 xpos -348
 ypos -18
}
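For a rough sense of why this setup lands near the limit, here's a back-of-envelope size estimate. The 1920x1080 resolution (guessed from the Noise center at 960,540) and 8 bytes per sample (half RGBA) are my assumptions; the real file also carries deep front/back depths and per-pixel sample counts, and Zip compression shrinks the total again:

```python
# Back-of-envelope deep data size for the setup above.
# Resolution and bytes-per-sample are assumptions, not read from the script.
width, height = 1920, 1080
samples_per_pixel = 221          # from DeepFromFrames1
bytes_per_sample = 4 * 2         # RGBA at 16-bit half

raw_bytes = width * height * samples_per_pixel * bytes_per_sample
print(f"raw deep payload: {raw_bytes / 2**20:.0f} MB")  # well past 2048 MB
```

Even a healthy compression ratio leaves the written file hovering around the 2 GiB mark, which would explain why 221 samples fails while 220 squeaks under.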

Changing samples in the DeepFromFrames node to 220 brings the file size below
2048 MB, and those files load just fine.
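Worth noting: 2048 MB is exactly 2^31 bytes, so the threshold lines up with a signed 32-bit file-offset overflow somewhere in the read path. That interpretation is my guess, not a confirmed bug, but a quick sanity check like this (helper names are mine) can flag frames that cross the boundary before a DeepRead chokes on them:

```python
import os

# 2048 MB == 2**31 bytes: the failure threshold matches what a signed
# 32-bit file offset can address. This flags rendered frames at or past
# that boundary so they can be re-rendered with fewer samples.
TWO_GIB = 2 ** 31  # 2048 MB

def exceeds_32bit_offset(path):
    """Return True if the file is at or past the 2 GiB boundary."""
    return os.path.getsize(path) >= TWO_GIB

def flag_oversized(paths):
    """Return the frames likely to fail on read."""
    return [p for p in paths if exceeds_32bit_offset(p)]
```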

cheers
Patrick


_______________________________________________
Nuke-users mailing list
[email protected], http://forums.thefoundry.co.uk/
http://support.thefoundry.co.uk/cgi-bin/mailman/listinfo/nuke-users
