Hey,

yup, I'm on Windows, but it's Windows-7-6.1.7601-SP1, so 64-bit. And as it 
seems, the issue you mention only exists on 32-bit Windows.
Also, I can load standard "flat" EXRs of over 4 GB just fine, so it seems to be 
an issue specific to deep EXRs, or to EXR 2.0. Are you using EXR 2.0 for the 
standard Read and Write nodes as well, or is it still normal EXR?
Maybe you can try my setup under Linux; I forgot to mention I used 
1920x1080 as the root format.
Thanks for your help
cheers
Patrick

----- Original Message -----
From: [email protected]
To: [email protected]
Date: 21.02.2013 11:52:50
Subject: Re: [Nuke-users] error loading exr 2.0 bigger than 2GB


> Is this on Windows?  Looks as though there are some file size issues related 
> to VS STL in the Windows EXR libraries:
> 
> On 21/02/13 09:52, Ben Woodhall wrote:
>> Have you seen any issues with files over 2GB? There's an issue raised on
>> nuke-users. I doubt that it's down to FAT32. ;-)
> 
> Might be an issue with EXR and the MS STL version...:
> 
> http://osdir.com/ml/video.openexr.user/2008-04/msg00004.html
> 
> Peter
> -- 
> Peter Pearson, Software Engineer
> The Foundry, 6th Floor, The Communications Building,
> 48 Leicester Square, London, UK, WC2H 7LT
> Tel: +44 (0)20 7434 0449   Web: www.thefoundry.co.uk
> 
> The Foundry Visionmongers Ltd.
> Registered in England and Wales No: 4642027
> 
> On 20 Feb 2013, at 12:56, Patrick Heinen wrote:
> 
>> Hey everyone,
>> 
>> I'm writing out 16-bit half ZIP (scanline) deep EXRs (2.0) from Nuke. When I 
>> try loading them back with the DeepRead, I get one of the following error 
>> messages if the file is larger than 2048 MB:
>> 
>> DeepRead1: Invalid argument.
>> DeepRead1: Error reading sample count data from file "...". Unexpected end of 
>> file.
>> DeepRead1: Domain Error.
>> DeepRead1: No such file or directory.
>> 
>> Has anyone had any similar problems? 
>> My setup is the following:
>> 
>> set cut_paste_input [stack 0]
>> version 7.0 v4
>> push $cut_paste_input
>> Noise {
>>  zoffset {{curve x1 0 x1000 5}}
>>  center {960 540}
>>  name Noise1
>>  selected true
>>  xpos -348
>>  ypos -117
>> }
>> DeepFromFrames {
>>  samples 221
>>  range_last 1000
>>  alpha_mode additive
>>  zmax 100
>>  name DeepFromFrames1
>>  selected true
>>  xpos -348
>>  ypos -85
>> }
>> DeepWrite {
>>  file /noiseFog_v001_patrickh.%04d.exr
>>  file_type exr
>>  name DeepWrite1
>>  selected true
>>  xpos -348
>>  ypos -18
>> }
>> 
>> Changing the samples in the DeepFromFrames to 220 brings the file size 
>> below 2048 MB, and those files load just fine.
>> 
>> cheers
>> Patrick
>> 
>> 
>> _______________________________________________
>> Nuke-users mailing list
>> [email protected], http://forums.thefoundry.co.uk/
>> http://support.thefoundry.co.uk/cgi-bin/mailman/listinfo/nuke-users
> 
> -- 
> Ben Woodhall
> Software Engineer
> The Foundry, 6th Floor, The Communications Building,
> 48 Leicester Square, London, UK, WC2H 7LT
> Tel: +44(0)20 7968 6828 - Fax: +44(0)20 7930 8906
> Web: www.thefoundry.co.uk
> Email: [email protected]
> 
> The Foundry Visionmongers Ltd.
> Registered in England and Wales No: 4642027
> 
> 
> 