[
https://issues.apache.org/jira/browse/DAFFODIL-2401?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Josh Adams resolved DAFFODIL-2401.
----------------------------------
Resolution: Fixed
This has been "fixed" in commit 1038ecb1266632855fb1d1b513d4c8f0437ece7d, which
now provides an error recommending the use of blobs instead of hexBinary.
The real fix for this issue with large images in NITF is to update the NITF
schema to use blobs.
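As a rough illustration of that schema change (element and expression names here are hypothetical, not taken from the actual NITF schema), an image field declared as hexBinary would instead be declared with Daffodil's BLOB extension, i.e. type xs:anyURI with dfdlx:objectKind="bytes", so the data is streamed out rather than held in memory:

```xml
<!-- Sketch only: "ImageData" and "../imageLength" are placeholder names.
     The BLOB extension (dfdlx:objectKind="bytes") writes the bytes to a
     file and puts a URI in the infoset instead of a giant hex string. -->
<xs:element name="ImageData" type="xs:anyURI"
            dfdlx:objectKind="bytes"
            dfdl:lengthKind="explicit"
            dfdl:length="{ ../imageLength }"
            dfdl:lengthUnits="bytes"/>
```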
> 4GB NITF file crashes Daffodil. 8GB and 10GB files do not.
> ----------------------------------------------------------
>
> Key: DAFFODIL-2401
> URL: https://issues.apache.org/jira/browse/DAFFODIL-2401
> Project: Daffodil
> Issue Type: Bug
> Components: Back End
> Affects Versions: 2.7.0
> Environment: CentOS 7.8.
> Reporter: Brent Nordin
> Assignee: Josh Adams
> Priority: Critical
> Fix For: 3.0.0
>
> Attachments: jpeg.dfdl.xsd, nitf.dfdl.xsd, stderr, test10G.ntf.zip.Z,
> test4G.ntf.zip.Z, test8G.ntf.zip.Z
>
>
> NITF parse crashes Daffodil using this command line:
> apache-daffodil-2.7.0-incubating-bin/bin/daffodil parse -s nitf.dfdl.xsd
> test4G.ntf > 4G.ntf.xml
> The test file is 4GB. Testing with 8GB and 10GB NITF images is successful. I
> have attached the test images as zipped and then compressed files. Beware
> when you expand them - you will have 3 files totaling 22GB.
> The attached nitf.dfdl.xsd file includes a work-around from S. Lawrence (for
> the memory caching issue described elsewhere) and another fix that has
> already been rolled into the NITF Schema repo.
--
This message was sent by Atlassian Jira
(v8.3.4#803005)