[jira] [Updated] (ATLAS-2926) ZipSink: Very Large Entities Cause Out Of Memory Exception

2019-09-25 Thread Carol Drummond (Jira)


 [ https://issues.apache.org/jira/browse/ATLAS-2926?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Carol Drummond updated ATLAS-2926:
--
Labels:   (was: release-notes)

> ZipSink: Very Large Entities Cause Out Of Memory Exception
> --
>
> Key: ATLAS-2926
> URL: https://issues.apache.org/jira/browse/ATLAS-2926
> Project: Atlas
>  Issue Type: Bug
>  Components:  atlas-core
>Affects Versions: 0.8.2, 0.8.3, trunk
>Reporter: Ashutosh Mestry
>Assignee: Ashutosh Mestry
>Priority: Major
> Fix For: 0.8.3, 1.2.0, 2.0.0
>
> Attachments: ATLAS-2926-ZipSink-OOM.patch
>
>
> *Steps to Duplicate*
>  # Set up Atlas with a very large dataset; at least one entity should be about 300 MB in size.
>  # Perform an export with parameters that include the large entity.
> The following error will be encountered:
> {code:java}
> java.lang.OutOfMemoryError: Requested array size exceeds VM limit
>     at java.lang.StringCoding$StringEncoder.encode(StringCoding.java:300)
>     at java.lang.StringCoding.encode(StringCoding.java:344)
>     at java.lang.StringCoding.encode(StringCoding.java:387)
>     at java.lang.String.getBytes(String.java:958)
>     at org.apache.atlas.repository.impexp.ZipSink.addToZipStream(ZipSink.java:106)
>     at org.apache.atlas.repository.impexp.ZipSink.saveToZip(ZipSink.java:95)
>     at org.apache.atlas.repository.impexp.ZipSink.add(ZipSink.java:55)
>     at org.apache.atlas.repository.impexp.ExportService.addEntity(ExportService.java:467)
> {code}
> *Additional Information*
> While converting the entity string to bytes, a known JDK limitation is hit: String.getBytes() must allocate a single byte[] for the entire encoded string, and for very large entities the requested array size exceeds the VM limit.
>  
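The "Requested array size exceeds VM limit" message indicates that the byte[] requested by the encoder exceeded the JVM's maximum array length (close to Integer.MAX_VALUE), rather than ordinary heap exhaustion. Below is a rough, illustrative calculation of how a large entity string can hit that cap; it assumes JDK 8's behavior of pre-sizing the output array at length * maxBytesPerChar (3 for UTF-8), and the entity size used is hypothetical, not taken from this report.

{code:java}
// Illustrative only: worst-case sizing of the byte[] that String.getBytes("UTF-8")
// may request on JDK 8, compared against the JVM's maximum array length.
public class GetBytesAllocationSketch {
    public static void main(String[] args) {
        long maxArrayLength   = Integer.MAX_VALUE - 8;    // typical VM cap on array length
        long entityChars      = 800L * 1024 * 1024;       // hypothetical very large entity string (chars)
        long worstCaseRequest = entityChars * 3;          // UTF-8 encoder's maxBytesPerChar is 3

        System.out.printf("worst-case byte[] request: %,d (VM cap ~%,d)%n",
                worstCaseRequest, maxArrayLength);
        System.out.println(worstCaseRequest > maxArrayLength
                ? "=> OutOfMemoryError: Requested array size exceeds VM limit"
                : "=> request fits within the VM cap");
    }
}
{code}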





[jira] [Updated] (ATLAS-2926) ZipSink: Very Large Entities Cause Out Of Memory Exception

2019-09-25 Thread Carol Drummond (Jira)


 [ https://issues.apache.org/jira/browse/ATLAS-2926?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Carol Drummond updated ATLAS-2926:
--
Labels: release-notes  (was: )

> ZipSink: Very Large Entities Cause Out Of Memory Exception
> --
>
> Key: ATLAS-2926
> URL: https://issues.apache.org/jira/browse/ATLAS-2926
> Project: Atlas
>  Issue Type: Bug
>  Components:  atlas-core
>Affects Versions: 0.8.2, 0.8.3, trunk
>Reporter: Ashutosh Mestry
>Assignee: Ashutosh Mestry
>Priority: Major
>  Labels: release-notes
> Fix For: 0.8.3, 1.2.0, 2.0.0
>
> Attachments: ATLAS-2926-ZipSink-OOM.patch
>
>
> *Steps to Duplicate*
>  # Set up Atlas with a very large dataset; at least one entity should be about 300 MB in size.
>  # Perform an export with parameters that include the large entity.
> The following error will be encountered:
> {code:java}
> java.lang.OutOfMemoryError: Requested array size exceeds VM limit
>     at java.lang.StringCoding$StringEncoder.encode(StringCoding.java:300)
>     at java.lang.StringCoding.encode(StringCoding.java:344)
>     at java.lang.StringCoding.encode(StringCoding.java:387)
>     at java.lang.String.getBytes(String.java:958)
>     at org.apache.atlas.repository.impexp.ZipSink.addToZipStream(ZipSink.java:106)
>     at org.apache.atlas.repository.impexp.ZipSink.saveToZip(ZipSink.java:95)
>     at org.apache.atlas.repository.impexp.ZipSink.add(ZipSink.java:55)
>     at org.apache.atlas.repository.impexp.ExportService.addEntity(ExportService.java:467)
> {code}
> *Additional Information*
> While converting the entity string to bytes, a known JDK limitation is hit: String.getBytes() must allocate a single byte[] for the entire encoded string, and for very large entities the requested array size exceeds the VM limit.
>  
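The attached patch presumably addresses this inside ZipSink itself; purely as a sketch of one way to avoid materializing the whole byte[], the snippet below streams a large string into a zip entry through an incremental encoder. The class and method names are illustrative and are not Atlas APIs; only java.util.zip and java.io are used.

{code:java}
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.nio.charset.StandardCharsets;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

// Illustrative only: writes a very large JSON string to a zip entry in chunks,
// so no single byte[] for the whole string is ever requested.
public class ChunkedZipWriteSketch {
    private static final int CHUNK_CHARS = 64 * 1024;

    public static void writeEntry(ZipOutputStream zip, String entryName, String jsonData) throws IOException {
        zip.putNextEntry(new ZipEntry(entryName));

        // OutputStreamWriter encodes incrementally, one chunk of characters at a time.
        Writer writer = new OutputStreamWriter(zip, StandardCharsets.UTF_8);
        for (int offset = 0; offset < jsonData.length(); offset += CHUNK_CHARS) {
            int end = Math.min(offset + CHUNK_CHARS, jsonData.length());
            writer.write(jsonData, offset, end - offset);
        }
        writer.flush();    // push remaining encoded bytes into the zip stream
        zip.closeEntry();  // do not close the writer: that would also close the zip stream
    }
}
{code}

Writing through an encoder this way keeps peak memory proportional to the chunk size rather than to the entity size, at the cost of not knowing the compressed entry size up front.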





[jira] [Updated] (ATLAS-2926) ZipSink: Very Large Entities Cause Out Of Memory Exception

2018-11-25 Thread Madhan Neethiraj (JIRA)


 [ https://issues.apache.org/jira/browse/ATLAS-2926?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Madhan Neethiraj updated ATLAS-2926:

Fix Version/s: (was: trunk)
   2.0.0
   1.2.0

> ZipSink: Very Large Entities Cause Out Of Memory Exception
> --
>
> Key: ATLAS-2926
> URL: https://issues.apache.org/jira/browse/ATLAS-2926
> Project: Atlas
>  Issue Type: Bug
>  Components:  atlas-core
>Affects Versions: 0.8.2, 0.8.3, trunk
>Reporter: Ashutosh Mestry
>Assignee: Ashutosh Mestry
>Priority: Major
> Fix For: 0.8.3, 1.2.0, 2.0.0
>
> Attachments: ATLAS-2926-ZipSink-OOM.patch
>
>
> *Steps to Duplicate*
>  # Set up Atlas with a very large dataset; at least one entity should be about 300 MB in size.
>  # Perform an export with parameters that include the large entity.
> The following error will be encountered:
> {code:java}
> java.lang.OutOfMemoryError: Requested array size exceeds VM limit
>     at java.lang.StringCoding$StringEncoder.encode(StringCoding.java:300)
>     at java.lang.StringCoding.encode(StringCoding.java:344)
>     at java.lang.StringCoding.encode(StringCoding.java:387)
>     at java.lang.String.getBytes(String.java:958)
>     at org.apache.atlas.repository.impexp.ZipSink.addToZipStream(ZipSink.java:106)
>     at org.apache.atlas.repository.impexp.ZipSink.saveToZip(ZipSink.java:95)
>     at org.apache.atlas.repository.impexp.ZipSink.add(ZipSink.java:55)
>     at org.apache.atlas.repository.impexp.ExportService.addEntity(ExportService.java:467)
> {code}
> *Additional Information*
> While converting the entity string to bytes, a known JDK limitation is hit: String.getBytes() must allocate a single byte[] for the entire encoded string, and for very large entities the requested array size exceeds the VM limit.
>  





[jira] [Updated] (ATLAS-2926) ZipSink: Very Large Entities Cause Out Of Memory Exception

2018-10-18 Thread Ashutosh Mestry (JIRA)


 [ https://issues.apache.org/jira/browse/ATLAS-2926?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ashutosh Mestry updated ATLAS-2926:
---
Attachment: ATLAS-2926-ZipSink-OOM.patch

> ZipSink: Very Large Entities Cause Out Of Memory Exception
> --
>
> Key: ATLAS-2926
> URL: https://issues.apache.org/jira/browse/ATLAS-2926
> Project: Atlas
>  Issue Type: Bug
>  Components:  atlas-core
>Affects Versions: 0.8.2, trunk, 0.8.3
>Reporter: Ashutosh Mestry
>Assignee: Ashutosh Mestry
>Priority: Major
> Fix For: trunk, 0.8.3
>
> Attachments: ATLAS-2926-ZipSink-OOM.patch
>
>
> *Steps to Duplicate*
>  # Set up Atlas with a very large dataset; at least one entity should be about 300 MB in size.
>  # Perform an export with parameters that include the large entity.
> The following error will be encountered:
> {code:java}
> java.lang.OutOfMemoryError: Requested array size exceeds VM limit
>     at java.lang.StringCoding$StringEncoder.encode(StringCoding.java:300)
>     at java.lang.StringCoding.encode(StringCoding.java:344)
>     at java.lang.StringCoding.encode(StringCoding.java:387)
>     at java.lang.String.getBytes(String.java:958)
>     at org.apache.atlas.repository.impexp.ZipSink.addToZipStream(ZipSink.java:106)
>     at org.apache.atlas.repository.impexp.ZipSink.saveToZip(ZipSink.java:95)
>     at org.apache.atlas.repository.impexp.ZipSink.add(ZipSink.java:55)
>     at org.apache.atlas.repository.impexp.ExportService.addEntity(ExportService.java:467)
> {code}
> *Additional Information*
> While converting the entity string to bytes, a known JDK limitation is hit: String.getBytes() must allocate a single byte[] for the entire encoded string, and for very large entities the requested array size exceeds the VM limit.
>  


