[ 
https://issues.apache.org/jira/browse/FLINK-29689?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vrinda Palod updated FLINK-29689:
---------------------------------
    Description: 
Currently, when we try to convert an Excel file with more than 100,000,000 
records, it throws an error and does not convert the file to CSV format.

 

This can be fixed by adding the line below before the Excel file is read in 
the code:

IOUtils.setByteArrayMaxOverride(2147483647);

 

The hardcoded value is Integer.MAX_VALUE, the maximum length of a Java array, 
above which we cannot handle the data for now.
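
For illustration, here is a minimal sketch of where that override would sit 
relative to the POI read. Only the IOUtils call comes from this report; the 
surrounding class, the file name, and the XSSFWorkbook usage are assumptions 
for the example:

{code:java}
import org.apache.poi.util.IOUtils;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;

import java.io.FileInputStream;
import java.io.InputStream;

public class LargeExcelRead {
    public static void main(String[] args) throws Exception {
        // Raise POI's byte array allocation cap to Integer.MAX_VALUE.
        // This must run before any POI code starts reading the workbook.
        IOUtils.setByteArrayMaxOverride(2147483647);

        // "large-input.xlsx" is a hypothetical file name for this sketch.
        try (InputStream in = new FileInputStream("large-input.xlsx");
             XSSFWorkbook workbook = new XSSFWorkbook(in)) {
            System.out.println("Sheets: " + workbook.getNumberOfSheets());
        }
    }
}
{code}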

 

!image-2022-10-19-18-04-33-092.png!

  was:Currently , 


> NIFI Performance issue - ConvertExcelToCSVProcessor to handle more data
> -----------------------------------------------------------------------
>
>                 Key: FLINK-29689
>                 URL: https://issues.apache.org/jira/browse/FLINK-29689
>             Project: Flink
>          Issue Type: Improvement
>          Components: API / DataSet
>            Reporter: Vrinda Palod
>            Priority: Major
>              Labels: CSV, excel, nifi, poi
>         Attachments: image-2022-10-19-18-03-13-983.png, 
> image-2022-10-19-18-04-33-092.png
>
>
> Currently, when we try to convert an Excel file with more than 
> 100,000,000 records, it throws an error and does not convert the file to 
> CSV format.
>  
> This can be fixed by adding the line below before the Excel file is read 
> in the code:
> IOUtils.setByteArrayMaxOverride(2147483647);
>  
> The hardcoded value is Integer.MAX_VALUE, the maximum length of a Java 
> array, above which we cannot handle the data for now.
>  
> !image-2022-10-19-18-04-33-092.png!



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
