vinothchandar edited a comment on issue #971: Setting 
"hoodie.parquet.max.file.size" to a value >= 2 GiB leads to no data being 
   Okay, able to repro even with 2.4. It's an integer overflow somewhere in the 
config-passing path. What happens is the workload profile computes a negative 
number of records assigned and thus skips assigning them. Still tracking how it's 
happening; a bit puzzling, since at the `HoodieWriteConfig` level it's all `long`:
   scala> String.valueOf(3 *1024 * 1024 * 1024)
   res1: String = -1073741824
   Can you try just doing `((Long) (3 * 1024 * 1024 * 1024L)).toString();` in 
Java, or `(3 * 1024 * 1024 * 1024L).toString` in Scala? 
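To make the overflow concrete, here is a minimal Java sketch (names are illustrative, not from the Hudi codebase): when every operand is an `int`, the multiplication wraps around before any widening to `long` can happen; suffixing one operand with `L` forces 64-bit arithmetic for the whole expression.

```java
public class OverflowDemo {
    public static void main(String[] args) {
        // All-int arithmetic: 3 * 1024 * 1024 * 1024 exceeds
        // Integer.MAX_VALUE and wraps to a negative value.
        int wrapped = 3 * 1024 * 1024 * 1024;
        System.out.println(wrapped); // prints -1073741824

        // With an L suffix, the whole product is computed as a long,
        // so the 3 GiB value survives intact.
        long correct = 3L * 1024 * 1024 * 1024;
        System.out.println(correct); // prints 3221225472
    }
}
```

This is the same wraparound the Scala REPL output above shows, and it explains why a configured file size of 2 GiB or more can turn into a negative record count downstream.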
