rjmblc commented on issue #6341:
URL: https://github.com/apache/hudi/issues/6341#issuecomment-1260903771

   @jdattani That option is set to false in my hoodie.properties file.
   @nsivabalan Adding a few more details to my issue. I updated `hoodie.datasource.write.operation` to `delete`, but I am still unable to delete the records.
   
   Could the problem be that the PySpark script is unable to retrieve the partition details?
   
   1. The input Hudi table is created by a Flink streaming job (I have no control over it); the DDL source code is attached below.
   
[1.Flink_Input_Source_DDL.zip](https://github.com/apache/hudi/files/9665299/1.Flink_Input_Source_DDL.zip)
   
   2. Pyspark script to delete the records
   
[2.hudi_delete_pyspark_script.zip](https://github.com/apache/hudi/files/9665306/2.hudi_delete_pyspark_script.zip)
   
   3. Hudi table properties file
   
[3.hoodie_properties.zip](https://github.com/apache/hudi/files/9665309/3.hoodie_properties.zip)
   
   4. Empty delta commit file
   
[4.delta_commit_file.zip](https://github.com/apache/hudi/files/9665313/4.delta_commit_file.zip)
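   
   For reference, the delete path I'm attempting looks roughly like the sketch below. The table name, base path, and field values here are placeholders, not taken from my actual job; the key point is that the record key and partition path fields must exactly match what the Flink writer used, otherwise Hudi cannot locate the records to delete (which may be what is happening in my case).
   
   ```python
   # Minimal sketch of a Hudi delete via the Spark datasource (placeholder names).
   # The field values below ("uuid", "partition_path", "ts") are assumptions and
   # must be replaced with the ones configured by the Flink writer.
   hudi_delete_options = {
       "hoodie.table.name": "my_hudi_table",                             # placeholder
       "hoodie.datasource.write.operation": "delete",
       "hoodie.datasource.write.recordkey.field": "uuid",                # must match the table's key field
       "hoodie.datasource.write.partitionpath.field": "partition_path",  # must match the Flink writer
       "hoodie.datasource.write.precombine.field": "ts",                 # must match the Flink writer
   }
   
   # With a SparkSession available, select the rows to delete and write them back:
   # rows_to_delete = spark.read.format("hudi").load(base_path).filter("uuid = 'some-id'")
   # (rows_to_delete.write.format("hudi")
   #     .options(**hudi_delete_options)
   #     .mode("append")
   #     .save(base_path))
   ```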
   
   
   Thanks


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
