kazdy commented on issue #8019:
URL: https://github.com/apache/hudi/issues/8019#issuecomment-1440075626

   Hi,

   Yes, you can call Spark procedures from PySpark — simply wrap your call
   command in
   `spark.sql("call <procedure>(<args>)")`.
   Just make sure the Spark session is properly configured for Hudi (catalog, SQL
   extensions, as described in the Spark quickstart).
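
   As a minimal sketch (the config values are the usual Hudi quickstart settings, but pin them to your own Spark/Hudi versions; `my_table` is a placeholder):

   ```python
   # Hudi quickstart settings for the SparkSession: Kryo serializer,
   # the Hudi SQL extension, and the Hudi catalog.
   hudi_conf = {
       "spark.serializer": "org.apache.spark.serializer.KryoSerializer",
       "spark.sql.extensions": "org.apache.spark.sql.hudi.HoodieSparkSessionExtension",
       "spark.sql.catalog.spark_catalog": "org.apache.spark.sql.hudi.catalog.HoodieCatalog",
   }

   # With pyspark available, apply the conf and call any procedure as SQL:
   #
   # from pyspark.sql import SparkSession
   # builder = SparkSession.builder.appName("hudi-procedures")
   # for key, value in hudi_conf.items():
   #     builder = builder.config(key, value)
   # spark = builder.getOrCreate()
   # spark.sql("call show_commits(table => 'my_table', limit => 10)").show()
   ```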
   
   Here are the Spark procedures available to users; most are not documented yet:
   
https://github.com/apache/hudi/tree/master/hudi-spark-datasource/hudi-spark/src/main/scala/org/apache/spark/sql/hudi/command/procedures
   
   For disaster recovery you are probably looking for:
   CreateSavepointProcedure.scala
   DeleteSavepointProcedure.scala
   RollbackToSavepointProcedure.scala
   ShowSavepointsProcedure.scala
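
   A hedged sketch of how those savepoint procedures might be driven from PySpark; the helper names here are illustrative (not part of Hudi), `my_table` and the instant times are placeholders, and the `table`/`commit_time`/`instant_time` argument names follow the procedure sources linked above:

   ```python
   # Hypothetical helpers that build the SQL text for Hudi's savepoint
   # procedures; pass the result to spark.sql() on a Hudi-enabled session.

   def create_savepoint_sql(table: str, commit_time: str) -> str:
       # Pin a commit so cleaning/archiving will not remove it.
       return f"call create_savepoint(table => '{table}', commit_time => '{commit_time}')"

   def show_savepoints_sql(table: str) -> str:
       # List the savepoints currently held on the table.
       return f"call show_savepoints(table => '{table}')"

   def rollback_to_savepoint_sql(table: str, instant_time: str) -> str:
       # Restore the table to the saved instant after a bad write.
       return f"call rollback_to_savepoint(table => '{table}', instant_time => '{instant_time}')"

   # Usage with a Hudi-configured SparkSession:
   # spark.sql(create_savepoint_sql("my_table", "20230222103000000")).show()
   # spark.sql(show_savepoints_sql("my_table")).show()
   ```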
