arun990 commented on issue #3127:
URL: https://github.com/apache/hudi/issues/3127#issuecomment-865488983


   Hi, I also checked with Hive sync set to true after setting up Spark 2.4.0 with Hive.
   I am still getting the same error, shown below.
   Please advise.
   
   df.write.format("org.apache.hudi").options(**hudiOptions).mode("append").save(sys.argv[5])
     File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/readwriter.py", line 736, in save
     File "/usr/lib/spark/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
     File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
     File "/usr/lib/spark/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
   py4j.protocol.Py4JJavaError: An error occurred while calling o156.save.
   : java.lang.NoSuchMethodError: org.apache.spark.sql.execution.datasources.DataSourceUtils$.PARTITIONING_COLUMNS_KEY()Ljava/lang/String;
           at org.apache.hudi.DataSourceWriteOptions$.translateSqlOptions(DataSourceOptions.scala:206)
           at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:139)
           at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
           at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
           at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
           at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
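   For reference, a minimal sketch of the options dict passed to the write above. The config keys are standard Hudi write and Hive-sync options, but the table name, record key, partition field, and precombine field here are illustrative placeholders, not the actual job's values:

```python
# Illustrative options for df.write.format("org.apache.hudi").options(**hudiOptions);
# the keys are standard Hudi configs, the values are placeholders for this report.
hudiOptions = {
    "hoodie.table.name": "my_table",                      # placeholder table name
    "hoodie.datasource.write.recordkey.field": "id",      # placeholder record key column
    "hoodie.datasource.write.partitionpath.field": "dt",  # placeholder partition column
    "hoodie.datasource.write.precombine.field": "ts",     # placeholder precombine column
    "hoodie.datasource.hive_sync.enable": "true",         # Hive sync enabled, as tried above
    "hoodie.datasource.hive_sync.table": "my_table",      # placeholder Hive table name
}

# Sanity checks: Hive sync is on, and every value is a string as
# DataFrameWriter.options expects when unpacked with **hudiOptions.
assert hudiOptions["hoodie.datasource.hive_sync.enable"] == "true"
assert all(isinstance(v, str) for v in hudiOptions.values())
```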

