Hi,

I have a production job that registers four different DataFrames as
tables in PySpark 1.6.2. After we upgraded to Spark 2.0, only three of the
four DataFrames get registered; the fourth does not. There are no code
changes whatsoever; the only change is the Spark version. When I revert to
Spark 1.6.2, the fourth DataFrame is registered correctly. Has anyone faced
a similar issue? Is this a bug in Spark 2.0, or is it just a compatibility
issue?
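For reference, here is a minimal sketch of the pattern I mean (the table name and data are placeholders, not my actual job; this needs a local Spark/Java installation to run). In 1.6.2 I register via sqlContext.registerDataFrameAsTable; my understanding is that in 2.0 the equivalent call is createOrReplaceTempView on the DataFrame, so I have also tried checking the catalog afterwards:

```python
from pyspark.sql import SparkSession

# Hypothetical repro setup, not the production job itself.
spark = SparkSession.builder.master("local[1]").appName("repro").getOrCreate()
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "val"])

# 1.6-style call (still present on SQLContext in 2.0, delegating to the new API):
#   sqlContext.registerDataFrameAsTable(df, "my_table")
# 2.0-style equivalent:
df.createOrReplaceTempView("my_table")

# Verify the registration actually took effect in the session catalog.
registered = [t.name for t in spark.catalog.listTables()]
print("my_table" in registered)
```

In my job, the equivalent check succeeds for three of the four DataFrames and fails for the fourth only under 2.0.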



--
View this message in context: 
http://apache-spark-developers-list.1001551.n3.nabble.com/sqlContext-registerDataFrameAsTable-is-not-working-properly-in-pyspark-2-0-tp18938.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
