[ 
https://issues.apache.org/jira/browse/SPARK-17538?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Srinivas Rishindra Pothireddi updated SPARK-17538:
--------------------------------------------------
    Labels: pyspark  (was: )

> sqlContext.registerDataFrameAsTable is not working sometimes in pyspark 2.0.0
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-17538
>                 URL: https://issues.apache.org/jira/browse/SPARK-17538
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 2.0.0, 2.0.1, 2.1.0
>         Environment: os - linux
> cluster -> yarn and local
>            Reporter: Srinivas Rishindra Pothireddi
>            Priority: Critical
>              Labels: pyspark
>             Fix For: 2.0.1, 2.1.0
>
>
> I have a production job in Spark 1.6.2 that registers four dataframes as 
> tables. After testing the same job on Spark 2.0.0, one of the dataframes is 
> no longer getting registered as a table.
> The output of sqlContext.tableNames() just after registering the fourth 
> dataframe in Spark 1.6.2 is:
> temp1, temp2, temp3, temp4
> The output of sqlContext.tableNames() just after registering the fourth 
> dataframe in Spark 2.0.0 is:
> temp1, temp2, temp3
> So when the table 'temp4' is used by the job at a later stage, an 
> AnalysisException is raised in Spark 2.0.0.
> There are no changes to the code whatsoever.
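> A minimal sketch of the reported scenario (the dataframe contents and the 
> temp1..temp4 names below are placeholders; the actual production job's data 
> and logic are not part of this report):
>
>     from pyspark import SparkContext
>     from pyspark.sql import SQLContext
>
>     sc = SparkContext()
>     sqlContext = SQLContext(sc)
>
>     # four small dataframes standing in for the production data
>     dfs = [sqlContext.createDataFrame([(i,)], ["id"]) for i in range(4)]
>
>     # register each one as temp1 .. temp4
>     for i, df in enumerate(dfs, start=1):
>         sqlContext.registerDataFrameAsTable(df, "temp%d" % i)
>
>     # expected in 1.6.2: temp1, temp2, temp3, temp4
>     # reported in 2.0.0: temp1, temp2, temp3
>     print(sqlContext.tableNames())
>
>     # querying the missing table then raises AnalysisException in 2.0.0
>     sqlContext.sql("SELECT * FROM temp4").show()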
>  
>  



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
