[ https://issues.apache.org/jira/browse/SPARK-17538?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-17538:
------------------------------
             Shepherd:   (was: Matei Zaharia)
                Flags:   (was: Important)
    Affects Version/s:     (was: 2.0.1)
                           (was: 2.1.0)
     Target Version/s:   (was: 2.0.1, 2.1.0)
               Labels:   (was: pyspark)
             Priority: Major  (was: Critical)
        Fix Version/s:     (was: 2.0.1)
                           (was: 2.1.0)

[~sririshindra] please read 
https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark  
There's a lot wrong with how you filled this out.
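For reference, a minimal PySpark sketch of the scenario described in the report below (illustrative only; table names are taken from the report, the sample data is invented, and a local SparkContext is assumed):

```python
# Minimal sketch of the reported scenario (illustrative; assumes PySpark 2.0.0
# with a local SparkContext; not taken from the reporter's actual job).
from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext("local", "SPARK-17538-repro")
sqlContext = SQLContext(sc)

# Register four dataframes as temp tables, as the report describes.
for name in ["temp1", "temp2", "temp3", "temp4"]:
    df = sqlContext.createDataFrame([(1, name)], ["id", "label"])
    sqlContext.registerDataFrameAsTable(df, name)

# Per the report, 'temp4' is missing from this list on 2.0.0,
# so a later sqlContext.table("temp4") raises AnalysisException.
print(sqlContext.tableNames())
```

Note that registerDataFrameAsTable is deprecated as of Spark 2.0 in favor of DataFrame.createOrReplaceTempView, which may be relevant to the behavior change being reported.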

> sqlContext.registerDataFrameAsTable is not working sometimes in pyspark 2.0.0
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-17538
>                 URL: https://issues.apache.org/jira/browse/SPARK-17538
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 2.0.0
>         Environment: os - linux
> cluster -> yarn and local
>            Reporter: Srinivas Rishindra Pothireddi
>
> I have a production job on Spark 1.6.2 that registers four dataframes as 
> tables. After testing the job on Spark 2.0.0, one of the dataframes is no 
> longer getting registered as a table.
> The output of sqlContext.tableNames() immediately after registering the 
> fourth dataframe in Spark 1.6.2 is:
> temp1,temp2,temp3,temp4
> The output of sqlContext.tableNames() immediately after registering the 
> fourth dataframe in Spark 2.0.0 is:
> temp1,temp2,temp3
> So when the table 'temp4' is used by the job at a later stage, an 
> AnalysisException is raised in Spark 2.0.0.
> There are no changes in the code whatsoever.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
