[ https://issues.apache.org/jira/browse/SYSTEMML-668?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Mike Dusenberry updated SYSTEMML-668:
-------------------------------------

    Description:
In PySpark, access to the JVM SQLContext from a PySpark SQLContext instance -has changed from {{sqlContext._scala_SQLContext}} to {{sqlContext._ssql_ctx}}- has always been officially exposed via {{sqlContext._ssql_ctx}}. However, we have been using an unofficial variable, {{sqlContext._scala_SQLContext}}, which has been renamed in 2.0, breaking any previous code using the former construct, such as our Python {{MLOutput.getDF(...)}} method. Therefore, we just need to update our PySpark API to use the official access point.

  (was: In PySpark, access to the JVM SQLContext from a PySpark SQLContext instance -has changed from {{sqlContext._scala_SQLContext}} to {{sqlContext._ssql_ctx}}- has always been official exposed via {{sqlContext._ssql_ctx}}. However, we have been using an unofficial variable, {{sqlContext._scala_SQLContext}}, which has been renamed in 2.0, breaking any previous code using the former construct, such as our Python {{MLOutput.getDF(...)}} method. Therefore, we just need to update our PySpark API to use the official access point.)

> Python MLOutput.getDF() Can't Access JVM SQLContext
> ---------------------------------------------------
>
>                 Key: SYSTEMML-668
>                 URL: https://issues.apache.org/jira/browse/SYSTEMML-668
>             Project: SystemML
>          Issue Type: Bug
>            Reporter: Mike Dusenberry
>            Assignee: Mike Dusenberry
>            Priority: Minor
>
> In PySpark, access to the JVM SQLContext from a PySpark SQLContext instance
> -has changed from {{sqlContext._scala_SQLContext}} to
> {{sqlContext._ssql_ctx}}- has always been officially exposed via
> {{sqlContext._ssql_ctx}}. However, we have been using an unofficial
> variable, {{sqlContext._scala_SQLContext}}, which has been renamed in 2.0,
> breaking any previous code using the former construct, such as our Python
> {{MLOutput.getDF(...)}} method. Therefore, we just need to update our
> PySpark API to use the official access point.
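A minimal sketch of the fix described above. The helper name `get_jvm_sql_context` and the stub class are illustrative only (not SystemML's actual API); the point is that {{MLOutput.getDF(...)}} should reach the JVM SQLContext through the officially exposed {{_ssql_ctx}} attribute rather than the unofficial {{_scala_SQLContext}}, which was renamed in Spark 2.0:

```python
# Illustrative sketch only; names below are assumptions, not SystemML code.

def get_jvm_sql_context(sql_context):
    # Before (broken on Spark 2.0): sql_context._scala_SQLContext
    # After (official access point): sql_context._ssql_ctx
    return sql_context._ssql_ctx

# Minimal stand-in for pyspark.sql.SQLContext so the sketch runs without
# a Spark installation:
class FakeSQLContext:
    def __init__(self):
        self._ssql_ctx = object()  # plays the role of the JVM SQLContext

ctx = FakeSQLContext()
jvm_ctx = get_jvm_sql_context(ctx)
```

With the real PySpark SQLContext, the returned handle is the py4j proxy to the JVM-side SQLContext, which is what {{getDF(...)}} passes down to the Java/Scala layer.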
-- This message was sent by Atlassian JIRA (v6.3.4#6332)