[ https://issues.apache.org/jira/browse/SPARK-10577?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14742042#comment-14742042 ]
Maciej Bryński edited comment on SPARK-10577 at 9/12/15 12:46 PM:
------------------------------------------------------------------

Same without Hive support.

Py4JJavaError: An error occurred while calling o30.sql.
{code}
: java.lang.RuntimeException: [1.42] failure: ``union'' expected but `(' found

select * from t1 join broadcast(t2) on t1.k1 = t2.k2
                                         ^
	at scala.sys.package$.error(package.scala:27)
	at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.parse(AbstractSparkSQLParser.scala:36)
	at org.apache.spark.sql.catalyst.DefaultParserDialect.parse(ParserDialect.scala:67)
	at org.apache.spark.sql.SQLContext$$anonfun$3.apply(SQLContext.scala:169)
	at org.apache.spark.sql.SQLContext$$anonfun$3.apply(SQLContext.scala:169)
{code}


> [PySpark, SQL] DataFrame hint for broadcast join
> ------------------------------------------------
>
>                 Key: SPARK-10577
>                 URL: https://issues.apache.org/jira/browse/SPARK-10577
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark, SQL
>    Affects Versions: 1.5.0
>            Reporter: Maciej Bryński
>
> As in https://issues.apache.org/jira/browse/SPARK-8300,
> there should be a way to add a hint for a broadcast join in:
> - Spark SQL
> - PySpark



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org