[ https://issues.apache.org/jira/browse/SPARK-41144?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-41144:
------------------------------------

    Assignee: Apache Spark

> UnresolvedHint should not cause query failure
> ---------------------------------------------
>
>                 Key: SPARK-41144
>                 URL: https://issues.apache.org/jira/browse/SPARK-41144
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.4.0
>            Reporter: XiDuo You
>            Assignee: Apache Spark
>            Priority: Major
>
> {code:java}
> CREATE TABLE t1(c1 bigint) USING PARQUET;
> CREATE TABLE t2(c2 bigint) USING PARQUET;
> SELECT /*+ hash(t2) */ * FROM t1 JOIN t2 ON c1 = c2;
> {code}
>
> This query fails with:
> {code:java}
> org.apache.spark.sql.catalyst.analysis.UnresolvedException: Invalid call to exprId on unresolved object
>   at org.apache.spark.sql.catalyst.analysis.UnresolvedAttribute.exprId(unresolved.scala:147)
>   at org.apache.spark.sql.catalyst.analysis.Analyzer$AddMetadataColumns$.$anonfun$hasMetadataCol$4(Analyzer.scala:1005)
>   at org.apache.spark.sql.catalyst.analysis.Analyzer$AddMetadataColumns$.$anonfun$hasMetadataCol$4$adapted(Analyzer.scala:1005)
>   at scala.collection.Iterator.exists(Iterator.scala:969)
>   at scala.collection.Iterator.exists$(Iterator.scala:967)
>   at scala.collection.AbstractIterator.exists(Iterator.scala:1431)
>   at scala.collection.IterableLike.exists(IterableLike.scala:79)
>   at scala.collection.IterableLike.exists$(IterableLike.scala:78)
>   at scala.collection.AbstractIterable.exists(Iterable.scala:56)
>   at org.apache.spark.sql.catalyst.analysis.Analyzer$AddMetadataColumns$.$anonfun$hasMetadataCol$3(Analyzer.scala:1005)
>   at org.apache.spark.sql.catalyst.analysis.Analyzer$AddMetadataColumns$.$anonfun$hasMetadataCol$3$adapted(Analyzer.scala:1005)
> {code}
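For context, the stack trace shows Analyzer$AddMetadataColumns calling exprId on an UnresolvedAttribute, which always throws UnresolvedException. Below is a minimal, self-contained Scala sketch of that failure mode and of a `resolved` guard; it is illustrative only and is not the actual fix. It assumes only spark-catalyst on the classpath, and the `c2` attribute is just a stand-in taken from the repro above.

{code:scala}
import org.apache.spark.sql.catalyst.analysis.UnresolvedAttribute

object UnresolvedExprIdSketch extends App {
  // An attribute the analyzer has not resolved yet, like `c2` in the repro.
  val attr = UnresolvedAttribute("c2")

  // Calling attr.exprId directly would throw
  //   UnresolvedException: Invalid call to exprId on unresolved object
  // which is the failure shown in the stack trace above.

  // Checking `resolved` first avoids touching exprId on an unresolved node.
  val exprIdIfResolved = if (attr.resolved) Some(attr.exprId) else None
  println(s"resolved=${attr.resolved}, exprId=$exprIdIfResolved") // resolved=false, exprId=None
}
{code}

The sketch only explains why the call site fails; the real change belongs in the analyzer rule tracked by this ticket.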