[ https://issues.apache.org/jira/browse/SPARK-34476?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17288482#comment-17288482 ]
Apache Spark commented on SPARK-34476:
--------------------------------------

User 'tedyu' has created a pull request for this issue:
https://github.com/apache/spark/pull/31613

> Duplicate referenceNames are given for ambiguousReferences
> ----------------------------------------------------------
>
>                 Key: SPARK-34476
>                 URL: https://issues.apache.org/jira/browse/SPARK-34476
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Ted Yu
>            Priority: Major
>
> When running a test with a Spark extension that converts a custom function to a
> JSON path expression, I saw the following in the test output:
> {code}
> 2021-02-19 21:57:24,550 (Time-limited test) [INFO -
> org.yb.loadtest.TestSpark3Jsonb.testJsonb(TestSpark3Jsonb.java:102)] plan is
> == Physical Plan ==
> org.apache.spark.sql.AnalysisException: Reference
> 'phone->'key'->1->'m'->2->>'b'' is ambiguous, could be:
> mycatalog.test.person.phone->'key'->1->'m'->2->>'b',
> mycatalog.test.person.phone->'key'->1->'m'->2->>'b'.; line 1 pos 8
> {code}
> Note that the two candidates following 'could be' are identical, which makes
> the "ambiguous" message misleading.
> Here is the physical plan for a working query where phone is a jsonb column:
> {code}
> TakeOrderedAndProject(limit=2, orderBy=[id#6 ASC NULLS FIRST],
> output=[id#6,address#7,key#0])
> +- *(1) Project [id#6, address#7, phone->'key'->1->'m'->2->'b'#12 AS key#0]
>    +- BatchScan[id#6, address#7, phone->'key'->1->'m'->2->'b'#12] Cassandra
>       Scan: test.person
>       - Cassandra Filters: [[phone->'key'->1->'m'->2->>'b' >= ?, 100]]
>       - Requested Columns: [id,address,phone->'key'->1->'m'->2->'b']
> {code}
> The difference in the failing query is that it uses
> {code}phone->'key'->1->'m'->2->>'b'{code} in the projection, whereas the same
> expression works as part of the filter.
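The core observation above is that the error message lists the same candidate twice before declaring the reference ambiguous. A minimal sketch of the idea behind a fix (hypothetical code, not Spark's actual resolver; the class and method names are invented for illustration): deduplicate the candidates' qualified names before formatting the message, so a single distinct candidate is never reported as ambiguous with itself.

```java
import java.util.LinkedHashSet;
import java.util.List;

// Hypothetical illustration of the reported behavior: if resolution produces
// several candidates that render to the same qualified name, deduplicating
// them before building the "is ambiguous" message avoids showing the user
// two identical choices.
public class AmbiguousMessage {
    static String format(String ref, List<String> candidateNames) {
        // LinkedHashSet drops duplicates while keeping first-seen order.
        LinkedHashSet<String> distinct = new LinkedHashSet<>(candidateNames);
        if (distinct.size() == 1) {
            // Only one real candidate: not truly ambiguous.
            return "Reference '" + ref + "' resolves to "
                    + distinct.iterator().next();
        }
        return "Reference '" + ref + "' is ambiguous, could be: "
                + String.join(", ", distinct) + ".";
    }

    public static void main(String[] args) {
        // The bug report's case: both candidates are the same string.
        System.out.println(format("phone->'key'->1->'m'->2->>'b'",
                List.of("mycatalog.test.person.phone->'key'->1->'m'->2->>'b'",
                        "mycatalog.test.person.phone->'key'->1->'m'->2->>'b'")));
    }
}
```

With deduplication, the reported case would print a single resolution instead of the confusing "could be: X, X" message; genuinely distinct candidates would still be reported as ambiguous.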
--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org