Github user dilipbiswal commented on a diff in the pull request:
https://github.com/apache/spark/pull/12460#discussion_r60335414
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala ---
@@ -254,6 +251,21 @@ class SparkSqlAstBuilder extends AstBuilder {
}
}
+ /**
+ * A column path can be specified as an parameter to describe command. It is a dot separated
+ * elements where the last element can be a String.
+ * TODO - check with Herman
--- End diff ---
@hvanhovell Hi Herman, I tried some very simple scenarios with nested
columns, and they seem to work OK. Let me paste the output here.
Map
====
``` SQL
create table mp_t1 (a map<int, string>, b string)
  row format delimited
  collection items terminated by '$'
  map keys terminated by '#';

load data local inpath '/data/mapfile' overwrite into table mp_t1;

select * from mp_t1;
-- a              b
-- {100:"spark"}  ABC

describe extended mp_t1.a.$key$;
-- Result:
-- $key$  int  from deserializer
```
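
For reference, a line of `/data/mapfile` that would produce the row above might look like this (an assumption based on the table's delimiters: '#' separates the map key from its value, '$' would separate multiple map entries, and the Hive default field delimiter \u0001, shown here as ^A, separates columns `a` and `b`):
```
100#spark^AABC
```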
Struct
====
``` SQL
create table ct_t (a struct<n1: string, n2: string>, b string) stored as textfile;

insert into ct_t values (('abc', 'efg'), 'ABC');

select * from ct_t;
-- {"n1":"abb","n2":"efg"}  ABC

describe extended ct_t.a.n1;
-- OK
-- n1  string  from deserializer
```
Herman, based on the Hive syntax diagram, I was expecting the following
command to work: `describe extended mp_t1.a.'$key$';`. However, I get a
parse exception, and when I remove the quotes it works:
`describe extended mp_t1.a.$key$;`. Given this, we can simply change the
grammar to use a dot-separated list of identifiers, right? I've put a rough
sketch of what I mean below. Please let me know what you think.
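
Here is a minimal sketch of the grammar change I have in mind (the rule
name `describeColName` and its exact shape are illustrative assumptions,
not the rule from this PR's SqlBase.g4):
``` ANTLR
// Illustrative sketch only: a column path becomes a plain
// dot-separated identifier list, with no string-literal
// alternative for the last element.
describeColName
    : identifier ('.' identifier)*
    ;
```
Whether something like `$key$` still lexes as a single identifier under
the existing lexer rules is the part I would want to double-check.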