Hi list,

I have some data with a field named f:price (it's actually part of a JSON
structure loaded from Elasticsearch via the elasticsearch-hadoop connector,
but I don't think that's significant here). I'm struggling to figure out how
to reference that field in a Spark SQL SELECT statement without triggering a
parse error, and I haven't been able to find any similar examples in the
documentation.

val productsRdd = sqlContext.sql(
  "SELECT Locales.Invariant.Metadata.item.f:price FROM products LIMIT 10")

gives me the following error...

java.lang.RuntimeException: [1.41] failure: ``UNION'' expected but `:' found
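In case it clarifies what I'm after, here's the kind of quoting I was hoping
would work. The backticks below are a guess on my part (they're HiveQL-style
identifier quoting) — I don't know whether the plain SQLContext parser in 1.2
accepts them, or whether I'd need a HiveContext:

```scala
// Assumption: backtick-quoting the colon-containing identifier, HiveQL-style.
// HiveContext's parser understands backticks as far as I know; I haven't
// confirmed that the basic SQLContext parser in Spark 1.2 does.
val productsRdd = sqlContext.sql(
  "SELECT Locales.Invariant.Metadata.item.`f:price` FROM products LIMIT 10")
```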

Changing the column name is one option, but I have other systems depending
on this field right now, so it's not a trivial exercise. :(

I'm using Spark 1.2.

Thanks in advance for any advice / help.



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-SQL-Column-name-including-a-colon-in-a-SELECT-clause-tp21576.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
