Apparently nabble ate my code samples.

In Spark (1.5.2) the column is represented correctly:
sqlContext.sql("SELECT * FROM tempdata").collect()
[{"PageHtml":"{\\"time\\":0}"}]

However, when queried from Hive I get the same column but without any of the 
escape characters:
Beeline (or PyHive) > SELECT * FROM tempdata LIMIT 1
[{"PageHtml":"{"time":0}"}]
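To make the failure concrete, here is a minimal sketch in plain Python (no Spark involved) using the two sample values above — the escaped form round-trips, the unescaped form from Beeline is no longer valid JSON:

```python
import json

# The row as Spark's collect() shows it: the inner JSON lives in a
# StringType column, so its quotes are escaped in the outer document.
row_from_spark = '{"PageHtml":"{\\"time\\":0}"}'
outer = json.loads(row_from_spark)
inner = json.loads(outer["PageHtml"])  # parses cleanly
assert inner == {"time": 0}

# The row as Beeline/PyHive returns it: escapes stripped, so the
# outer document can no longer be parsed at all.
row_from_hive = '{"PageHtml":"{"time":0}"}'
try:
    json.loads(row_from_hive)
except json.JSONDecodeError as e:
    print("broken:", e)
```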

Thanks for the heads up

From: Ted Yu <yuzhih...@gmail.com>
Date: Monday, January 4, 2016 at 11:54 AM
To: Scott Lyons <scl...@microsoft.com>
Cc: user <user@spark.apache.org>
Subject: Re: HiveThriftServer fails to quote strings

bq. without any of the escape characters:

Did you intend to show some sample ?

As far as I can tell, there was no sample or image in previous email.

FYI

On Mon, Jan 4, 2016 at 11:36 AM, sclyon <scl...@microsoft.com> wrote:
Hello all,

I've got a nested JSON structure in parquet format that I'm having some
issues with when trying to query it through Hive.

In Spark (1.5.2) the column is represented correctly:


However, when queried from Hive I get the same column but without any of the
escape characters:


Naturally this breaks my JSON parsers and I'm unable to use the data. Has
anyone encountered this error before? I tried looking through the source,
but the only thing I can find that seems related is the
HiveContext.toHiveString method.
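In case it helps anyone reproduce the symptom without a cluster, here is a rough sketch of the difference I'm seeing, assuming the Thrift server stringifies the column with a plain toString rather than JSON-escaping it (PageHtml is just my column name):

```python
import json

# A StringType column whose content happens to be JSON.
value = '{"time":0}'

# JSON-aware serialization escapes the embedded quotes
# (what Spark's collect() shows):
print(json.dumps({"PageHtml": value}))

# Naive toString-style interpolation does not
# (what Beeline shows):
print('{"PageHtml":"%s"}' % value)
```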

Any advice would be appreciated!

Thanks,
Scott



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/HiveThriftServer-fails-to-quote-strings-tp25877.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org

