Thanks a lot. Finally, I can create a Parquet table using your suggested
--driver-class-path option.
I am using Hadoop 2.3. Now I will try to load data into the tables.
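For the loading step, a minimal HiveQL sketch (table and path names here are placeholders of my own, assuming the Parquet table already exists):

```sql
-- Load the staged text data into a plain (non-Parquet) table first;
-- LOAD DATA only moves files, it does not convert them to Parquet.
LOAD DATA INPATH '/user/lyc/input/data.txt' INTO TABLE staging_table;

-- Copy into the Parquet table so the Parquet SerDe writes the files.
INSERT OVERWRITE TABLE parquet_table SELECT * FROM staging_table;
```

The indirection through a staging table is needed because the conversion to Parquet only happens when Hive itself writes the rows.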
Thanks,
lyc
--
Hi Silvio,
I re-downloaded hive-0.12-bin and reset the related paths in spark-env.sh.
However, I still get an error. Do you happen to know which step I did wrong?
Thank you!
My detailed steps are as follows:
#enter spark-shell (successful)
/bin/spark-shell --master spark://S4:7077 --jars
I followed your instructions and tried to load data in Parquet format through
HiveContext, but it failed. Do you see anything incorrect in the following
steps?
The steps I am following are:
1. download parquet-hive-bundle-1.5.0.jar
2. revise hive-site.xml including this:
property
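For step 2, I believe the property meant here is the auxiliary-jars path, which makes the bundle's SerDe classes visible to Hive — a sketch, assuming that property and with the jar location adjusted to wherever the file actually lives:

```xml
<!-- hive-site.xml: expose the Parquet SerDe classes to Hive -->
<property>
  <name>hive.aux.jars.path</name>
  <value>file:///path/to/parquet-hive-bundle-1.5.0.jar</value>
</property>
```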
Thanks for your help.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Does-HiveContext-support-Parquet-tp12209p12231.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Since SQLContext supports less SQL than Hive (if I understand correctly), I
plan to run more queries through HQL. However, is it possible to create
tables stored as Parquet in HQL? What kind of commands should I use? Thanks
in advance for any information.
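In case it helps: Hive 0.12 has no native STORED AS PARQUET clause (that arrived in 0.13), so with the parquet-hive bundle on the classpath the table is declared through the bundle's SerDe and input/output format classes — a sketch with made-up column names:

```sql
-- Hive 0.12: declare a Parquet-backed table via the bundled SerDe classes
CREATE TABLE parquet_table (id INT, name STRING)
  ROW FORMAT SERDE 'parquet.hive.serde.ParquetHiveSerDe'
  STORED AS
    INPUTFORMAT 'parquet.hive.DeprecatedParquetInputFormat'
    OUTPUTFORMAT 'parquet.hive.DeprecatedParquetOutputFormat';
```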
--
Thank you for your reply.
Do you know where I can find some detailed information about how to use
Parquet in HiveContext?
Any information is appreciated.
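In case it is useful, the basic pattern inside spark-shell looks roughly like this — a sketch for the Spark 1.x HiveContext API, where hql() was the entry point for Hive queries (the table and columns are made up):

```scala
// Inside spark-shell: sc is the pre-created SparkContext
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)

// DDL and queries both go through hql()
hiveContext.hql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
hiveContext.hql("SELECT key, value FROM src").collect().foreach(println)
```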
--