Re: Sqoop Postgres table to Hive in parquet format problem.

2016-12-18 Thread Sharath Punreddy
Looks like you are using Avro to read a Parquet file. Sincerely, Sharath Punreddy Email: srpunre...@gmail.com Phone: 918-973-3399 On Tue, Dec 13, 2016 at 9:04 AM, ws wrote: > Hive: 2.1.0 > Sqoop: 1.4.6 > > ### > hive> select * from dimemployee; > OK > Failed with
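A common cause of this symptom is a Hive table whose SerDe/InputFormat (Avro) does not match the files Sqoop actually wrote (Parquet). One way to avoid the mismatch, sketched here with placeholder connection details, database, table, and user names (none of these come from the thread), is to let Sqoop create the matching Hive table itself during a Parquet import:

```shell
# Hypothetical sketch -- host, database, user, and table names are placeholders.
# Import the Postgres table as Parquet and let Sqoop create the Hive table
# with a Parquet-compatible SerDe, instead of loading into a pre-existing table.
sqoop import \
  --connect jdbc:postgresql://dbhost:5432/mydb \
  --username myuser -P \
  --table employee \
  --hive-import \
  --hive-table dimemployee \
  --as-parquetfile
```

Afterwards, `SHOW CREATE TABLE dimemployee;` in Hive should report a Parquet SerDe/InputFormat rather than an Avro one; if it still shows Avro, the table definition, not the data, is the problem.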

Re: Requesting write access to Hive wiki

2016-12-18 Thread Lefty Leverenz
You've got it. Welcome to the Hive wiki team, Michael! -- Lefty On Sun, Dec 18, 2016 at 1:25 PM, mikey d wrote: > Requesting write access to Hive wiki > > Request sent already to user-subscr...@hive.apache.org > > If there is anything further needed, please let me know. >

Re: Requesting write access to Hive wiki

2016-12-18 Thread mikey d
Requesting write access to Hive wiki Request sent already to user-subscr...@hive.apache.org If there is anything further needed, please let me know. User: mdeguzis On Sun, Dec 18, 2016 at 4:24 PM, mikey d wrote: > Requesting write access to Hive wiki > > Request sent

Re: Specifying orc.stripe.size in Spark

2016-12-18 Thread Mich Talebzadeh
You can use a straight SQL command to create an ORC table in Hive. Assuming you have registered a temp table:
val HiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
s.registerTempTable("tmp")
sqltext = """ CREATE TABLE test.dummy2 ( ID INT , CLUSTERED INT , SCATTERED INT ,
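The quoted CREATE TABLE statement is cut off above. A minimal sketch of the approach being suggested, with illustrative column names and a stripe size chosen purely as an example (the actual values are not in the thread), would attach orc.stripe.size to the table via TBLPROPERTIES and then insert from the registered temp table:

```scala
// Hypothetical sketch of the suggested pattern -- table name, columns, and
// the 256 MB stripe size are illustrative assumptions, not from the thread.
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)

hiveContext.sql("""
  CREATE TABLE test.dummy2 (
    ID INT,
    CLUSTERED INT,
    SCATTERED INT
  )
  STORED AS ORC
  TBLPROPERTIES ("orc.stripe.size" = "268435456")
""")

// Populate the ORC table from the previously registered temp table
hiveContext.sql("INSERT INTO TABLE test.dummy2 SELECT * FROM tmp")
```

Setting the property on the Hive table means every writer that inserts into it picks up the stripe size, regardless of how the DataFrame was produced.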

Specifying orc.stripe.size in Spark

2016-12-18 Thread Daniel Haviv
Hi, When writing a dataframe using: df.write.orc("/path/to/orc") How can I specify ORC parameters like orc.stripe.size? Thank you, Daniel
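For the direct df.write.orc path, a hedged sketch of two things worth trying (whether either is honored depends on the Spark version in use, which is exactly what this thread is probing; the 256 MB value is an illustrative assumption):

```scala
// Sketch 1: pass the setting as a writer option. Spark forwards writer
// options to the ORC data source, but older releases may ignore keys
// other than compression -- this is an assumption to verify, not a guarantee.
df.write
  .option("orc.stripe.size", "268435456")
  .orc("/path/to/orc")

// Sketch 2: set it on the Hadoop configuration before writing, so the
// underlying ORC writer sees it regardless of the writer-option plumbing.
sc.hadoopConfiguration.set("orc.stripe.size", "268435456")
df.write.orc("/path/to/orc")
```

If neither takes effect, the CREATE TABLE ... TBLPROPERTIES route suggested earlier in the thread sidesteps the DataFrame writer entirely.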