Hello,
I need your help with Kerberos authentication for Hive.
I am following this guide:
https://www.cloudera.com/documentation/enterprise/5-4-x/topics/cdh_sg_hiveserver2_security.html#topic_9_1_1
But I am getting this error:
Caused by: org.ietf.jgss.GSSException: No valid
> Has there been any study of how much compressing Hive Parquet tables with
> snappy reduces storage space or simply the table size in quantitative terms?
http://www.slideshare.net/oom65/file-format-benchmarks-avro-json-orc-parquet/20
Since Snappy is LZ77-based (with no entropy coding), I would assume it would be
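I don't know of a formal study, but the reduction is easy to measure on your own data. A minimal HiveQL sketch, assuming a source table `sales` (a placeholder name, not from this thread): create one uncompressed and one Snappy-compressed Parquet copy, then compare their sizes.

```sql
-- Placeholder table names; parquet.compression controls the codec per table.
CREATE TABLE sales_none
STORED AS PARQUET
TBLPROPERTIES ('parquet.compression'='UNCOMPRESSED')
AS SELECT * FROM sales;

CREATE TABLE sales_snappy
STORED AS PARQUET
TBLPROPERTIES ('parquet.compression'='SNAPPY')
AS SELECT * FROM sales;

-- Compare the totalSize values reported in the table parameters:
DESCRIBE FORMATTED sales_none;
DESCRIBE FORMATTED sales_snappy;
```

The ratio depends heavily on the data: repetitive string columns shrink far more than already-compact numeric ones, which is why benchmarks on different datasets (as in the slides above) report different numbers.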
Mich,
Here are the benchmarks that I did using three different types of data:
http://www.slideshare.net/HadoopSummit/file-format-benchmark-avro-json-orc-parquet
I assume you are comparing parquet-snappy vs parquet-none.
.. Owen
On Wed, Jan 25, 2017 at 1:37 PM, Mich Talebzadeh wrote:
Hi,
Has there been any study of how much compressing Hive Parquet tables with
snappy reduces storage space or simply the table size in quantitative terms?
Thanks
Dr Mich Talebzadeh
LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
> [Error 40003]: Only External tables can have an explicit location
…
> using hive 1.2. I got this error. This was definitely not a requirement
> before
Are you using Apache Hive or some vendor fork?
Some BI engines demand there be no aliasing for tables, so each table needs a
unique location
Wow. This is gold.
Dudu
From: Dmitry Tolpeko [mailto:dmtolp...@gmail.com]
Sent: Wednesday, January 25, 2017 6:47 PM
To: user@hive.apache.org
Subject: Hive table for a single file: CREATE/ALTER TABLE differences
Does anyone here have a recommendation for a Windows ODBC driver that will work
with Hive 1.2.1?
Thanks,
~ Shawn
Shawn Lavelle
Software Development
4101 Arrowhead Drive
Medina, Minnesota 55340-9457
Phone: 763 551 0559
Fax: 763 551 0750
I accidentally noticed one feature:
(it is well known that in CREATE TABLE you must specify a directory for the
table LOCATION, otherwise you get: "Can't make directory for path
's3n://dir/file' since it is a file.")
But at the same time, ALTER TABLE SET LOCATION 's3n://dir/file' works
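The asymmetry can be reproduced with a pair of statements like the following (a sketch; the table name is a placeholder and the path is the example path from the message above):

```sql
-- CREATE TABLE insists that LOCATION be a directory:
CREATE TABLE t (c STRING)
LOCATION 's3n://dir/file';                     -- fails: "... since it is a file"

-- but the same path is accepted after the fact:
CREATE TABLE t (c STRING);
ALTER TABLE t SET LOCATION 's3n://dir/file';   -- works: the table now points at a single file
```

So ALTER TABLE SET LOCATION is effectively the only way to map a Hive table onto a single file rather than a directory.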
[Error 40003]: Only External tables can have an explicit location
Using Hive 1.2 I got this error. This was definitely not a requirement before.
Why was this added? The only difference for EXTERNAL tables used to be that
dropping the table does not drop the physical files.
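For what it's worth, the restriction the error describes looks like this in practice (a sketch with placeholder names; stock Apache Hive 1.2 accepts both statements, which is why a vendor fork is suspected above):

```sql
CREATE EXTERNAL TABLE logs_ext (line STRING)
LOCATION '/data/logs';      -- allowed: EXTERNAL table with an explicit LOCATION

CREATE TABLE logs_managed (line STRING)
LOCATION '/data/logs';      -- rejected on the build in question:
                            -- [Error 40003]: Only External tables can have an explicit location
```

On builds that enforce this, a managed table is forced into the default warehouse directory, and any table at a custom path must be declared EXTERNAL.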
Hi folks,
I’m working on a workflow that needs to hit an API to fetch Hive table
schemas.
Currently, I’m using the HCatalog Templeton API
"apiURL:50111/templeton/v1/ddl/database/dbName/table/tableName?user.name=hive”
but as the rate of requests is increasing (currently the max is 10 per
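If the rate limit becomes a blocker, note that the Templeton DDL resource is essentially a wrapper over HiveQL metadata commands, so the same schema can be pulled straight from HiveServer2 (dbName and tableName below are the placeholders from the URL above):

```sql
USE dbName;
DESCRIBE FORMATTED tableName;  -- column names and types, plus location, format, and table properties
```

A plain `DESCRIBE tableName` returns just the column list if the extra metadata is not needed.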