Hadoop 3.0 does bring some interesting benefits of its own, such as reduced
storage needs: with erasure coding you no longer need to keep three replicas
of every block for reliability. That alone may be convincing.
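The saving comes from HDFS erasure coding, which Hadoop 3 offers as an alternative to 3-way replication. A back-of-the-envelope sketch of the difference (the 100 GB figure is an arbitrary example, and RS-6-3 is one of the built-in Hadoop 3 erasure-coding policies):

```python
# Rough storage-overhead comparison: 3-way replication vs. RS-6-3 erasure coding.
data_gb = 100

# Replication stores every block three times -> 200% overhead.
replicated_gb = data_gb * 3

# RS-6-3 stores 6 data blocks plus 3 parity blocks -> 50% overhead.
erasure_coded_gb = data_gb * (6 + 3) / 6

print(replicated_gb, erasure_coded_gb)  # 300 GB vs. 150 GB on disk
```

Erasure coding trades that disk space for extra CPU and network cost on reads and writes, so it is usually applied to cold data rather than cluster-wide.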
> On 22. Jul 2018, at 08:28, 彭鱼宴 <461292...@qq.com> wrote:
Hi Tanvi,
Thanks! I will check that and have a talk with my colleagues to consider the
upgrade.
Best,
Zhefu Peng
------------------ Original Message ------------------
From: "Tanvi Thacker";
Date: Sat, Jul 21, 2018, 3:24
To: "user";
Subject: Re: Does Hive 3.0 only works
Hi,
Here is something that has confused me recently: I have not installed or built
Snappy on my Hadoop cluster, yet I was still able to test and compare the
compression ratios of the Parquet and ORC storage formats. During the test, I
could set the compression codec for both storage formats, for example, using
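For context, choosing a compression codec for these two formats in Hive is typically done per table or per session; a minimal sketch (the table names and columns are illustrative):

```sql
-- ORC: compression is a table property.
CREATE TABLE t_orc (id INT, val STRING)
  STORED AS ORC
  TBLPROPERTIES ("orc.compress" = "SNAPPY");

-- Parquet: the codec can be set as a session property before writing.
SET parquet.compression = SNAPPY;
CREATE TABLE t_parquet (id INT, val STRING)
  STORED AS PARQUET;
```

Note that Hive ships these codecs with the ORC and Parquet writer libraries, which is why such tests can succeed even without a separate Snappy installation on the cluster.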