aded Kryo is being used here.
>
> /cc +Marcelo Vanzin and +Steve Loughran, who may know more.
>
> On Wed, Apr 6, 2016 at 6:08 PM Soam Acharya wrote:
Hi folks,
I have a build of Spark 1.6.1 on which Spark SQL seems to be functional
outside of windowing functions. For example, I can create a simple external
table via Hive:
CREATE EXTERNAL TABLE PSTable (pid int, tty string, time string, cmd string)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
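
For reference, a minimal window-function query over this table might look like the following. This is an illustrative sketch, not the exact query from the report; it borrows the column names from the CREATE TABLE above. One common cause of window queries failing on Spark 1.6 is using a plain SQLContext, since window functions in that release require a HiveContext.

```sql
-- Hypothetical example: number the processes on each tty, ordered by time.
-- On Spark 1.6 this needs a HiveContext rather than a plain SQLContext.
SELECT pid, tty, cmd,
       ROW_NUMBER() OVER (PARTITION BY tty ORDER BY time) AS rn
FROM PSTable;
```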
Hi folks,
I understand that invoking sqlContext.cacheTable("tableName") will load the
table into a compressed in-memory columnar format. When Spark is launched
via spark-shell in YARN client mode, is the table loaded into the local
Spark driver process in addition to the executors in the Hadoop cluster?
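
For concreteness, the caching being asked about can also be expressed in Spark SQL directly (a sketch; "PSTable" is borrowed from the earlier message in this thread, and the cache is populated lazily on the first scan of the table):

```sql
-- Equivalent of sqlContext.cacheTable("PSTable"): marks the table for
-- the in-memory columnar cache; the load happens on first use.
CACHE TABLE PSTable;
```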