Just submitted a PR to fix this https://github.com/apache/spark/pull/3059
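The gist of the fix, sketched very loosely below (illustrative only; see the PR for the real change), is to make sure every column type reports a statistics row of the expected width, instead of a no-op collector that reports nothing:

// Illustrative sketch only, not the code from the PR. The idea: replace
// the no-op statistics collector with one that always emits a stats row
// of the expected, fixed width.
class BooleanColumnStats {
  private var lower = true   // false sorts below true
  private var upper = false
  private var nullCount = 0

  def gatherStats(value: Any): Unit = value match {
    case null       => nullCount += 1
    case b: Boolean => if (!b) lower = false else upper = true
    case _          => // ignore; sketch only
  }

  // Fixed width, so code indexing into the combined statistics row
  // never runs past the end.
  def collectedStatistics: Seq[Any] = Seq(lower, upper, nullCount)
}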
On Sun, Nov 2, 2014 at 12:36 AM, Jean-Pascal Billaud wrote:
Great! Thanks.
Sent from my iPad
On Nov 1, 2014, at 8:35 AM, Cheng Lian wrote:
Hi Jean,
Thanks for reporting this. This is indeed a bug: for some column types (Binary,
Array, Map, Struct, and, unfortunately, for some reason, Boolean), a
NoopColumnStats is used to collect column statistics, which causes this
issue. I filed SPARK-4182 to track it and will fix this ASAP.
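To make the failure mode concrete, here is a simplified sketch of the mechanism (illustrative names only, not the actual Spark source): each cached column contributes a fixed set of statistics (lower bound, upper bound, null count, ...) to a combined statistics row, and the code that reads those statistics indexes into the row by position.

// Simplified sketch of the mechanism, not the actual Spark source.
trait ColumnStats {
  // One entry per statistic collected for this column.
  def collectedStatistics: Seq[Any]
}

class IntColumnStats extends ColumnStats {
  private var lower = Int.MaxValue
  private var upper = Int.MinValue
  def gatherStats(v: Int): Unit = {
    if (v < lower) lower = v
    if (v > upper) upper = v
  }
  def collectedStatistics: Seq[Any] = Seq(lower, upper)
}

// The buggy case: a no-op collector contributes nothing, so the combined
// statistics row comes out shorter than the reader expects, and indexing
// past its end throws java.lang.ArrayIndexOutOfBoundsException.
class NoopColumnStats extends ColumnStats {
  def collectedStatistics: Seq[Any] = Seq.empty
}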
Cheng
Hmmm, this looks like a bug. Can you file a JIRA?
On Thu, Oct 30, 2014 at 4:04 PM, Jean-Pascal Billaud wrote:
Hi,
While testing Spark SQL on top of our Hive metastore, I am getting a
java.lang.ArrayIndexOutOfBoundsException while reusing a cached RDD table.
Basically, I have a table "mtable" partitioned by a "date" field in Hive,
and below is the Scala code I am running in spark-shell:
val sqlContex
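The snippet above is cut off in the archive; a minimal sketch of the kind of session described, assuming only the "mtable" table and its "date" partition column from the report (everything else is illustrative):

// Hypothetical reconstruction of the truncated spark-shell session; only
// "mtable" and its "date" partition column come from the report above.
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

// Cache the Hive table as an in-memory columnar table.
sqlContext.cacheTable("mtable")

// The first query materializes the cache...
sqlContext.sql("SELECT * FROM mtable WHERE date = '2014-10-01'").count()

// ...and reusing the cached table is where the
// java.lang.ArrayIndexOutOfBoundsException shows up.
sqlContext.sql("SELECT * FROM mtable WHERE date = '2014-10-02'").count()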