Getting Exceptions/WARN during random runs for same dataset

2016-01-29 Thread Khusro Siddiqui
Hi Everyone,

Environment used: DataStax Enterprise 4.8.3, which is bundled with Spark
1.4.1 and Scala 2.10.5.

I am using DataFrames to query Cassandra, do some processing, and store the
result back into Cassandra. The job is submitted using spark-submit on a
cluster of 3 nodes. While doing so I get the following WARN message, with
two underlying causes:

WARN  2016-01-28 19:08:18 org.apache.spark.scheduler.TaskSetManager: Lost task 99.0 in stage 2.0 (TID 107, 10.2.1.82): java.io.InvalidClassException: org.apache.spark.sql.types.TimestampType$; unable to create instance
Caused by: java.lang.reflect.InvocationTargetException
Caused by: java.lang.UnsupportedOperationException: tail of empty list
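
For context: "tail of empty list" is the message thrown by `Nil.tail` in Scala, and the `scala.collection.immutable.$colon$colon.readObject(List.scala:362)` frames further down show it surfacing while Java-deserializing an immutable Scala `List`. A minimal illustration of where the message comes from (this reproduces only the message, not the underlying deserialization race):

```scala
// Nil.tail always throws; this only demonstrates the exception message,
// not the concurrent-deserialization bug behind the reported WARNs.
try {
  List.empty[Int].tail
} catch {
  case e: UnsupportedOperationException =>
    println(e.getMessage) // prints "tail of empty list"
}
```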


For example, if I run the same job on the same input data, say 20 times:

- 11 times it runs successfully without any WARN messages

- 4 times it runs successfully but with the messages above

- 6 times it runs successfully while randomly emitting one or two of the
exceptions above


In all 20 runs the output data comes out as expected; there is no error in
it. My concern is why these messages appear only on some spark-submit runs
rather than every time. Also, the stack trace does not point to any
specific line in my code. The full stack trace follows. Please let me know
if you need any other information.


WARN  2016-01-28 19:08:24 org.apache.spark.scheduler.TaskSetManager: Lost task 188.0 in stage 16.0 (TID 637, 10.2.1.82): java.io.InvalidClassException: org.apache.spark.sql.types.TimestampType$; unable to create instance
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1788)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1707)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1345)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
    at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
    at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1896)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
    at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
    at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1896)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    at 
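
For reference, the read/process/write pattern described at the top of this message, using the DataFrame data source of the Cassandra connector bundled with DSE, looks roughly like the sketch below. The keyspace, table, and column names are illustrative placeholders, not taken from the actual job:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Minimal sketch of a DataFrame read/process/write job against Cassandra.
// Keyspace/table/column names ("demo", "events", "event_type", ...) are
// placeholders for illustration only.
object CassandraDfJob {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("cassandra-df-job")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    // Read a Cassandra table as a DataFrame via the connector's data source.
    val events = sqlContext.read
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "demo", "table" -> "events"))
      .load()

    // Some processing, e.g. an aggregation.
    val counts = events.groupBy("event_type").count()

    // Write the result back to another Cassandra table.
    counts.write
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "demo", "table" -> "event_counts"))
      .mode("append")
      .save()
  }
}
```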

Re: Getting Exceptions/WARN during random runs for same dataset

2016-01-29 Thread Shixiong(Ryan) Zhu
It's a known issue. See https://issues.apache.org/jira/browse/SPARK-10719

On Thu, Jan 28, 2016 at 5:44 PM, Khusro Siddiqui  wrote:

> It is happening on random executors on random nodes, not on any specific
> node every time.
> Or it is not happening at all.
>
> On Thu, Jan 28, 2016 at 7:42 PM, Ted Yu  wrote:
>
>> Did the UnsupportedOperationExceptions happen from the executors on all
>> the nodes or only on one node?
>>
>> Thanks


Re: Getting Exceptions/WARN during random runs for same dataset

2016-01-28 Thread Khusro Siddiqui
It is happening on random executors on random nodes, not on any specific
node every time.
Or it is not happening at all.

On Thu, Jan 28, 2016 at 7:42 PM, Ted Yu  wrote:

> Did the UnsupportedOperationExceptions happen from the executors on all
> the nodes or only on one node?
>
> Thanks

Re: Getting Exceptions/WARN during random runs for same dataset

2016-01-28 Thread Ted Yu
Did the UnsupportedOperationExceptions happen from the executors on all the
nodes or only on one node?

Thanks
