Hi,

we have an issue with our current deployment of Zeppelin on Kubernetes, more
precisely with the Spark interpreter.

For reference, the Spark context is: Scala 2.12.10 / Spark 2.4.7.

We see some weird behaviour when running the Spark interpreter in scoped,
per-note mode.

To reproduce: we restart the Spark interpreter (scoped, per note) and create
two notebooks (A & B) with the same code:

%spark
import spark.implicits._

List(1, 2, 3).toDS.map(_ + 1).show

1. We run notebook A: it succeeds.
2. We run notebook B: it fails with a ClassCastException:

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0
in stage 24.0 failed 4 times, most recent failure: Lost task 0.3 in stage
24.0 (TID 161, 10.11.18.133, executor 2): java.lang.ClassCastException:
cannot assign instance of java.lang.invoke.SerializedLambda to field
org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance
of org.apache.spark.rdd.MapPartitionsRDD
    at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2287)
    at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1417)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2293)
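In case it helps narrow things down: the error suggests the executors cannot
deserialize the REPL-generated lambda for `map(_ + 1)` (on Scala 2.12,
closures serialize as `java.lang.invoke.SerializedLambda`). One sketch of a
workaround we have been considering, untested and assuming the problem is
specific to lambda deserialization, is to replace the lambda with a named
serializable function, which serializes as an ordinary class instead:

```scala
%spark
import spark.implicits._

// Hypothetical workaround: a named function object instead of an anonymous
// lambda. It is serialized as a plain class, not a SerializedLambda, which
// may sidestep the cast failure on the executors.
object AddOne extends (Int => Int) with Serializable {
  def apply(x: Int): Int = x + 1
}

List(1, 2, 3).toDS.map(AddOne).show
```

Even if that works, it would only be a band-aid; we would still like to
understand why the scoped per-note mode breaks the second notebook.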

Does anyone have a working Zeppelin deployment on Kubernetes with Spark 2.4 /
Scala 2.12?

Or is anyone interested in making some $$$ by helping us fix the issue?

cheers
