Re: java.lang.IllegalArgumentException: Unsupported class file major version 55

2019-02-10 Thread Felix Cheung
And it might not work completely. Spark only officially supports JDK 8.

I’m not sure whether JDK 9+ support is complete.



From: Jungtaek Lim 
Sent: Thursday, February 7, 2019 5:22 AM
To: Gabor Somogyi
Cc: Hande, Ranjit Dilip (Ranjit); user@spark.apache.org
Subject: Re: java.lang.IllegalArgumentException: Unsupported class file major 
version 55

ASM 6 doesn't support Java 11. In the master branch (for Spark 3.0) there's a
dependency upgrade to ASM 7 and also some effort (if my understanding is
right) to support Java 11, so you may need to use a lower JDK version (8 is
safest) for Spark 2.4.0, and try out the master branch to prepare for Java 11.
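
If downgrading the runtime JDK isn't possible right away, one common
mitigation (a sketch, not something from this thread) is to force the Maven
build to emit Java 8 bytecode, so Spark's shaded ASM 6 can at least read your
own classes:

    <properties>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
    </properties>

Note this only addresses the "major version 55" read error for your own
classes; the Spark 2.4 runtime itself can still fail elsewhere on JDK 11, so
JDK 8 remains the safe choice.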

Thanks,
Jungtaek Lim (HeartSaVioR)

On Thu, Feb 7, 2019 at 9:18 PM, Gabor Somogyi <gabor.g.somo...@gmail.com> wrote:
Hi Hande,

"Unsupported class file major version 55" means java incompatibility.
This error means you're trying to load a Java "class" file that was compiled 
with a newer version of Java than you have installed.
For example, your .class file could have been compiled for JDK 8, and you're 
trying to run it with JDK 7.
Are you sure 11 is the only JDK which is the default?
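
For reference, major version 52 is Java 8, 53 is Java 9, 54 is Java 10, and
55 is Java 11. A quick way to check which version a given class file targets
(a sketch; "MyClass.class" is a placeholder path):

    // The class file header is: u4 magic (0xCAFEBABE), u2 minor, u2 major,
    // so the major version sits in bytes 6-7.
    import java.nio.file.{Files, Paths}
    val bytes = Files.readAllBytes(Paths.get("MyClass.class"))
    val major = ((bytes(6) & 0xff) << 8) | (bytes(7) & 0xff)
    println(s"major version: $major") // 55 means the class targets Java 11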

Only a small number of people are playing with JDK 11, so it's not heavily
tested or used. Spark may or may not work on it, but it's generally not
suggested for production.

BR,
G


On Thu, Feb 7, 2019 at 12:53 PM Hande, Ranjit Dilip (Ranjit) <ha...@avaya.com> wrote:
Hi,

I am developing a Java process which will consume data from Kafka using
Apache Spark Streaming.
For this I am using the following:

Java:
openjdk version "11.0.1" 2018-10-16 LTS
OpenJDK Runtime Environment Zulu11.2+3 (build 11.0.1+13-LTS)
OpenJDK 64-Bit Server VM Zulu11.2+3 (build 11.0.1+13-LTS, mixed mode)

Maven: (Spark Streaming)

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
    <version>2.4.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>2.4.0</version>
</dependency>
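
For context, a minimal sketch of the 0-10 direct-stream consumer this setup
implies (the master, bootstrap server, group id, and topic name below are
placeholder assumptions, not details from the original report):

    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka010.KafkaUtils
    import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
    import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

    val conf = new SparkConf()
      .setAppName("kafka-consumer-sketch")
      .setMaster("local[2]") // placeholder for local testing
    val ssc = new StreamingContext(conf, Seconds(10))
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092", // placeholder
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "sketch-group") // placeholder
    // Subscribe to a placeholder topic and print each record's value.
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent, Subscribe[String, String](Seq("mytopic"), kafkaParams))
    stream.map(_.value).print()
    ssc.start()
    ssc.awaitTermination()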

I am able to compile the project successfully, but when I try to run it I get
the following error:

{"@timestamp":"2019-02-07T11:54:30.624+05:30","@version":"1","message":"Application
 run 
failed","logger_name":"org.springframework.boot.SpringApplication","thread_name":"main","level":"ERROR","level_value":4,"stack_trace":"java.lang.IllegalStateException:
 Failed to execute CommandLineRunner at 
org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:816)
 at 
org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:797)
 at org.springframework.boot.SpringApplication.run(SpringApplication.java:324) 
at 
com.avaya.measures.AgentMeasures.AgentMeasuresApplication.main(AgentMeasuresApplication.java:41)
 at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native 
Method) at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
 at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.base/java.lang.reflect.Method.invoke(Method.java:566) at 
org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48) 
at org.springframework.boot.loader.Launcher.launch(Launcher.java:87) at

org.springframework.boot.loader.Launcher.launch(Launcher.java:50) at 
org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51)\r\nCaused 
by: java.lang.IllegalArgumentException: Unsupported class file major version 55 
at

 org.apache.xbean.asm6.ClassReader.(ClassReader.java:166) at 
org.apache.xbean.asm6.ClassReader.(ClassReader.java:148) at 
org.apache.xbean.asm6.ClassReader.(ClassReader.java:136) at 
org.apache.xbean.asm6.ClassReader.(ClassReader.java:237) at 
org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:49) 
at 
org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:517)
 at 
org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:500)
 at 
scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
 at 
scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
 at 
scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
 at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236) 
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40) at 
scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:134) at 
scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732) 
at 
org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:500)
 at org.apache.xbean.asm6.ClassReader.readCode(ClassReader.java:2175) at 
org.apache.xbean.asm6.ClassReader.readMethod(ClassReader.java:1238) at 
org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:631) at 
org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:355) at 

Re: Spark on YARN, HowTo kill executor or individual task?

2019-02-10 Thread Gabor Somogyi
Another approach is adding an artificial exception into the application's
source code, like this:

val query = input.toDS.map(_ / 0).writeStream.format("console").start()
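
For instance, a fuller self-contained sketch of the same trick (assuming an
active SparkSession named spark; MemoryStream is Spark's internal testing
source, used here only to have something to feed the query):

    import org.apache.spark.sql.SQLContext
    import org.apache.spark.sql.execution.streaming.MemoryStream

    implicit val sqlCtx: SQLContext = spark.sqlContext
    import spark.implicits._

    val input = MemoryStream[Int]
    // _ / 0 throws ArithmeticException inside the task, so the query fails
    // as soon as a micro-batch actually processes data.
    val query = input.toDS.map(_ / 0).writeStream.format("console").start()
    input.addData(1, 2, 3)
    query.processAllAvailable() // surfaces the failure as a StreamingQueryException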

G




Re: Spark on YARN, HowTo kill executor or individual task?

2019-02-10 Thread Serega Sheypak
Hi BR,
thanks for your reply. I want to mimic the issue and kill tasks at a certain
stage. Killing an executor is also an option for me.
I'm curious: how do the core Spark contributors test Spark's fault tolerance?




Re: Spark on YARN, HowTo kill executor or individual task?

2019-02-10 Thread Gabor Somogyi
Hi Serega,

If I understand your problem correctly, you would like to kill only one
executor and leave the rest of the app untouched.
If that's the case, yarn application -kill is not what you want, because it
stops the whole application.

I've done a similar thing when testing Spark's HA features:
- jps -vlm | grep
"org.apache.spark.executor.CoarseGrainedExecutorBackend.*applicationid"
- kill -9 pidofoneexecutor

Be aware that on a multi-node cluster you should first check whether at least
one executor process is running on the specific node (it's not guaranteed).
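
A programmatic alternative (a sketch, not something from this thread):
SparkContext exposes a developer API that asks the cluster manager to kill a
specific executor, which avoids shelling into the node:

    // Hypothetical sketch from inside the driver; sc is the active SparkContext.
    // Executor IDs such as "1" are listed on the Executors tab of the Spark UI.
    val requested: Boolean = sc.killExecutor("1") // true if the kill request was submitted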
Happy killing...

BR,
G




Re: Spark on YARN, HowTo kill executor or individual task?

2019-02-10 Thread Jörn Franke
yarn application -kill applicationid ?





Spark on YARN, HowTo kill executor or individual task?

2019-02-10 Thread Serega Sheypak
Hi there!
I have a weird issue that appears only when tasks fail at a specific stage,
and I would like to imitate the failure on my own.
The plan is to run the problematic app and then kill an entire executor, or
some tasks, when execution reaches a certain stage.

Is it doable?