Sean - thanks. Definitely related to SPARK-12154.

Is there a way to continue using Jersey 1 in our existing working environment?

Or, what's the best way to upgrade an existing Jersey 1 environment to Jersey
2?

This does break all of our Spark jobs running Spark 2.0 on YARN.
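For now, one stopgap we may try (an untested sketch; the jar locations below are guesses for a typical Hadoop 2.7 layout) is to hand the Jersey 1 client jars back to Spark explicitly via --jars and the driver classpath:

```shell
# jersey1_jars <libdir>: comma-separated list of the Jersey 1 client/core
# jars in <libdir>. /usr/lib/hadoop/lib is a guess -- adjust for your install.
jersey1_jars() {
  ls "$1"/jersey-client-1.*.jar "$1"/jersey-core-1.*.jar 2>/dev/null | paste -sd, -
}

JERSEY_JARS=$(jersey1_jars /usr/lib/hadoop/lib)
# --jars wants commas; extraClassPath wants colons.
EXTRA_CP=$(echo "$JERSEY_JARS" | tr ',' ':')

# Print (rather than run) the spark-sql invocation with the jars supplied:
echo bin/spark-sql --master yarn \
  --jars "$JERSEY_JARS" \
  --conf spark.driver.extraClassPath="$EXTRA_CP"
```

The final echo only prints the command so it can be inspected; drop it to actually launch.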

JESSE CHEN
Big Data Performance | IBM Analytics
Office: 408 463 2296
Mobile: 408 828 9068
Email: jfc...@us.ibm.com






From:   Sean Owen <so...@cloudera.com>
To:     Jesse F Chen/San Francisco/IBM@IBMUS
Cc:     spark users <user@spark.apache.org>, dev
            <d...@spark.apache.org>, Roy Cecil <roy.ce...@ie.ibm.com>, Matt
            Cheah <mch...@palantir.com>
Date:   05/09/2016 02:19 PM
Subject:        Re: spark 2.0 issue with yarn?



Hm, this may be related to updating to Jersey 2, which happened 4 days
ago: https://issues.apache.org/jira/browse/SPARK-12154

That is a Jersey 1 class that's missing. How are you building and running
Spark?

I think the theory was that Jersey 1 would still be supplied at runtime. We
may have to revise the exclusions.
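For anyone who wants to check that theory, something like this (the Hadoop lib directory is hypothetical; adjust for your cluster) would confirm whether a jar containing the missing Jersey 1 class is actually on the node:

```shell
# find_class_jar <libdir> <class-entry>: print every jar under <libdir> that
# contains the given class file entry. Zip archives store entry names as
# plain bytes, so a simple grep over the jar is enough for this check.
find_class_jar() {
  for j in "$1"/*.jar; do
    if grep -q "$2" "$j" 2>/dev/null; then
      echo "$j"
    fi
  done
}

# The class the YARN TimelineClient fails to load; lib dir is a guess.
find_class_jar /usr/lib/hadoop/lib 'com/sun/jersey/api/client/config/ClientConfig.class'
```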

On Mon, May 9, 2016 at 9:24 PM, Jesse F Chen <jfc...@us.ibm.com> wrote:
  I had been running fine until builds around 05/07/2016....

  If I used "--master yarn" in builds after 05/07, I got the following
  error; it looks like some jars are missing.

  I am using YARN 2.7.2 and Hive 1.2.1.

  Do I need something new to deploy related to YARN?

  bin/spark-sql --driver-memory 10g --verbose --master yarn --packages
  com.databricks:spark-csv_2.10:1.3.0 --executor-memory 4g --num-executors
  20 --executor-cores 2

  16/05/09 13:15:21 INFO server.Server: jetty-8.y.z-SNAPSHOT
  16/05/09 13:15:21 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4041
  16/05/09 13:15:21 INFO util.Utils: Successfully started service 'SparkUI' on port 4041.
  16/05/09 13:15:21 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://bigaperf116.svl.ibm.com:4041
  Exception in thread "main" java.lang.NoClassDefFoundError: com/sun/jersey/api/client/config/ClientConfig
  at org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:45)
  at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:163)
  at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
  at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:150)
  at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
  at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:148)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:502)
  at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2246)
  at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:762)
  at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:57)
  at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:281)
  at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:138)
  at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:497)
  at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:727)
  at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
  at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
  Caused by: java.lang.ClassNotFoundException: com.sun.jersey.api.client.config.ClientConfig
  at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
  at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
  ... 22 more
  16/05/09 13:15:21 INFO storage.DiskBlockManager: Shutdown hook called
  16/05/09 13:15:21 INFO util.ShutdownHookManager: Shutdown hook called
  16/05/09 13:15:21 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-ac33b501-b9c3-47a3-93c8-fa02720bf4bb
  16/05/09 13:15:21 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-65cb43d9-c122-4106-a0a8-ae7d92d9e19c
  16/05/09 13:15:21 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-65cb43d9-c122-4106-a0a8-ae7d92d9e19c/userFiles-46dde536-29e5-46b3-a530-e5ad6640f8b2







