Re: Spark2 with YARN
That stopped the one error from occurring. Thanks.

On Fri, May 25, 2018 at 12:20 AM, Jeff Zhang wrote:
> Just disable the timeline service of YARN: set yarn.timeline-service.enabled
> to false in yarn-site.xml and restart Hadoop.
Re: Spark2 with YARN
Just disable the timeline service of YARN: set yarn.timeline-service.enabled to false in yarn-site.xml and restart Hadoop.

Pat Ferrel wrote on Fri, May 25, 2018 at 7:56 AM:
> I’m having a java.lang.NoClassDefFoundError in a different context and a
> different class. Have you tried this without YARN? Sorry, I can’t find the
> rest of this thread.
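For reference, the change described above would look roughly like the following fragment; merge the property into the cluster's existing yarn-site.xml (its path varies by distribution, e.g. under /etc/hadoop/conf on HDP) rather than replacing the file, then restart the YARN services:

```
<!-- yarn-site.xml: disable the YARN Application Timeline Service so the
     YARN client never constructs a TimelineClient (the code path that
     requires the missing jersey class). -->
<property>
  <name>yarn.timeline-service.enabled</name>
  <value>false</value>
</property>
```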
Re: Spark2 with YARN
I’m having a java.lang.NoClassDefFoundError in a different context and a different class. Have you tried this without YARN? Sorry, I can’t find the rest of this thread.

From: Miller, Clifford
Reply: user@predictionio.apache.org
Date: May 24, 2018 at 4:16:58 PM
To: user@predictionio.apache.org
Subject: Spark2 with YARN

I've set up a cluster using Hortonworks HDP with Ambari, all running in AWS. I then created a separate EC2 instance and installed PIO 0.12.1, Hadoop, Elasticsearch, HBase, and Spark2. I copied the configurations from the HDP cluster and then ran pio-start-all. The pio-start-all completes successfully, and running "pio status" also shows success. I'm following the "Text Classification Engine Tutorial" and have imported the data. I'm using the following command to train: "pio train -- --master yarn". After running the command I get the exception below. Does anyone have any ideas of what I may have missed during my setup?

Thanks in advance.

Exception follows:

Exception in thread "main" java.lang.NoClassDefFoundError: com/sun/jersey/api/client/config/ClientConfig
        at org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:45)
        at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:163)
        at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
        at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:152)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:156)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
        at org.apache.predictionio.workflow.WorkflowContext$.apply(WorkflowContext.scala:45)
        at org.apache.predictionio.workflow.CoreWorkflow$.runTrain(CoreWorkflow.scala:59)
        at org.apache.predictionio.workflow.CreateWorkflow$.main(CreateWorkflow.scala:251)
        at org.apache.predictionio.workflow.CreateWorkflow.main(CreateWorkflow.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.sun.jersey.api.client.config.ClientConfig
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 20 more

--
Clifford Miller
Mobile | 321.431.9089
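The "Caused by: java.lang.ClassNotFoundException" above means no jar on the driver's classpath contains com.sun.jersey.api.client.config.ClientConfig; this commonly shows up because Spark 2 no longer bundles the Jersey 1 client that Hadoop's TimelineClient uses. A quick way to confirm whether any jar in a directory ships a given class is to scan the jar entries. This is a generic sketch, not part of PIO or Spark; on a real host you would point `jar_dir` at e.g. $SPARK_HOME/jars, and the demo only builds a scratch jar so the scan has something to find:

```python
import os
import zipfile


def jars_containing(jar_dir, class_path):
    """Return the jars under jar_dir whose entries include class_path,
    a '/'-separated path such as
    'com/sun/jersey/api/client/config/ClientConfig.class'."""
    hits = []
    for name in sorted(os.listdir(jar_dir)):
        if not name.endswith(".jar"):
            continue
        path = os.path.join(jar_dir, name)
        with zipfile.ZipFile(path) as jar:  # jars are plain zip archives
            if class_path in jar.namelist():
                hits.append(path)
    return hits


if __name__ == "__main__":
    import tempfile

    # Demo: write a scratch jar containing the class path from the trace.
    tmp = tempfile.mkdtemp()
    cls = "com/sun/jersey/api/client/config/ClientConfig.class"
    with zipfile.ZipFile(os.path.join(tmp, "demo.jar"), "w") as jar:
        jar.writestr(cls, b"")
    print(jars_containing(tmp, cls))  # the scratch demo.jar is reported
```

If the scan of the Spark and Hadoop jar directories comes back empty for that class, the timeline-service workaround in this thread (or restoring a Jersey 1 client jar to the classpath) is the usual remedy.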