Hi,

I second Luke here: you have to use Spark 1.x, or use the PR that adds Spark
2.x support.

Regards
JB

On 12/04/2017 08:14 PM, Lukasz Cwik wrote:
It seems like you're trying to use Spark 2.1.0. Apache Beam currently relies on Spark 1.6.3. There is an open pull request[1] to migrate to Spark 2.2.0.

1: https://github.com/apache/beam/pull/4208/
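For reference, pinning the runner's Spark dependency usually means something like the following in the pom.xml. This is only a sketch: the Beam version shown is an assumption (use whatever release you are actually on), and `provided` scope keeps the cluster's own Spark on the runtime classpath.

```xml
<!-- Beam's Spark runner (version is illustrative) -->
<dependency>
  <groupId>org.apache.beam</groupId>
  <artifactId>beam-runners-spark</artifactId>
  <version>2.2.0</version>
</dependency>
<!-- Spark 1.6.3, the line the runner currently targets;
     "provided" so the cluster's Spark is used at runtime -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.6.3</version>
  <scope>provided</scope>
</dependency>
```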

On Mon, Dec 4, 2017 at 10:58 AM, Opitz, Daniel A <[email protected]> wrote:

    We are trying to submit a Spark job through YARN with the following
    command:

    spark-submit --conf spark.yarn.stagingDir=/path/to/stage --verbose --class
    com.my.class --jars /path/to/jar1,path/to/jar2
    /path/to/main/jar/application.jar

    The application shows up in the YARN scheduler; however, it never appears
    to start the Application Master container. The trace is:

    17/12/04 11:21:32 WARN Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
    17/12/04 11:27:16 ERROR SparkContext: Error initializing SparkContext.
    org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:85)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:62)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:156)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
        at org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:98)
        at org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:68)
        at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:197)
        at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:86)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:297)
        at com.my.class.main(myclass.java:202)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    17/12/04 11:27:16 WARN YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!
    17/12/04 11:27:16 WARN MetricsSystem: Stopping a MetricsSystem that is not running
    Exception in thread "main" org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:85)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:62)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:156)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
        at org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:98)
        at org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:68)
        at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:197)
        at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:86)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:297)
        at com.my.class.main(myclass.java:202)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

    We are able to run the sample Spark Pi job to completion without errors.

    Version:

    $ spark-submit --version
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 2.1.0-mapr-1703
          /_/

    Using Scala version 2.11.8, Java HotSpot(TM) 64-Bit Server VM, 1.8.0_101

    We think the issue may be with how we are setting up our pipeline:

    PipelineOptions options = PipelineOptionsFactory.create();
    options.setRunner(SparkRunner.class);
    Pipeline p = Pipeline.create(options);

    We run our Pipeline with:

    p.run(options);

    The pipeline was running successfully with the DirectRunner.
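    For comparison, here is a minimal sketch of the same setup going through
    the runner-specific options interface (this assumes the Beam 2.x
    SparkPipelineOptions API; it is illustrative, not the author's code):

    ```java
    import org.apache.beam.runners.spark.SparkPipelineOptions;
    import org.apache.beam.runners.spark.SparkRunner;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class SparkRunnerSketch {
        public static void main(String[] args) {
            // Runner-specific options; flags such as --runner=SparkRunner
            // can also come in through args instead of being set in code.
            SparkPipelineOptions options = PipelineOptionsFactory
                .fromArgs(args).withValidation().as(SparkPipelineOptions.class);
            options.setRunner(SparkRunner.class);

            Pipeline p = Pipeline.create(options);
            // ... build transforms here ...

            // run() takes no arguments; waitUntilFinish() blocks
            // until the submitted job completes.
            p.run().waitUntilFinish();
        }
    }
    ```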

    We made sure to include beam-runners-spark as a Maven dependency.

    Any ideas?





--
Jean-Baptiste Onofré
[email protected]
http://blog.nanthrax.net
Talend - http://www.talend.com
