We are trying to use spark-jobserver for one of our requirements. We referred
to *https://github.com/fedragon/spark-jobserver-examples* and modified it
slightly to match our requirement, as below -

/****** ProductionRDDBuilder.scala *******/
package sparking
package jobserver

// Import required libraries.
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.rdd.RDD
import com.datastax.spark.connector._

// Builds the production RDD from the Cassandra table.
trait ProductionRDDBuilder {

  def build(sc: SparkContext): RDD[(AutoId, Production)] = {
    // The SparkContext is passed in by spark-jobserver; redeclaring
    // `val sc = new SparkContext(conf)` here would shadow the parameter
    // and fail to compile. Settings such as
    // "spark.cassandra.connection.host" belong in the context's
    // configuration rather than in a new SparkConf built here.
    val vProduction = sc.cassandraTable[Production]("java_api", "productionlog")
    vProduction.keyBy(f => f.autoid)
  }

}

Now, when we POST the built jar file to spark-jobserver using curl and run
*curl -X POST
'localhost:8090/jobs?appName=sparking&classPath=sparking.jobserver.GetOrCreateProduction&context=production-context'*,
the job fails with the error *"cause":
"com/datastax/spark/connector/package$", ..., "causingClass":
"java.lang.NoClassDefFoundError",
    "message": "Failed to create named RDD 'production'"*.

As far as we understand, the problem above must be related to the classpath
JARs at runtime. The *https://github.com/spark-jobserver/spark-jobserver*
page mentions an EXTRA_JAR environment variable.
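For what it's worth, an alternative to EXTRA_JAR is to bundle the Cassandra
connector into the application jar with the sbt-assembly plugin, so its
classes are always on the job's runtime classpath. A minimal build.sbt
sketch - the version numbers are assumptions and should be matched to the
Spark build actually in use:

```scala
// build.sbt -- sketch only; all version numbers below are assumptions
name := "sparking"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  // Provided by Spark / the job server at runtime, so kept out of the fat jar
  "org.apache.spark"   %% "spark-core"                % "1.1.0" % "provided",
  "spark.jobserver"    %  "job-server-api"            % "0.4.0" % "provided",
  // Bundled into the fat jar so it is present when the job runs
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0"
)

// With sbt-assembly enabled, `sbt assembly` produces a single jar
// that can then be POSTed to spark-jobserver.
```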

How do we set the EXTRA_JAR environment variable for the spark-jobserver SBT
build when running on Windows Server 2012?
Will setting the EXTRA_JAR environment variable resolve the
NoClassDefFoundError above?
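In case it helps frame the question: on Windows Server 2012 an environment
variable can be set for the current cmd session before launching SBT, so
the server process inherits it. A sketch, where the jar path is a
placeholder and the `re-start` command is the usual spark-jobserver dev
workflow:

```shell
REM Set EXTRA_JAR for the current cmd session (path is a placeholder)
set EXTRA_JAR=C:\jars\spark-cassandra-connector-assembly.jar

REM Start the job server from the same session so it sees the variable
sbt re-start

REM To persist the variable for future sessions, setx could be used instead:
REM setx EXTRA_JAR C:\jars\spark-cassandra-connector-assembly.jar
```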

Your suggestions would be appreciated.

Sasi 



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Set-EXTRA-JAR-environment-variable-for-spark-jobserver-tp20989.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
