Add all your jars like this and pass them to the context:

    List<String> jars = Lists.newArrayList(
        "/home/akhld/mobi/localcluster/x/spark-0.9.1-bin-hadoop2/assembly/target/scala-2.10/spark-assembly-0.9.1-hadoop2.2.0.jar",
        "/home/akhld/mobi/localcluster/codes/pig/build/ivy/lib/Pig/twitter4j-core-3.0.3.jar",
        "/home/akhld/mobi/localcluster/codes/pig/build/ivy/lib/Pig/twitter4j-stream-3.0.3.jar",
        "/home/akhld/mobi/workspace2014/TwitterStreamer/bin/tstream.jar");

    SparkConf spconf = new SparkConf();
    spconf.setMaster("local[2]");
    spconf.setAppName("TwitterStreamer");
    spconf.setSparkHome("/home/akhld/mobi/localcluster/x/spark-0.9.1-bin-hadoop2");
    spconf.setJars(jars.toArray(new String[jars.size()]));
    spconf.set("spark.executor.memory", "1g");

    // A Duration only applies to a streaming context, so this needs to be a
    // JavaStreamingContext rather than a JavaSparkContext:
    final JavaStreamingContext ssc = new JavaStreamingContext(spconf, new Duration(1000));
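The `jars.toArray(new String[jars.size()])` step above is plain Java collections, nothing Spark-specific. A minimal standalone sketch (using stand-in jar paths, not the real ones from the example) that shows the same idiom:

```java
import java.util.Arrays;
import java.util.List;

public class JarListDemo {
    // Same conversion idiom as spconf.setJars(jars.toArray(new String[jars.size()]))
    static String[] toJarArray(List<String> jars) {
        return jars.toArray(new String[jars.size()]);
    }

    public static void main(String[] args) {
        // Hypothetical paths for illustration only
        List<String> jars = Arrays.asList(
                "/path/to/twitter4j-core-3.0.3.jar",
                "/path/to/tstream.jar");
        String[] arr = toJarArray(jars);
        System.out.println(arr.length + " jars");
    }
}
```

`setJars` takes a `String[]`, which is why the `List<String>` is converted before it is handed to the `SparkConf`.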
Thanks
Best Regards

On Thu, Jul 3, 2014 at 3:01 PM, Kevin Jung <itsjb.j...@samsung.com> wrote:
> I found a web page for a hint:
> http://ardoris.wordpress.com/2014/03/30/how-spark-does-class-loading/
> I learned that SparkIMain has an internal HTTP server to publish class
> objects, but I can't figure out how to use it from Java.
> Any ideas?
>
> Thanks,
> Kevin
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Case-class-in-java-tp8720p8724.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.