Case class in java

2014-07-03 Thread Kevin Jung
Hi,
I'm trying to convert a Scala Spark job into Java.
In Scala I typically use a 'case class' to apply a schema to an RDD.
That can be translated into a POJO class in Java, but what I really want is to
create POJO classes dynamically, the way the Scala REPL does.
For this reason, I use Javassist to generate the POJO class at runtime, roughly
as in the sketch below.
The problem is that the worker nodes can't find the generated class.
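Roughly, the generation looks like this (a minimal sketch only; the class name
and the single field are just examples):

import javassist.ClassPool;
import javassist.CtClass;
import javassist.CtField;
import javassist.CtNewMethod;

public class PojoGenerator {
    public static Class<?> generate() throws Exception {
        ClassPool pool = ClassPool.getDefault();
        // build a Serializable POJO named like the one in the stack trace below
        CtClass ct = pool.makeClass("GeneratedClass_no1");
        ct.addInterface(pool.get("java.io.Serializable"));

        // one String field with getter/setter, standing in for a case class member
        CtField name = new CtField(pool.get("java.lang.String"), "name", ct);
        ct.addField(name);
        ct.addMethod(CtNewMethod.getter("getName", name));
        ct.addMethod(CtNewMethod.setter("setName", name));

        // toClass() defines the class in the current context class loader
        return ct.toClass();
    }
}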
The error message is:
 host workernode2.com: java.lang.ClassNotFoundException: GeneratedClass_no1
 java.net.URLClassLoader$1.run(URLClassLoader.java:366)
java.net.URLClassLoader$1.run(URLClassLoader.java:355)
java.security.AccessController.doPrivileged(Native Method)
java.net.URLClassLoader.findClass(URLClassLoader.java:354)
java.lang.ClassLoader.loadClass(ClassLoader.java:423)
java.lang.ClassLoader.loadClass(ClassLoader.java:356)
java.lang.Class.forName0(Native Method)
java.lang.Class.forName(Class.java:266)
The generated class's class loader is
'Thread.currentThread().getContextClassLoader()'.
I expected it to be visible everywhere, but only the driver node can see it;
the worker nodes' executors cannot.
Would changing the class loader used to load the generated class, or
broadcasting the generated class via the SparkContext, be effective?
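To make the second idea concrete, this is roughly what I have in mind (a sketch
only; it assumes I can pull the raw bytecode out of Javassist with
CtClass#toBytecode(), and whether the executors' deserializer would ever
consult such a loader is exactly what I don't know):

// Driver side (sketch):
//   byte[] bytecode = ctClass.toBytecode();
//   Broadcast<byte[]> bc = sc.broadcast(bytecode);
// Executor side, inside a task, define the class from the broadcast bytes:
//   Class<?> generated = new ClassBytesLoader("GeneratedClass_no1", bc.value(),
//           Thread.currentThread().getContextClassLoader())
//       .loadClass("GeneratedClass_no1");

public class ClassBytesLoader extends ClassLoader {
    private final String className;
    private final byte[] bytecode;

    public ClassBytesLoader(String className, byte[] bytecode, ClassLoader parent) {
        super(parent);
        this.className = className;
        this.bytecode = bytecode;
    }

    @Override
    protected Class<?> findClass(String name) throws ClassNotFoundException {
        if (name.equals(className)) {
            // turn the broadcast bytes back into a Class in this executor JVM
            return defineClass(name, bytecode, 0, bytecode.length);
        }
        return super.findClass(name);
    }
}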





Re: Case class in java

2014-07-03 Thread Kevin Jung
I found a web page with a hint:
http://ardoris.wordpress.com/2014/03/30/how-spark-does-class-loading/
I learned that SparkIMain has an internal HTTP server to publish class objects,
but I can't figure out how to use it from Java.
Any ideas?

Thanks,
Kevin





Re: Case class in java

2014-07-03 Thread Kevin Jung
That will load the listed jars when the SparkContext is created.
In the REPL case, though, we define and import classes after the SparkContext
has been created.
According to the site mentioned above, the executor installs a class loader in
the 'addReplClassLoaderIfNeeded' method, driven by the spark.repl.class.uri
configuration.
So I will try to make my driver application run a class server that distributes
the *dynamically created classes* to the executors, the same way the Spark REPL
does; see the sketch below.
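A rough sketch of the plan (the directory, the handler, and whether
spark.repl.class.uri is honored by executors outside the REPL are all my own
assumptions at this point): write the generated .class files into a local
directory with CtClass#writeFile(), serve that directory over HTTP from the
driver, and point spark.repl.class.uri at it before the SparkContext is created.

import com.sun.net.httpserver.HttpServer;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

import java.io.OutputStream;
import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class DriverClassServer {
    public static void main(String[] args) throws Exception {
        // directory that Javassist writes the generated .class files into
        Path classDir = Paths.get("/tmp/generated-classes");

        // serve "/GeneratedClass_no1.class" etc. straight out of that directory
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/", exchange -> {
            Path file = classDir.resolve(exchange.getRequestURI().getPath().substring(1));
            byte[] body = Files.readAllBytes(file);
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();

        String classUri = "http://" + InetAddress.getLocalHost().getHostAddress()
                + ":" + server.getAddress().getPort();

        // the property that addReplClassLoaderIfNeeded reads on the executor side
        SparkConf conf = new SparkConf()
                .setAppName("dynamic-classes")
                .set("spark.repl.class.uri", classUri);
        JavaSparkContext sc = new JavaSparkContext(conf);

        // ...generate classes with Javassist, writeFile() them into classDir,
        // then use them in jobs as usual.
    }
}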

Thanks,
Kevin



