Hi,

I am trying to run a custom Spark application on a Spark standalone cluster
on Amazon's EC2 infrastructure. So far I have successfully executed the
application on several m1.medium instances (each with one core). However,
when I run the very same application on some c1.medium instances
(each with two cores), the JVM crashes without leaving any further information.

Is there any known issue with combining JNI and multi-core instances?

Code snippet:

JavaRDD<String> result = file.mapPartitions(
    new FlatMapFunction<Iterator<String>, String>() {

        @Override
        public Iterable<String> call(Iterator<String> t) throws Exception {
            // Drain the partition into a local buffer first.
            ArrayList<String> tmp = new ArrayList<String>();
            while (t.hasNext()) {
                tmp.add(t.next());
            }
            // The JNI call; this is where the JVM crashes on c1.medium.
            return Arrays.asList(myNativeFunc(arg1, arg2));
        }

    }).cache();
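
In case it is relevant: with two cores per instance, Spark runs two tasks
concurrently inside the same executor JVM, so myNativeFunc ends up being
called from two threads at once. Below is a minimal sketch of the workaround
I am considering, assuming the native code is simply not thread-safe. The
NATIVE_LOCK object and the callNativeSafely helper are my own additions (not
anything from Spark), and I am only assuming String parameters here for
illustration:

    // Hypothetical guard: serialize all calls into the native library,
    // on the assumption that myNativeFunc cannot safely be called from
    // multiple threads at once. One lock per executor JVM.
    private static final Object NATIVE_LOCK = new Object();

    private static String callNativeSafely(String arg1, String arg2) {
        synchronized (NATIVE_LOCK) {
            return myNativeFunc(arg1, arg2);
        }
    }

This would of course cost the parallelism the second core provides, so I
would rather understand the actual cause of the crash.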


