aashudwivedi removed a comment on issue #13853: Error while running the mxnet 
spark examples and test cases
URL: 
https://github.com/apache/incubator-mxnet/issues/13853#issuecomment-457561018
 
 
   The complete stack trace of the error is:
   ```
   Testing started at 6:07 PM ...
   /Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/bin/java 
-Djava.library.path=/Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/native/osx-x86_64-cpu/target
 "-javaagent:/Applications/IntelliJ IDEA 
CE.app/Contents/lib/idea_rt.jar=55436:/Applications/IntelliJ IDEA 
CE.app/Contents/bin" -Dfile.encoding=UTF-8 -classpath 
"/Users/ashutdwi/Library/Application 
Support/IdeaIC2018.2/Scala/lib/runners.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/deploy.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/ext/cldrdata.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/ext/dnsns.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/ext/jaccess.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/ext/jfxrt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/ext/localedata.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/ext/nashorn.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/ext/sunec.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/ext/sunjce_provider.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/ext/sunpkcs11.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/ext/zipfs.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/javaws.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/jce.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/jfr.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/jfxswt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/jsse.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/management-agent.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/plugin.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/resources.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/rt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/lib/ant-javafx.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk
/Contents/Home/lib/dt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/lib/javafx-mx.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/lib/jconsole.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/lib/packager.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/lib/sa-jdi.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/lib/tools.jar:/Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/spark/target/test-classes:/Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/spark/target/classes:/Users/ashutdwi/.m2/repository/org/apache/mxnet/mxnet-core/INTERNAL/mxnet-core-INTERNAL.jar:/Users/ashutdwi/.m2/repository/commons-io/commons-io/2.1/commons-io-2.1.jar:/Users/ashutdwi/.m2/repository/org/apache/spark/spark-mllib_2.11/1.6.3/spark-mllib_2.11-1.6.3.jar:/Users/ashutdwi/.m2/repository/org/apache/spark/spark-core_2.11/1.6.3/spark-core_2.11-1.6.3.jar:/Users/ashutdwi/.m2/repository/org/apache/avro/avro-mapred/1.7.7/avro-mapred-1.7.7-hadoop2.jar:/Users/ashutdwi/.m2/repository/org/apache/avro/avro-ipc/1.7.7/avro-ipc-1.7.7.jar:/Users/ashutdwi/.m2/repository/org/apache/avro/avro/1.7.7/avro-1.7.7.jar:/Users/ashutdwi/.m2/repository/org/apache/avro/avro-ipc/1.7.7/avro-ipc-1.7.7-tests.jar:/Users/ashutdwi/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar:/Users/ashutdwi/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar:/Users/ashutdwi/.m2/repository/com/twitter/chill_2.11/0.5.0/chill_2.11-0.5.0.jar:/Users/ashutdwi/.m2/repository/com/esotericsoftware/kryo/kryo/2.21/kryo-2.21.jar:/Users/ashutdwi/.m2/repository/com/esotericsoftware/reflectasm/reflectasm/1.07/reflectasm-1.07-shaded.jar:/Users/ashutdwi/.m2/repository/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar:/Users/ashutdwi/.m2/repository/org/objenesis/objenesis/1.2/objenesis-1.2.jar:/Users/ashutdwi/.m2/repository/com/twitter/chill-java/0.5.0
/chill-java-0.5.0.jar:/Users/ashutdwi/.m2/repository/org/apache/xbean/xbean-asm5-shaded/4.4/xbean-asm5-shaded-4.4.jar:/Users/ashutdwi/.m2/repository/org/apache/hadoop/hadoop-client/2.2.0/hadoop-client-2.2.0.jar:/Users/ashutdwi/.m2/repository/org/apache/hadoop/hadoop-common/2.2.0/hadoop-common-2.2.0.jar:/Users/ashutdwi/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/Users/ashutdwi/.m2/repository/org/apache/commons/commons-math/2.1/commons-math-2.1.jar:/Users/ashutdwi/.m2/repository/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/Users/ashutdwi/.m2/repository/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar:/Users/ashutdwi/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar:/Users/ashutdwi/.m2/repository/commons-digester/commons-digester/1.8/commons-digester-1.8.jar:/Users/ashutdwi/.m2/repository/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar:/Users/ashutdwi/.m2/repository/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar:/Users/ashutdwi/.m2/repository/org/apache/hadoop/hadoop-auth/2.2.0/hadoop-auth-2.2.0.jar:/Users/ashutdwi/.m2/repository/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar:/Users/ashutdwi/.m2/repository/org/tukaani/xz/1.0/xz-1.0.jar:/Users/ashutdwi/.m2/repository/org/apache/hadoop/hadoop-hdfs/2.2.0/hadoop-hdfs-2.2.0.jar:/Users/ashutdwi/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar:/Users/ashutdwi/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-app/2.2.0/hadoop-mapreduce-client-app-2.2.0.jar:/Users/ashutdwi/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-common/2.2.0/hadoop-mapreduce-client-common-2.2.0.jar:/Users/ashutdwi/.m2/repository/org/apache/hadoop/hadoop-yarn-client/2.2.0/hadoop-yarn-client-2.2.0.jar:/Users/ashutdwi/.m2/repository/com/google/inject/guice/3.0/guice-3.0.jar:/Users/ashutdwi/.m2/repository/javax/inject/javax.inject/1/javax.inject-1.jar:/Users/ashu
tdwi/.m2/repository/aopalliance/aopalliance/1.0/aopalliance-1.0.jar:/Users/ashutdwi/.m2/repository/com/sun/jersey/jersey-test-framework/jersey-test-framework-grizzly2/1.9/jersey-test-framework-grizzly2-1.9.jar:/Users/ashutdwi/.m2/repository/com/sun/jersey/jersey-test-framework/jersey-test-framework-core/1.9/jersey-test-framework-core-1.9.jar:/Users/ashutdwi/.m2/repository/javax/servlet/javax.servlet-api/3.0.1/javax.servlet-api-3.0.1.jar:/Users/ashutdwi/.m2/repository/com/sun/jersey/jersey-client/1.9/jersey-client-1.9.jar:/Users/ashutdwi/.m2/repository/com/sun/jersey/jersey-grizzly2/1.9/jersey-grizzly2-1.9.jar:/Users/ashutdwi/.m2/repository/org/glassfish/grizzly/grizzly-http/2.1.2/grizzly-http-2.1.2.jar:/Users/ashutdwi/.m2/repository/org/glassfish/grizzly/grizzly-framework/2.1.2/grizzly-framework-2.1.2.jar:/Users/ashutdwi/.m2/repository/org/glassfish/gmbal/gmbal-api-only/3.0.0-b023/gmbal-api-only-3.0.0-b023.jar:/Users/ashutdwi/.m2/repository/org/glassfish/external/management-api/3.0.0-b012/management-api-3.0.0-b012.jar:/Users/ashutdwi/.m2/repository/org/glassfish/grizzly/grizzly-http-server/2.1.2/grizzly-http-server-2.1.2.jar:/Users/ashutdwi/.m2/repository/org/glassfish/grizzly/grizzly-rcm/2.1.2/grizzly-rcm-2.1.2.jar:/Users/ashutdwi/.m2/repository/org/glassfish/grizzly/grizzly-http-servlet/2.1.2/grizzly-http-servlet-2.1.2.jar:/Users/ashutdwi/.m2/repository/org/glassfish/javax.servlet/3.1/javax.servlet-3.1.jar:/Users/ashutdwi/.m2/repository/com/sun/jersey/jersey-json/1.9/jersey-json-1.9.jar:/Users/ashutdwi/.m2/repository/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar:/Users/ashutdwi/.m2/repository/stax/stax-api/1.0.1/stax-api-1.0.1.jar:/Users/ashutdwi/.m2/repository/org/codehaus/jackson/jackson-jaxrs/1.8.3/jackson-jaxrs-1.8.3.jar:/Users/ashutdwi/.m2/repository/org/codehaus/jackson/jackson-xc/1.8.3/jackson-xc-1.8.3.jar:/Users/ashutdwi/.m2/repository/com/sun/jersey/contribs/jersey-guice/1.9/jersey-guice-1.9.jar:/Users/ashutdwi/.m2/repository/org/apache/hadoop/hadoo
p-yarn-server-common/2.2.0/hadoop-yarn-server-common-2.2.0.jar:/Users/ashutdwi/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-shuffle/2.2.0/hadoop-mapreduce-client-shuffle-2.2.0.jar:/Users/ashutdwi/.m2/repository/org/apache/hadoop/hadoop-yarn-api/2.2.0/hadoop-yarn-api-2.2.0.jar:/Users/ashutdwi/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-core/2.2.0/hadoop-mapreduce-client-core-2.2.0.jar:/Users/ashutdwi/.m2/repository/org/apache/hadoop/hadoop-yarn-common/2.2.0/hadoop-yarn-common-2.2.0.jar:/Users/ashutdwi/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-jobclient/2.2.0/hadoop-mapreduce-client-jobclient-2.2.0.jar:/Users/ashutdwi/.m2/repository/org/apache/hadoop/hadoop-annotations/2.2.0/hadoop-annotations-2.2.0.jar:/Users/ashutdwi/.m2/repository/org/apache/spark/spark-launcher_2.11/1.6.3/spark-launcher_2.11-1.6.3.jar:/Users/ashutdwi/.m2/repository/org/apache/spark/spark-network-common_2.11/1.6.3/spark-network-common_2.11-1.6.3.jar:/Users/ashutdwi/.m2/repository/org/apache/spark/spark-network-shuffle_2.11/1.6.3/spark-network-shuffle_2.11-1.6.3.jar:/Users/ashutdwi/.m2/repository/org/fusesource/leveldbjni/leveldbjni-all/1.8/leveldbjni-all-1.8.jar:/Users/ashutdwi/.m2/repository/com/fasterxml/jackson/core/jackson-annotations/2.4.4/jackson-annotations-2.4.4.jar:/Users/ashutdwi/.m2/repository/org/apache/spark/spark-unsafe_2.11/1.6.3/spark-unsafe_2.11-1.6.3.jar:/Users/ashutdwi/.m2/repository/net/java/dev/jets3t/jets3t/0.7.1/jets3t-0.7.1.jar:/Users/ashutdwi/.m2/repository/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar:/Users/ashutdwi/.m2/repository/org/apache/curator/curator-recipes/2.4.0/curator-recipes-2.4.0.jar:/Users/ashutdwi/.m2/repository/org/apache/curator/curator-framework/2.4.0/curator-framework-2.4.0.jar:/Users/ashutdwi/.m2/repository/org/apache/curator/curator-client/2.4.0/curator-client-2.4.0.jar:/Users/ashutdwi/.m2/repository/org/apache/zookeeper/zookeeper/3.4.5/zookeeper-3.4.5.jar:/Users/ashutdwi/.m2/repository/jl
ine/jline/0.9.94/jline-0.9.94.jar:/Users/ashutdwi/.m2/repository/com/google/guava/guava/14.0.1/guava-14.0.1.jar:/Users/ashutdwi/.m2/repository/org/eclipse/jetty/orbit/javax.servlet/3.0.0.v201112011016/javax.servlet-3.0.0.v201112011016.jar:/Users/ashutdwi/.m2/repository/org/apache/commons/commons-lang3/3.3.2/commons-lang3-3.3.2.jar:/Users/ashutdwi/.m2/repository/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar:/Users/ashutdwi/.m2/repository/org/slf4j/jul-to-slf4j/1.7.10/jul-to-slf4j-1.7.10.jar:/Users/ashutdwi/.m2/repository/org/slf4j/jcl-over-slf4j/1.7.10/jcl-over-slf4j-1.7.10.jar:/Users/ashutdwi/.m2/repository/com/ning/compress-lzf/1.0.3/compress-lzf-1.0.3.jar:/Users/ashutdwi/.m2/repository/org/xerial/snappy/snappy-java/1.1.2.6/snappy-java-1.1.2.6.jar:/Users/ashutdwi/.m2/repository/net/jpountz/lz4/lz4/1.3.0/lz4-1.3.0.jar:/Users/ashutdwi/.m2/repository/org/roaringbitmap/RoaringBitmap/0.5.11/RoaringBitmap-0.5.11.jar:/Users/ashutdwi/.m2/repository/commons-net/commons-net/2.2/commons-net-2.2.jar:/Users/ashutdwi/.m2/repository/com/typesafe/akka/akka-remote_2.11/2.3.11/akka-remote_2.11-2.3.11.jar:/Users/ashutdwi/.m2/repository/com/typesafe/akka/akka-actor_2.11/2.3.11/akka-actor_2.11-2.3.11.jar:/Users/ashutdwi/.m2/repository/com/typesafe/config/1.2.1/config-1.2.1.jar:/Users/ashutdwi/.m2/repository/io/netty/netty/3.8.0.Final/netty-3.8.0.Final.jar:/Users/ashutdwi/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/Users/ashutdwi/.m2/repository/org/uncommons/maths/uncommons-maths/1.2.2a/uncommons-maths-1.2.2a.jar:/Users/ashutdwi/.m2/repository/com/typesafe/akka/akka-slf4j_2.11/2.3.11/akka-slf4j_2.11-2.3.11.jar:/Users/ashutdwi/.m2/repository/org/json4s/json4s-jackson_2.11/3.2.10/json4s-jackson_2.11-3.2.10.jar:/Users/ashutdwi/.m2/repository/org/json4s/json4s-core_2.11/3.2.10/json4s-core_2.11-3.2.10.jar:/Users/ashutdwi/.m2/repository/org/json4s/json4s-ast_2.11/3.2.10/json4s-ast_2.11-3.2.10.jar:/Users/ashutdwi/.m2/repository/org/scala-lang/scalap
/2.11.0/scalap-2.11.0.jar:/Users/ashutdwi/.m2/repository/com/sun/jersey/jersey-server/1.9/jersey-server-1.9.jar:/Users/ashutdwi/.m2/repository/asm/asm/3.1/asm-3.1.jar:/Users/ashutdwi/.m2/repository/com/sun/jersey/jersey-core/1.9/jersey-core-1.9.jar:/Users/ashutdwi/.m2/repository/org/apache/mesos/mesos/0.21.1/mesos-0.21.1-shaded-protobuf.jar:/Users/ashutdwi/.m2/repository/io/netty/netty-all/4.0.29.Final/netty-all-4.0.29.Final.jar:/Users/ashutdwi/.m2/repository/com/clearspring/analytics/stream/2.7.0/stream-2.7.0.jar:/Users/ashutdwi/.m2/repository/io/dropwizard/metrics/metrics-core/3.1.2/metrics-core-3.1.2.jar:/Users/ashutdwi/.m2/repository/io/dropwizard/metrics/metrics-jvm/3.1.2/metrics-jvm-3.1.2.jar:/Users/ashutdwi/.m2/repository/io/dropwizard/metrics/metrics-json/3.1.2/metrics-json-3.1.2.jar:/Users/ashutdwi/.m2/repository/io/dropwizard/metrics/metrics-graphite/3.1.2/metrics-graphite-3.1.2.jar:/Users/ashutdwi/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.4.4/jackson-databind-2.4.4.jar:/Users/ashutdwi/.m2/repository/com/fasterxml/jackson/core/jackson-core/2.4.4/jackson-core-2.4.4.jar:/Users/ashutdwi/.m2/repository/com/fasterxml/jackson/module/jackson-module-scala_2.11/2.4.4/jackson-module-scala_2.11-2.4.4.jar:/Users/ashutdwi/.m2/repository/com/thoughtworks/paranamer/paranamer/2.6/paranamer-2.6.jar:/Users/ashutdwi/.m2/repository/org/apache/ivy/ivy/2.4.0/ivy-2.4.0.jar:/Users/ashutdwi/.m2/repository/oro/oro/2.0.8/oro-2.0.8.jar:/Users/ashutdwi/.m2/repository/org/tachyonproject/tachyon-client/0.8.2/tachyon-client-0.8.2.jar:/Users/ashutdwi/.m2/repository/commons-lang/commons-lang/2.4/commons-lang-2.4.jar:/Users/ashutdwi/.m2/repository/org/tachyonproject/tachyon-underfs-hdfs/0.8.2/tachyon-underfs-hdfs-0.8.2.jar:/Users/ashutdwi/.m2/repository/org/tachyonproject/tachyon-underfs-s3/0.8.2/tachyon-underfs-s3-0.8.2.jar:/Users/ashutdwi/.m2/repository/org/tachyonproject/tachyon-underfs-local/0.8.2/tachyon-underfs-local-0.8.2.jar:/Users/ashutdwi/.m2/repository/net/razo
rvine/pyrolite/4.9/pyrolite-4.9.jar:/Users/ashutdwi/.m2/repository/net/sf/py4j/py4j/0.9/py4j-0.9.jar:/Users/ashutdwi/.m2/repository/org/apache/spark/spark-streaming_2.11/1.6.3/spark-streaming_2.11-1.6.3.jar:/Users/ashutdwi/.m2/repository/org/apache/spark/spark-sql_2.11/1.6.3/spark-sql_2.11-1.6.3.jar:/Users/ashutdwi/.m2/repository/org/apache/spark/spark-catalyst_2.11/1.6.3/spark-catalyst_2.11-1.6.3.jar:/Users/ashutdwi/.m2/repository/org/codehaus/janino/janino/2.7.8/janino-2.7.8.jar:/Users/ashutdwi/.m2/repository/org/codehaus/janino/commons-compiler/2.7.8/commons-compiler-2.7.8.jar:/Users/ashutdwi/.m2/repository/org/apache/parquet/parquet-column/1.7.0/parquet-column-1.7.0.jar:/Users/ashutdwi/.m2/repository/org/apache/parquet/parquet-common/1.7.0/parquet-common-1.7.0.jar:/Users/ashutdwi/.m2/repository/org/apache/parquet/parquet-encoding/1.7.0/parquet-encoding-1.7.0.jar:/Users/ashutdwi/.m2/repository/org/apache/parquet/parquet-generator/1.7.0/parquet-generator-1.7.0.jar:/Users/ashutdwi/.m2/repository/org/apache/parquet/parquet-hadoop/1.7.0/parquet-hadoop-1.7.0.jar:/Users/ashutdwi/.m2/repository/org/apache/parquet/parquet-format/2.3.0-incubating/parquet-format-2.3.0-incubating.jar:/Users/ashutdwi/.m2/repository/org/apache/parquet/parquet-jackson/1.7.0/parquet-jackson-1.7.0.jar:/Users/ashutdwi/.m2/repository/org/apache/spark/spark-graphx_2.11/1.6.3/spark-graphx_2.11-1.6.3.jar:/Users/ashutdwi/.m2/repository/com/github/fommil/netlib/core/1.1.2/core-1.1.2.jar:/Users/ashutdwi/.m2/repository/net/sourceforge/f2j/arpack_combined_all/0.1/arpack_combined_all-0.1.jar:/Users/ashutdwi/.m2/repository/org/scalanlp/breeze_2.11/0.11.2/breeze_2.11-0.11.2.jar:/Users/ashutdwi/.m2/repository/org/scalanlp/breeze-macros_2.11/0.11.2/breeze-macros_2.11-0.11.2.jar:/Users/ashutdwi/.m2/repository/net/sf/opencsv/opencsv/2.3/opencsv-2.3.jar:/Users/ashutdwi/.m2/repository/com/github/rwl/jtransforms/2.4.0/jtransforms-2.4.0.jar:/Users/ashutdwi/.m2/repository/org/spire-math/spire_2.11/0.7.4/spire_2.11-0.
7.4.jar:/Users/ashutdwi/.m2/repository/org/spire-math/spire-macros_2.11/0.7.4/spire-macros_2.11-0.7.4.jar:/Users/ashutdwi/.m2/repository/org/apache/commons/commons-math3/3.4.1/commons-math3-3.4.1.jar:/Users/ashutdwi/.m2/repository/org/jpmml/pmml-model/1.1.15/pmml-model-1.1.15.jar:/Users/ashutdwi/.m2/repository/org/jpmml/pmml-agent/1.1.15/pmml-agent-1.1.15.jar:/Users/ashutdwi/.m2/repository/org/jpmml/pmml-schema/1.1.15/pmml-schema-1.1.15.jar:/Users/ashutdwi/.m2/repository/com/sun/xml/bind/jaxb-impl/2.2.7/jaxb-impl-2.2.7.jar:/Users/ashutdwi/.m2/repository/com/sun/xml/bind/jaxb-core/2.2.7/jaxb-core-2.2.7.jar:/Users/ashutdwi/.m2/repository/javax/xml/bind/jaxb-api/2.2.7/jaxb-api-2.2.7.jar:/Users/ashutdwi/.m2/repository/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar:/Users/ashutdwi/.m2/repository/args4j/args4j/2.33/args4j-2.33.jar:/Users/ashutdwi/.m2/repository/commons-codec/commons-codec/1.10/commons-codec-1.10.jar:/Users/ashutdwi/.m2/repository/org/apache/logging/log4j/log4j-core/2.11.1/log4j-core-2.11.1.jar:/Users/ashutdwi/.m2/repository/org/apache/logging/log4j/log4j-api/2.11.1/log4j-api-2.11.1.jar:/Users/ashutdwi/.m2/repository/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar:/Users/ashutdwi/.m2/repository/org/slf4j/slf4j-log4j12/1.7.7/slf4j-log4j12-1.7.7.jar:/Users/ashutdwi/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/Users/ashutdwi/.m2/repository/org/scalatest/scalatest_2.11/3.0.4/scalatest_2.11-3.0.4.jar:/Users/ashutdwi/.m2/repository/org/scalactic/scalactic_2.11/3.0.4/scalactic_2.11-3.0.4.jar:/Users/ashutdwi/.m2/repository/org/scala-lang/modules/scala-xml_2.11/1.0.5/scala-xml_2.11-1.0.5.jar:/Users/ashutdwi/.m2/repository/org/scalacheck/scalacheck_2.11/1.13.5/scalacheck_2.11-1.13.5.jar:/Users/ashutdwi/.m2/repository/org/scala-sbt/test-interface/1.0/test-interface-1.0.jar:/Users/ashutdwi/.m2/repository/org/scala-lang/scala-library/2.11.8/scala-library-2.11.8.jar:/Users/ashutdwi/.m2/repository/org/scala-lang/scala-reflect/2.11.8/scala-reflect-2.11.8.jar:/
Users/ashutdwi/.m2/repository/org/scala-lang/modules/scala-parser-combinators_2.11/1.0.4/scala-parser-combinators_2.11-1.0.4.jar:/Users/ashutdwi/.m2/repository/org/scala-lang/scala-compiler/2.11.8/scala-compiler-2.11.8.jar"
 org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner -s 
org.apache.mxnet.spark.MXNetGeneralSuite -testName "run spark with MLP" -C 
org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestReporter 
-showProgressMessages true
   Using Spark's default log4j profile: 
org/apache/spark/log4j-defaults.properties
   19/01/25 18:07:52 INFO SparkContext: Running Spark version 1.6.3
   19/01/25 18:07:52 WARN NativeCodeLoader: Unable to load native-hadoop 
library for your platform... using builtin-java classes where applicable
   19/01/25 18:07:52 INFO SecurityManager: Changing view acls to: ashutdwi
   19/01/25 18:07:52 INFO SecurityManager: Changing modify acls to: ashutdwi
   19/01/25 18:07:52 INFO SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(ashutdwi); users 
with modify permissions: Set(ashutdwi)
   19/01/25 18:07:53 INFO Utils: Successfully started service 'sparkDriver' on 
port 55439.
   19/01/25 18:07:53 INFO Slf4jLogger: Slf4jLogger started
   19/01/25 18:07:53 INFO Remoting: Starting remoting
   19/01/25 18:07:53 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.0.20:55440]
   19/01/25 18:07:53 INFO Utils: Successfully started service 
'sparkDriverActorSystem' on port 55440.
   19/01/25 18:07:53 INFO SparkEnv: Registering MapOutputTracker
   19/01/25 18:07:53 INFO SparkEnv: Registering BlockManagerMaster
   19/01/25 18:07:53 INFO DiskBlockManager: Created local directory at 
/private/var/folders/q5/fkq1x5gj5l73cznw53c263wr0000gn/T/blockmgr-5fb86e1b-8627-4a91-b3ac-a3c449e6a68b
   19/01/25 18:07:53 INFO MemoryStore: MemoryStore started with capacity 2.4 GB
   19/01/25 18:07:53 INFO SparkEnv: Registering OutputCommitCoordinator
   19/01/25 18:07:53 WARN Utils: Service 'SparkUI' could not bind on port 4040. 
Attempting port 4041.
   19/01/25 18:07:53 INFO Utils: Successfully started service 'SparkUI' on port 
4041.
   19/01/25 18:07:53 INFO SparkUI: Started SparkUI at http://192.168.0.20:4041
   19/01/25 18:07:53 INFO Executor: Starting executor ID driver on host 
localhost
   19/01/25 18:07:53 INFO Utils: Successfully started service 
'org.apache.spark.network.netty.NettyBlockTransferService' on port 55441.
   19/01/25 18:07:53 INFO NettyBlockTransferService: Server created on 55441
   19/01/25 18:07:54 INFO BlockManagerMaster: Trying to register BlockManager
   19/01/25 18:07:54 INFO BlockManagerMasterEndpoint: Registering block manager 
localhost:55441 with 2.4 GB RAM, BlockManagerId(driver, localhost, 55441)
   19/01/25 18:07:54 INFO BlockManagerMaster: Registered BlockManager
   19/01/25 18:07:54 INFO MemoryStore: Block broadcast_0 stored as values in 
memory (estimated size 107.7 KB, free 2.4 GB)
   19/01/25 18:07:54 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes 
in memory (estimated size 9.8 KB, free 2.4 GB)
   19/01/25 18:07:54 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory 
on localhost:55441 (size: 9.8 KB, free: 2.4 GB)
   19/01/25 18:07:54 INFO SparkContext: Created broadcast 0 from textFile at 
MXNetGeneralSuite.scala:35
   19/01/25 18:07:54 INFO MXNetJVM: Try loading mxnet-scala from native path.
   19/01/25 18:07:54 WARN MXNetJVM: MXNet Scala native library not found in 
path. Copying native library from the archive. Consider installing the library 
somewhere in the path (for Windows: PATH, for Linux: LD_LIBRARY_PATH), or 
specifying by Java cmd option -Djava.library.path=[lib path].
   19/01/25 18:07:54 WARN MXNetJVM: 
LD_LIBRARY_PATH=/Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/native/osx-x86_64-cpu/target
   19/01/25 18:07:54 WARN MXNetJVM: 
java.library.path=/Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/native/osx-x86_64-cpu/target
   19/01/25 18:07:55 INFO Utils: Copying 
/Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/spark/../assembly/osx-x86_64-cpu/target/mxnet-full_2.11-osx-x86_64-cpu-1.5.0-SNAPSHOT-src.jar
 to 
/private/var/folders/q5/fkq1x5gj5l73cznw53c263wr0000gn/T/spark-b3cbf154-c098-49dc-8232-3a97b7e43bc3/userFiles-908ec2c6-1cd7-4248-a8d6-a29d63bfc3d1/mxnet-full_2.11-osx-x86_64-cpu-1.5.0-SNAPSHOT-src.jar
   19/01/25 18:07:55 INFO SparkContext: Added file 
/Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/spark/../assembly/osx-x86_64-cpu/target/mxnet-full_2.11-osx-x86_64-cpu-1.5.0-SNAPSHOT-src.jar
 at 
file:/Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/assembly/osx-x86_64-cpu/target/mxnet-full_2.11-osx-x86_64-cpu-1.5.0-SNAPSHOT-src.jar
 with timestamp 1548419875041
   19/01/25 18:07:55 INFO Utils: Copying 
/Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/spark/target/mxnet-spark-INTERNAL.jar
 to 
/private/var/folders/q5/fkq1x5gj5l73cznw53c263wr0000gn/T/spark-b3cbf154-c098-49dc-8232-3a97b7e43bc3/userFiles-908ec2c6-1cd7-4248-a8d6-a29d63bfc3d1/mxnet-spark-INTERNAL.jar
   19/01/25 18:07:55 INFO SparkContext: Added file 
/Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/spark/target/mxnet-spark-INTERNAL.jar
 at 
file:/Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/spark/target/mxnet-spark-INTERNAL.jar
 with timestamp 1548419875078
   19/01/25 18:07:55 INFO FileInputFormat: Total input paths to process : 1
   19/01/25 18:07:55 INFO MXNet: repartitioning training set to 2 partitions
   19/01/25 18:07:55 INFO MXNet: Starting scheduler on 192.168.0.20:55443
   19/01/25 18:07:55 INFO ParameterServer: Started process: 
/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/bin/java  -cp 
/private/var/folders/q5/fkq1x5gj5l73cznw53c263wr0000gn/T/spark-b3cbf154-c098-49dc-8232-3a97b7e43bc3/userFiles-908ec2c6-1cd7-4248-a8d6-a29d63bfc3d1/mxnet-full_2.11-osx-x86_64-cpu-1.5.0-SNAPSHOT-src.jar:/private/var/folders/q5/fkq1x5gj5l73cznw53c263wr0000gn/T/spark-b3cbf154-c098-49dc-8232-3a97b7e43bc3/userFiles-908ec2c6-1cd7-4248-a8d6-a29d63bfc3d1/mxnet-spark-INTERNAL.jar
 org.apache.mxnet.spark.ParameterServer --role=scheduler 
--root-uri=192.168.0.20 --root-port=55443 --num-server=1 --num-worker=2 
--timeout=0 at 192.168.0.20:55443
   19/01/25 18:07:55 INFO ParameterServer: Starting InputStream-Redirecter 
Thread for 192.168.0.20:55443
   19/01/25 18:07:55 INFO ParameterServer: Starting ErrorStream-Redirecter 
Thread for 192.168.0.20:55443
   19/01/25 18:07:55 WARN WarnIfNotDisposed: LEAK: [one-time warning] An 
instance of org.apache.mxnet.Symbol was not disposed. Set property 
mxnet.traceLeakedObjects to true to enable tracing
   19/01/25 18:07:55 INFO SparkContext: Starting job: foreachPartition at 
MXNet.scala:126
   19/01/25 18:07:55 INFO DAGScheduler: Got job 0 (foreachPartition at 
MXNet.scala:126) with 1 output partitions
   19/01/25 18:07:55 INFO DAGScheduler: Final stage: ResultStage 0 
(foreachPartition at MXNet.scala:126)
   19/01/25 18:07:55 INFO DAGScheduler: Parents of final stage: List()
   19/01/25 18:07:55 INFO DAGScheduler: Missing parents: List()
   19/01/25 18:07:55 INFO SparkContext: Starting job: foreachPartition at 
MXNet.scala:238
   19/01/25 18:07:55 INFO DAGScheduler: Submitting ResultStage 0 
(ParallelCollectionRDD[7] at parallelize at MXNet.scala:126), which has no 
missing parents
   19/01/25 18:07:55 INFO MemoryStore: Block broadcast_1 stored as values in 
memory (estimated size 5.0 KB, free 2.4 GB)
   19/01/25 18:07:55 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes 
in memory (estimated size 2.5 KB, free 2.4 GB)
   19/01/25 18:07:55 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory 
on localhost:55441 (size: 2.5 KB, free: 2.4 GB)
   19/01/25 18:07:55 INFO SparkContext: Created broadcast 1 from broadcast at 
DAGScheduler.scala:1006
   19/01/25 18:07:55 INFO DAGScheduler: Submitting 1 missing tasks from 
ResultStage 0 (ParallelCollectionRDD[7] at parallelize at MXNet.scala:126)
   19/01/25 18:07:55 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
   19/01/25 18:07:55 INFO DAGScheduler: Registering RDD 3 (repartition at 
MXNet.scala:251)
   19/01/25 18:07:55 INFO DAGScheduler: Got job 1 (foreachPartition at 
MXNet.scala:238) with 2 output partitions
   19/01/25 18:07:55 INFO DAGScheduler: Final stage: ResultStage 2 
(foreachPartition at MXNet.scala:238)
   19/01/25 18:07:55 INFO DAGScheduler: Parents of final stage: 
List(ShuffleMapStage 1)
   19/01/25 18:07:55 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 1)
   19/01/25 18:07:55 INFO DAGScheduler: Submitting ShuffleMapStage 1 
(MapPartitionsRDD[3] at repartition at MXNet.scala:251), which has no missing 
parents
   19/01/25 18:07:55 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 
0, localhost, partition 0,PROCESS_LOCAL, 2327 bytes)
   19/01/25 18:07:55 INFO MemoryStore: Block broadcast_2 stored as values in 
memory (estimated size 4.3 KB, free 2.4 GB)
   Error: A JNI error has occurred, please check your installation and try again
   Exception in thread "main" 19/01/25 18:07:55 INFO MemoryStore: Block 
broadcast_2_piece0 stored as bytes in memory (estimated size 2.4 KB, free 2.4 
GB)
   java.lang.NoClassDefFoundError: scala/collection/Seq
        at java.lang.Class.getDeclaredMethods0(Native Method)
        at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
        at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
        at java.lang.Class.getMethod0(Class.java:3018)
        at java.lang.Class.getMethod(Class.java:1784)
        at 
sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
   Caused by: java.lang.ClassNotFoundException: scala.collection.Seq
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at 
sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)19/01/25 18:07:55 
INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on localhost:55441 
(size: 2.4 KB, free: 2.4 GB)
   
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 7 more
   19/01/25 18:07:55 INFO SparkContext: Created broadcast 2 from broadcast at 
DAGScheduler.scala:1006
   19/01/25 18:07:55 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
   19/01/25 18:07:55 INFO DAGScheduler: Submitting 8 missing tasks from 
ShuffleMapStage 1 (MapPartitionsRDD[3] at repartition at MXNet.scala:251)
   19/01/25 18:07:55 INFO TaskSchedulerImpl: Adding task set 1.0 with 8 tasks
   Exception in thread "Thread-34" java.lang.IllegalArgumentException: 
requirement failed: Failed to start ps scheduler process with exit code 1
        at scala.Predef$.require(Predef.scala:224)
        at 
org.apache.mxnet.spark.MXNet.org$apache$mxnet$spark$MXNet$$startPSSchedulerInner$1(MXNet.scala:159)
        at 
org.apache.mxnet.spark.MXNet$$anonfun$startPSScheduler$1.apply(MXNet.scala:162)
        at 
org.apache.mxnet.spark.MXNet$$anonfun$startPSScheduler$1.apply(MXNet.scala:162)
        at 
org.apache.mxnet.spark.MXNet$MXNetControllingThread.run(MXNet.scala:38)
   19/01/25 18:07:55 INFO Executor: Fetching 
file:/Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/assembly/osx-x86_64-cpu/target/mxnet-full_2.11-osx-x86_64-cpu-1.5.0-SNAPSHOT-src.jar
 with timestamp 1548419875041
   19/01/25 18:07:55 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 
1, localhost, partition 0,PROCESS_LOCAL, 2377 bytes)
   19/01/25 18:07:55 INFO TaskSetManager: Starting task 1.0 in stage 1.0 (TID 
2, localhost, partition 1,PROCESS_LOCAL, 2377 bytes)
   19/01/25 18:07:55 INFO TaskSetManager: Starting task 2.0 in stage 1.0 (TID 
3, localhost, partition 2,PROCESS_LOCAL, 2377 bytes)
   19/01/25 18:07:55 INFO TaskSetManager: Starting task 3.0 in stage 1.0 (TID 
4, localhost, partition 3,PROCESS_LOCAL, 2377 bytes)
   19/01/25 18:07:55 INFO TaskSetManager: Starting task 4.0 in stage 1.0 (TID 
5, localhost, partition 4,PROCESS_LOCAL, 2377 bytes)
   19/01/25 18:07:55 INFO TaskSetManager: Starting task 5.0 in stage 1.0 (TID 
6, localhost, partition 5,PROCESS_LOCAL, 2377 bytes)
   19/01/25 18:07:55 INFO TaskSetManager: Starting task 6.0 in stage 1.0 (TID 
7, localhost, partition 6,PROCESS_LOCAL, 2377 bytes)
   19/01/25 18:07:55 INFO Executor: Running task 0.0 in stage 1.0 (TID 1)
   19/01/25 18:07:55 INFO Executor: Running task 1.0 in stage 1.0 (TID 2)
   19/01/25 18:07:55 INFO Executor: Running task 2.0 in stage 1.0 (TID 3)
   19/01/25 18:07:55 INFO Executor: Running task 3.0 in stage 1.0 (TID 4)
   19/01/25 18:07:55 INFO Executor: Running task 4.0 in stage 1.0 (TID 5)
   19/01/25 18:07:55 INFO Executor: Running task 5.0 in stage 1.0 (TID 6)
   19/01/25 18:07:55 INFO Executor: Running task 6.0 in stage 1.0 (TID 7)
   19/01/25 18:07:55 INFO Utils: /Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/assembly/osx-x86_64-cpu/target/mxnet-full_2.11-osx-x86_64-cpu-1.5.0-SNAPSHOT-src.jar has been previously copied to /private/var/folders/q5/fkq1x5gj5l73cznw53c263wr0000gn/T/spark-b3cbf154-c098-49dc-8232-3a97b7e43bc3/userFiles-908ec2c6-1cd7-4248-a8d6-a29d63bfc3d1/mxnet-full_2.11-osx-x86_64-cpu-1.5.0-SNAPSHOT-src.jar
   19/01/25 18:07:55 INFO Executor: Fetching file:/Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/spark/target/mxnet-spark-INTERNAL.jar with timestamp 1548419875078
   19/01/25 18:07:55 INFO Utils: /Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/spark/target/mxnet-spark-INTERNAL.jar has been previously copied to /private/var/folders/q5/fkq1x5gj5l73cznw53c263wr0000gn/T/spark-b3cbf154-c098-49dc-8232-3a97b7e43bc3/userFiles-908ec2c6-1cd7-4248-a8d6-a29d63bfc3d1/mxnet-spark-INTERNAL.jar
   19/01/25 18:07:55 INFO MXNet: Starting server ...
   19/01/25 18:07:55 INFO HadoopRDD: Input split: file:/Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/spark/bin/train_full.txt.txt:33554432+33554432
   19/01/25 18:07:55 INFO HadoopRDD: Input split: file:/Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/spark/bin/train_full.txt.txt:167772160+33554432
   19/01/25 18:07:55 INFO HadoopRDD: Input split: file:/Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/spark/bin/train_full.txt.txt:134217728+33554432
   19/01/25 18:07:55 INFO HadoopRDD: Input split: file:/Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/spark/bin/train_full.txt.txt:67108864+33554432
   19/01/25 18:07:55 INFO HadoopRDD: Input split: file:/Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/spark/bin/train_full.txt.txt:201326592+33554432
   19/01/25 18:07:55 INFO HadoopRDD: Input split: file:/Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/spark/bin/train_full.txt.txt:100663296+33554432
   19/01/25 18:07:55 INFO HadoopRDD: Input split: file:/Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/spark/bin/train_full.txt.txt:0+33554432
   19/01/25 18:07:55 INFO ParameterServer: Started process: /Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/bin/java  -cp /private/var/folders/q5/fkq1x5gj5l73cznw53c263wr0000gn/T/spark-b3cbf154-c098-49dc-8232-3a97b7e43bc3/userFiles-908ec2c6-1cd7-4248-a8d6-a29d63bfc3d1/mxnet-full_2.11-osx-x86_64-cpu-1.5.0-SNAPSHOT-src.jar:/private/var/folders/q5/fkq1x5gj5l73cznw53c263wr0000gn/T/spark-b3cbf154-c098-49dc-8232-3a97b7e43bc3/userFiles-908ec2c6-1cd7-4248-a8d6-a29d63bfc3d1/mxnet-spark-INTERNAL.jar org.apache.mxnet.spark.ParameterServer --role=server --root-uri=192.168.0.20 --root-port=55443 --num-server=1 --num-worker=2 --timeout=0 at 192.168.0.20:55443
   19/01/25 18:07:55 INFO ParameterServer: Starting InputStream-Redirecter Thread for 192.168.0.20:55443
   19/01/25 18:07:55 INFO ParameterServer: Starting ErrorStream-Redirecter Thread for 192.168.0.20:55443
   19/01/25 18:07:55 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
   19/01/25 18:07:55 INFO deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
   19/01/25 18:07:55 INFO deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
   19/01/25 18:07:55 INFO deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition
   19/01/25 18:07:55 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
   19/01/25 18:07:55 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
   19/01/25 18:07:55 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
   19/01/25 18:07:55 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
   19/01/25 18:07:55 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
   19/01/25 18:07:55 INFO deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
   Error: A JNI error has occurred, please check your installation and try again
   Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/Seq
        at java.lang.Class.getDeclaredMethods0(Native Method)
        at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
        at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
        at java.lang.Class.getMethod0(Class.java:3018)
        at java.lang.Class.getMethod(Class.java:1784)
        at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
   Caused by: java.lang.ClassNotFoundException: scala.collection.Seq
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 7 more
   19/01/25 18:07:55 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
   java.lang.IllegalArgumentException: requirement failed: ps server process quit with exit code 1
        at scala.Predef$.require(Predef.scala:224)
        at org.apache.mxnet.spark.MXNet$$anonfun$org$apache$mxnet$spark$MXNet$$startPSServersInner$1$1.apply(MXNet.scala:137)
        at org.apache.mxnet.spark.MXNet$$anonfun$org$apache$mxnet$spark$MXNet$$startPSServersInner$1$1.apply(MXNet.scala:126)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:89)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
   19/01/25 18:07:55 INFO TaskSetManager: Starting task 7.0 in stage 1.0 (TID 8, localhost, partition 7,PROCESS_LOCAL, 2377 bytes)
   19/01/25 18:07:55 INFO Executor: Running task 7.0 in stage 1.0 (TID 8)
   19/01/25 18:07:55 INFO HadoopRDD: Input split: file:/Users/ashutdwi/ora/opensource/incubator-mxnet/scala-package/spark/bin/train_full.txt.txt:234881024+8528911
   19/01/25 18:07:55 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.lang.IllegalArgumentException: requirement failed: ps server process quit with exit code 1
        at scala.Predef$.require(Predef.scala:224)
        at org.apache.mxnet.spark.MXNet$$anonfun$org$apache$mxnet$spark$MXNet$$startPSServersInner$1$1.apply(MXNet.scala:137)
        at org.apache.mxnet.spark.MXNet$$anonfun$org$apache$mxnet$spark$MXNet$$startPSServersInner$1$1.apply(MXNet.scala:126)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:89)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
   
   19/01/25 18:07:55 ERROR TaskSetManager: Task 0 in stage 0.0 failed 1 times; aborting job
   19/01/25 18:07:55 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
   19/01/25 18:07:55 INFO TaskSchedulerImpl: Cancelling stage 0
   19/01/25 18:07:55 INFO DAGScheduler: ResultStage 0 (foreachPartition at MXNet.scala:126) failed in 0.428 s
   19/01/25 18:07:55 INFO DAGScheduler: Job 0 failed: foreachPartition at MXNet.scala:126, took 0.483542 s
   Exception in thread "Thread-35" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.lang.IllegalArgumentException: requirement failed: ps server process quit with exit code 1
        at scala.Predef$.require(Predef.scala:224)
        at org.apache.mxnet.spark.MXNet$$anonfun$org$apache$mxnet$spark$MXNet$$startPSServersInner$1$1.apply(MXNet.scala:137)
        at org.apache.mxnet.spark.MXNet$$anonfun$org$apache$mxnet$spark$MXNet$$startPSServersInner$1$1.apply(MXNet.scala:126)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:89)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
   
   Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
        at scala.Option.foreach(Option.scala:257)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1640)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1599)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1588)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1845)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1858)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1929)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:920)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:918)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
        at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:918)
        at org.apache.mxnet.spark.MXNet.org$apache$mxnet$spark$MXNet$$startPSServersInner$1(MXNet.scala:126)
        at org.apache.mxnet.spark.MXNet$$anonfun$startPSServers$1.apply(MXNet.scala:140)
        at org.apache.mxnet.spark.MXNet$$anonfun$startPSServers$1.apply(MXNet.scala:140)
        at org.apache.mxnet.spark.MXNet$MXNetControllingThread.run(MXNet.scala:38)
   Caused by: java.lang.IllegalArgumentException: requirement failed: ps server process quit with exit code 1
        at scala.Predef$.require(Predef.scala:224)
        at org.apache.mxnet.spark.MXNet$$anonfun$org$apache$mxnet$spark$MXNet$$startPSServersInner$1$1.apply(MXNet.scala:137)
        at org.apache.mxnet.spark.MXNet$$anonfun$org$apache$mxnet$spark$MXNet$$startPSServersInner$1$1.apply(MXNet.scala:126)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:89)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
   19/01/25 18:07:55 INFO BlockManagerInfo: Removed broadcast_1_piece0 on localhost:55441 in memory (size: 2.5 KB, free: 2.4 GB)
   19/01/25 18:07:55 INFO ContextCleaner: Cleaned accumulator 1
   19/01/25 18:07:56 INFO Executor: Finished task 7.0 in stage 1.0 (TID 8). 2254 bytes result sent to driver
   19/01/25 18:07:56 INFO TaskSetManager: Finished task 7.0 in stage 1.0 (TID 8) in 832 ms on localhost (1/8)
   19/01/25 18:07:57 INFO Executor: Finished task 1.0 in stage 1.0 (TID 2). 2254 bytes result sent to driver
   19/01/25 18:07:57 INFO TaskSetManager: Finished task 1.0 in stage 1.0 (TID 2) in 2266 ms on localhost (2/8)
   19/01/25 18:07:57 INFO Executor: Finished task 5.0 in stage 1.0 (TID 6). 2254 bytes result sent to driver
   19/01/25 18:07:57 INFO Executor: Finished task 3.0 in stage 1.0 (TID 4). 2254 bytes result sent to driver
   19/01/25 18:07:57 INFO Executor: Finished task 6.0 in stage 1.0 (TID 7). 2254 bytes result sent to driver
   19/01/25 18:07:57 INFO TaskSetManager: Finished task 5.0 in stage 1.0 (TID 6) in 2297 ms on localhost (3/8)
   19/01/25 18:07:57 INFO TaskSetManager: Finished task 3.0 in stage 1.0 (TID 4) in 2299 ms on localhost (4/8)
   19/01/25 18:07:57 INFO TaskSetManager: Finished task 6.0 in stage 1.0 (TID 7) in 2296 ms on localhost (5/8)
   19/01/25 18:07:57 INFO Executor: Finished task 2.0 in stage 1.0 (TID 3). 2254 bytes result sent to driver
   19/01/25 18:07:57 INFO TaskSetManager: Finished task 2.0 in stage 1.0 (TID 3) in 2329 ms on localhost (6/8)
   19/01/25 18:07:57 INFO Executor: Finished task 0.0 in stage 1.0 (TID 1). 2254 bytes result sent to driver
   19/01/25 18:07:57 INFO TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 2403 ms on localhost (7/8)
   19/01/25 18:07:57 INFO Executor: Finished task 4.0 in stage 1.0 (TID 5). 2254 bytes result sent to driver
   19/01/25 18:07:57 INFO TaskSetManager: Finished task 4.0 in stage 1.0 (TID 5) in 2411 ms on localhost (8/8)
   19/01/25 18:07:57 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool
   19/01/25 18:07:57 INFO DAGScheduler: ShuffleMapStage 1 (repartition at MXNet.scala:251) finished in 2.425 s
   19/01/25 18:07:57 INFO DAGScheduler: looking for newly runnable stages
   19/01/25 18:07:57 INFO DAGScheduler: running: Set()
   19/01/25 18:07:57 INFO DAGScheduler: waiting: Set(ResultStage 2)
   19/01/25 18:07:57 INFO DAGScheduler: failed: Set()
   19/01/25 18:07:57 INFO DAGScheduler: Submitting ResultStage 2 (MapPartitionsRDD[8] at mapPartitions at MXNet.scala:209), which has no missing parents
   19/01/25 18:07:57 INFO MemoryStore: Block broadcast_3 stored as values in memory (estimated size 7.2 KB, free 2.4 GB)
   19/01/25 18:07:57 INFO MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 3.6 KB, free 2.4 GB)
   19/01/25 18:07:57 INFO BlockManagerInfo: Added broadcast_3_piece0 in memory on localhost:55441 (size: 3.6 KB, free: 2.4 GB)
   19/01/25 18:07:57 INFO SparkContext: Created broadcast 3 from broadcast at DAGScheduler.scala:1006
   19/01/25 18:07:57 INFO DAGScheduler: Submitting 2 missing tasks from ResultStage 2 (MapPartitionsRDD[8] at mapPartitions at MXNet.scala:209)
   19/01/25 18:07:57 INFO TaskSchedulerImpl: Adding task set 2.0 with 2 tasks
   19/01/25 18:07:57 INFO TaskSetManager: Starting task 0.0 in stage 2.0 (TID 9, localhost, partition 0,NODE_LOCAL, 2355 bytes)
   19/01/25 18:07:57 INFO TaskSetManager: Starting task 1.0 in stage 2.0 (TID 10, localhost, partition 1,NODE_LOCAL, 2355 bytes)
   19/01/25 18:07:57 INFO Executor: Running task 0.0 in stage 2.0 (TID 9)
   19/01/25 18:07:57 INFO Executor: Running task 1.0 in stage 2.0 (TID 10)
   19/01/25 18:07:57 INFO CacheManager: Partition rdd_8_0 not found, computing it
   19/01/25 18:07:57 INFO CacheManager: Partition rdd_8_1 not found, computing it
   19/01/25 18:07:57 INFO ShuffleBlockFetcherIterator: Getting 8 non-empty blocks out of 8 blocks
   19/01/25 18:07:57 INFO ShuffleBlockFetcherIterator: Getting 8 non-empty blocks out of 8 blocks
   19/01/25 18:07:57 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 7 ms
   19/01/25 18:07:57 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 7 ms
   19/01/25 18:08:01 WARN WarnIfNotDisposed: LEAK: [one-time warning] An instance of org.apache.mxnet.NDArray was not disposed. Set property mxnet.traceLeakedObjects to true to enable tracing
   19/01/25 18:08:02 INFO MXNet: Launching worker ...
   19/01/25 18:08:02 INFO MXNet: Batch 128
   19/01/25 18:08:06 INFO MXNet: Launching worker ...
   19/01/25 18:08:06 INFO MXNet: Batch 128
   19/01/25 18:08:22 ERROR Executor: Exception in task 1.0 in stage 2.0 (TID 10)
   org.apache.mxnet.MXNetError: [18:08:22] include/mxnet/kvstore.h:273: compile with USE_DIST_KVSTORE=1 to init parameter server's environment

   Stack trace returned 5 entries:
   [bt] (0) 0   libmxnet.so                         0x0000000124656ab5 dmlc::StackTrace() + 261
   [bt] (1) 1   libmxnet.so                         0x000000012465686f dmlc::LogMessageFatal::~LogMessageFatal() + 47
   [bt] (2) 2   libmxnet.so                         0x0000000125cd87c3 MXInitPSEnv + 659
   [bt] (3) 3   mxnet-scala                         0x000000012114c86b Java_org_apache_mxnet_LibInfo_mxInitPSEnv + 331
   [bt] (4) 4   ???                                 0x00000001055ea667 0x0 + 4385056359


        at org.apache.mxnet.Base$.checkCall(Base.scala:111)
        at org.apache.mxnet.KVStoreServer$.init(KVStoreServer.scala:108)
        at org.apache.mxnet.spark.MXNet.org$apache$mxnet$spark$MXNet$$setupKVStore(MXNet.scala:190)
        at org.apache.mxnet.spark.MXNet$$anonfun$1.apply(MXNet.scala:228)
        at org.apache.mxnet.spark.MXNet$$anonfun$1.apply(MXNet.scala:209)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
        at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:69)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:268)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:89)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
   19/01/25 18:08:22 WARN TaskSetManager: Lost task 1.0 in stage 2.0 (TID 10, localhost): org.apache.mxnet.MXNetError: [18:08:22] include/mxnet/kvstore.h:273: compile with USE_DIST_KVSTORE=1 to init parameter server's environment

   Stack trace returned 5 entries:
   [bt] (0) 0   libmxnet.so                         0x0000000124656ab5 dmlc::StackTrace() + 261
   [bt] (1) 1   libmxnet.so                         0x000000012465686f dmlc::LogMessageFatal::~LogMessageFatal() + 47
   [bt] (2) 2   libmxnet.so                         0x0000000125cd87c3 MXInitPSEnv + 659
   [bt] (3) 3   mxnet-scala                         0x000000012114c86b Java_org_apache_mxnet_LibInfo_mxInitPSEnv + 331
   [bt] (4) 4   ???                                 0x00000001055ea667 0x0 + 4385056359


        at org.apache.mxnet.Base$.checkCall(Base.scala:111)
        at org.apache.mxnet.KVStoreServer$.init(KVStoreServer.scala:108)
        at org.apache.mxnet.spark.MXNet.org$apache$mxnet$spark$MXNet$$setupKVStore(MXNet.scala:190)
        at org.apache.mxnet.spark.MXNet$$anonfun$1.apply(MXNet.scala:228)
        at org.apache.mxnet.spark.MXNet$$anonfun$1.apply(MXNet.scala:209)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
        at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:69)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:268)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:89)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
   
   19/01/25 18:08:22 ERROR TaskSetManager: Task 1 in stage 2.0 failed 1 times; aborting job
   19/01/25 18:08:22 INFO TaskSchedulerImpl: Cancelling stage 2
   19/01/25 18:08:22 INFO Executor: Executor is trying to kill task 0.0 in stage 2.0 (TID 9)
   19/01/25 18:08:22 INFO TaskSchedulerImpl: Stage 2 was cancelled
   19/01/25 18:08:22 INFO DAGScheduler: ResultStage 2 (foreachPartition at MXNet.scala:238) failed in 24.387 s
   19/01/25 18:08:22 INFO DAGScheduler: Job 1 failed: foreachPartition at MXNet.scala:238, took 26.904674 s
   
   Job aborted due to stage failure: Task 1 in stage 2.0 failed 1 times, most recent failure: Lost task 1.0 in stage 2.0 (TID 10, localhost): org.apache.mxnet.MXNetError: [18:08:22] include/mxnet/kvstore.h:273: compile with USE_DIST_KVSTORE=1 to init parameter server's environment

   Stack trace returned 5 entries:
   [bt] (0) 0   libmxnet.so                         0x0000000124656ab5 dmlc::StackTrace() + 261
   [bt] (1) 1   libmxnet.so                         0x000000012465686f dmlc::LogMessageFatal::~LogMessageFatal() + 47
   [bt] (2) 2   libmxnet.so                         0x0000000125cd87c3 MXInitPSEnv + 659
   [bt] (3) 3   mxnet-scala                         0x000000012114c86b Java_org_apache_mxnet_LibInfo_mxInitPSEnv + 331
   [bt] (4) 4   ???                                 0x00000001055ea667 0x0 + 4385056359


        at org.apache.mxnet.Base$.checkCall(Base.scala:111)
        at org.apache.mxnet.KVStoreServer$.init(KVStoreServer.scala:108)
        at org.apache.mxnet.spark.MXNet.org$apache$mxnet$spark$MXNet$$setupKVStore(MXNet.scala:190)
        at org.apache.mxnet.spark.MXNet$$anonfun$1.apply(MXNet.scala:228)
        at org.apache.mxnet.spark.MXNet$$anonfun$1.apply(MXNet.scala:209)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
        at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:69)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:268)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:89)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
   
   Driver stacktrace:
   org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 2.0 failed 1 times, most recent failure: Lost task 1.0 in stage 2.0 (TID 10, localhost): org.apache.mxnet.MXNetError: [18:08:22] include/mxnet/kvstore.h:273: compile with USE_DIST_KVSTORE=1 to init parameter server's environment

   Stack trace returned 5 entries:
   [bt] (0) 0   libmxnet.so                         0x0000000124656ab5 dmlc::StackTrace() + 261
   [bt] (1) 1   libmxnet.so                         0x000000012465686f dmlc::LogMessageFatal::~LogMessageFatal() + 47
   [bt] (2) 2   libmxnet.so                         0x0000000125cd87c3 MXInitPSEnv + 659
   [bt] (3) 3   mxnet-scala                         0x000000012114c86b Java_org_apache_mxnet_LibInfo_mxInitPSEnv + 331
   [bt] (4) 4   ???                                 0x00000001055ea667 0x0 + 4385056359


        at org.apache.mxnet.Base$.checkCall(Base.scala:111)
        at org.apache.mxnet.KVStoreServer$.init(KVStoreServer.scala:108)
        at org.apache.mxnet.spark.MXNet.org$apache$mxnet$spark$MXNet$$setupKVStore(MXNet.scala:190)
        at org.apache.mxnet.spark.MXNet$$anonfun$1.apply(MXNet.scala:228)
        at org.apache.mxnet.spark.MXNet$$anonfun$1.apply(MXNet.scala:209)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
        at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:69)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:268)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:89)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
   
   Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
        at scala.Option.foreach(Option.scala:257)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1640)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1599)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1588)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1845)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1858)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1929)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:920)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:918)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
        at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:918)
        at org.apache.mxnet.spark.MXNet.trainModel(MXNet.scala:238)
        at org.apache.mxnet.spark.MXNet.fit(MXNet.scala:260)
        at org.apache.mxnet.spark.MXNetGeneralSuite$$anonfun$3.apply(MXNetGeneralSuite.scala:63)
        at org.apache.mxnet.spark.MXNetGeneralSuite$$anonfun$3.apply(MXNetGeneralSuite.scala:61)
        at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
        at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
        at org.scalatest.Transformer.apply(Transformer.scala:22)
        at org.scalatest.Transformer.apply(Transformer.scala:20)
        at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
        at org.scalatest.TestSuite$class.withFixture(TestSuite.scala:196)
        at org.scalatest.FunSuite.withFixture(FunSuite.scala:1560)
        at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
        at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
        at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
        at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
        at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
        at org.apache.mxnet.spark.MXNetGeneralSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(MXNetGeneralSuite.scala:30)
        at org.scalatest.BeforeAndAfterEach$class.runTest(BeforeAndAfterEach.scala:221)
        at org.apache.mxnet.spark.MXNetGeneralSuite.runTest(MXNetGeneralSuite.scala:30)
        at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
        at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
        at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
        at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
        at scala.collection.immutable.List.foreach(List.scala:381)
        at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
        at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
        at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
        at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
        at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
        at org.scalatest.Suite$class.run(Suite.scala:1147)
        at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
        at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
        at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
        at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
        at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
        at org.apache.mxnet.spark.MXNetGeneralSuite.org$scalatest$BeforeAndAfterAll$$super$run(MXNetGeneralSuite.scala:30)
        at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
        at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
        at org.apache.mxnet.spark.MXNetGeneralSuite.run(MXNetGeneralSuite.scala:30)
        at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
        at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$1.apply(Runner.scala:1340)
        at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$1.apply(Runner.scala:1334)
        at scala.collection.immutable.List.foreach(List.scala:381)
        at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1334)
        at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1011)
        at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1010)
        at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1500)
        at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:1010)
        at org.scalatest.tools.Runner$.run(Runner.scala:850)
        at org.scalatest.tools.Runner.run(Runner.scala)
        at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2(ScalaTestRunner.java:131)
        at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:28)
   Caused by: org.apache.mxnet.MXNetError: [18:08:22] 
include/mxnet/kvstore.h:273: compile with USE_DIST_KVSTORE=1 to init parameter 
server's environment
   
   Stack trace returned 5 entries:
   [bt] (0) 0   libmxnet.so                         0x0000000124656ab5 dmlc::StackTrace() + 261
   [bt] (1) 1   libmxnet.so                         0x000000012465686f dmlc::LogMessageFatal::~LogMessageFatal() + 47
   [bt] (2) 2   libmxnet.so                         0x0000000125cd87c3 MXInitPSEnv + 659
   [bt] (3) 3   mxnet-scala                         0x000000012114c86b Java_org_apache_mxnet_LibInfo_mxInitPSEnv + 331
   [bt] (4) 4   ???                                 0x00000001055ea667 0x0 + 4385056359
   
   
        at org.apache.mxnet.Base$.checkCall(Base.scala:111)
        at org.apache.mxnet.KVStoreServer$.init(KVStoreServer.scala:108)
        at org.apache.mxnet.spark.MXNet.org$apache$mxnet$spark$MXNet$$setupKVStore(MXNet.scala:190)
        at org.apache.mxnet.spark.MXNet$$anonfun$1.apply(MXNet.scala:228)
        at org.apache.mxnet.spark.MXNet$$anonfun$1.apply(MXNet.scala:209)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
        at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:69)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:268)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:89)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
   
   19/01/25 18:08:22 INFO SparkUI: Stopped Spark web UI at http://192.168.0.20:4041
   19/01/25 18:08:22 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
   19/01/25 18:08:22 INFO MemoryStore: MemoryStore cleared
   19/01/25 18:08:22 INFO BlockManager: BlockManager stopped
   19/01/25 18:08:22 INFO BlockManagerMaster: BlockManagerMaster stopped
   19/01/25 18:08:22 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
   19/01/25 18:08:22 INFO SparkContext: Successfully stopped SparkContext
   19/01/25 18:08:22 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
   19/01/25 18:08:22 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
   19/01/25 18:08:22 INFO NativeLibraryLoader: Deleting /var/folders/q5/fkq1x5gj5l73cznw53c263wr0000gn/T/mxnet2451649713761743801/mxnet-scala
   19/01/25 18:08:22 INFO NativeLibraryLoader: Deleting /var/folders/q5/fkq1x5gj5l73cznw53c263wr0000gn/T/mxnet2451649713761743801/libmxnet.so
   19/01/25 18:08:22 INFO NativeLibraryLoader: Deleting /var/folders/q5/fkq1x5gj5l73cznw53c263wr0000gn/T/mxnet2451649713761743801
   19/01/25 18:08:22 INFO ShutdownHookManager: Shutdown hook called
   19/01/25 18:08:22 INFO ShutdownHookManager: Deleting directory /private/var/folders/q5/fkq1x5gj5l73cznw53c263wr0000gn/T/spark-b3cbf154-c098-49dc-8232-3a97b7e43bc3
   
   Process finished with exit code 0
   ```
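   The root cause in the trace is `org.apache.mxnet.MXNetError: include/mxnet/kvstore.h:273: compile with USE_DIST_KVSTORE=1 to init parameter server's environment` — the Spark integration starts a distributed KVStore, but the `libmxnet.so` being loaded was built without parameter-server support. A minimal sketch of a rebuild, assuming a source checkout of incubator-mxnet and the classic `make`-based build of that era (exact BLAS/OpenCV flags depend on your platform and config.mk):

   ```shell
   # Rebuild the native library with distributed KVStore support enabled.
   cd incubator-mxnet
   make clean
   make -j4 USE_DIST_KVSTORE=1

   # Rebuild the Scala package so it bundles the new libmxnet.so.
   cd scala-package
   mvn install -DskipTests
   ```

   After rebuilding, also clear any stale extracted copy of `libmxnet.so` under the temp directory shown in the NativeLibraryLoader log lines, so the test run picks up the new build.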
