Hi,

I'm trying to use native BLAS, and I followed all the threads I saw here, but I still can't get rid of these warnings:

WARN netlib.BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS
WARN netlib.BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeRefBLAS
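As I understand it, when both loads fail, netlib-java silently falls back to its pure-JVM F2jBLAS implementation. One thing I haven't tried yet is pinning a single implementation via netlib-java's system property (property name taken from the netlib-java README; the spark-submit flags are standard Spark configuration), so the load problem surfaces clearly instead of just falling back. Something like:

```shell
# Untried sketch: force netlib-java to use NativeSystemBLAS only,
# on both the driver and the executors, so any load failure is
# easier to pin down. Fill in master/deploy-mode/jar as usual.
spark-submit \
  --conf "spark.driver.extraJavaOptions=-Dcom.github.fommil.netlib.BLAS=com.github.fommil.netlib.NativeSystemBLAS" \
  --conf "spark.executor.extraJavaOptions=-Dcom.github.fommil.netlib.BLAS=com.github.fommil.netlib.NativeSystemBLAS" \
  ...
```

Would that be a reasonable way to narrow it down?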
I'm using:
- CentOS 6.5
- Java 1.7.0_71
- Spark 1.1.0, running on a YARN cluster

The things I've done so far are:

- Installed these packages:
  - lapack-devel
  - blas-devel
  - atlas-devel
  - openblas-devel
  - gcc-gfortran.x86_64
  - libgfortran.x86_64

- Added these dependencies to mllib/pom.xml:

    <dependency>
      <groupId>org.scalanlp</groupId>
      <artifactId>breeze-natives_${scala.binary.version}</artifactId>
      <version>0.9</version>
    </dependency>
    <dependency>
      <groupId>com.github.fommil.netlib</groupId>
      <artifactId>all</artifactId>
      <version>1.1.2</version>
      <type>pom</type>
    </dependency>

  (I know the -Pnetlib-lgpl flag should have taken care of the netlib dependency, but for some reason it didn't work.)

- Built Spark with this command:

    mvn -Pyarn -Phive -Phadoop-2.5 -Pnetlib-lgpl -Dhadoop.version=2.5.0-cdh5.2.0 -DskipTests clean package

  (I'm building it on Windows, if that makes any difference.)

And I did the following checks:

    sudo /sbin/ldconfig -p | grep libblas.so.3
      libblas.so.3 (libc6,x86-64) => /usr/lib64/libblas.so.3

    sudo /sbin/ldconfig -p | grep liblapack.so.3
      liblapack.so.3 (libc6,x86-64) => /usr/lib64/liblapack.so.3
      liblapack.so.3 (libc6,x86-64) => /usr/lib64/atlas/liblapack.so.3

    jar tf assembly\target\scala-2.10\spark-assembly-1.1.0-cdh5.2.0-hadoop2.5.0-cdh5.2.0.jar | grep "netlib-native_system-linux-x86_64.so"
      netlib-native_system-linux-x86_64.so
      netlib-native_system-linux-x86_64.so.asc

Anything else I can try?

Thanks,
Lev.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Using-native-blas-with-mllib-tp21156.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
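One more check that might be worth adding to the list above (a sketch; the paths are the ones ldconfig reported on my machine, so adjust as needed): since NativeSystemBLAS loads the generic libblas.so.3 / liblapack.so.3 names, it matters which concrete library those symlinks ultimately resolve to, and whether libgfortran is visible to the dynamic linker at runtime.

```shell
# Follow the symlink chain to see which BLAS build actually backs
# the generic name that netlib-java's NativeSystemBLAS loads.
readlink -f /usr/lib64/libblas.so.3
readlink -f /usr/lib64/liblapack.so.3

# The system natives also need libgfortran at runtime; confirm the
# dynamic linker can find it (no output here would be a red flag).
/sbin/ldconfig -p | grep libgfortran
```

If libblas.so.3 resolves to the plain reference blas package rather than the ATLAS or OpenBLAS build, that would explain slow results even once the warnings go away, though not the load failure itself.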