Re: anyone using netlib-java with sparkR on yarn spark1.6?

2015-11-11 Thread Tom Graves
Is there anything other than the spark assembly that needs to be in the 
classpath?  I verified the assembly was built right and it's in the classpath 
(otherwise nothing would work).

Thanks,
Tom


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



  

Re: anyone using netlib-java with sparkR on yarn spark1.6?

2015-11-11 Thread Shivaram Venkataraman
Nothing more -- the only two things I can think of are: (a) Is there
something else on the classpath that comes before the lgpl JAR? I've
seen cases where two versions of netlib-java on the classpath can mess
things up. (b) Something about the way SparkR uses reflection to
invoke the ML Pipelines code is breaking the BLAS library discovery. I
don't know of a good way to debug this yet, though.
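For (a), one quick check is to split the classpath on `:` and look for more than one netlib entry. A minimal sketch -- the classpath value below is hypothetical, purely to illustrate the conflict case:

```shell
# Hypothetical classpath containing two competing netlib JARs
CP="/opt/spark/lib/netlib-native_system-linux-x86_64-1.1-natives.jar:/home/user/netlib-java-0.9.3.jar:/opt/spark/lib/spark-assembly.jar"

# One entry per line, keeping only those that mention netlib
echo "$CP" | tr ':' '\n' | grep -i netlib

# Count them; more than one is a red flag for classpath shadowing
echo "$CP" | tr ':' '\n' | grep -ci netlib
```

On a real driver you would substitute the actual `java.class.path` value for `CP`.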

Thanks
Shivaram





Re: anyone using netlib-java with sparkR on yarn spark1.6?

2015-11-10 Thread Shivaram Venkataraman
I think this is happening in the driver. Could you check the classpath
of the JVM that gets started? If you use spark-submit on yarn, the
classpath is set up before R gets launched, so it should match the
behavior of Scala / Python.
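On YARN, the NodeManager materializes each container's environment in a generated launch_container.sh, which exports CLASSPATH before anything (including R) starts. A sketch of pulling the ordered entries out of such an export line -- the script location and the example line are assumptions for illustration, not taken from a real cluster:

```shell
# On a node you could locate the script under yarn.nodemanager.local-dirs, e.g.:
#   grep 'export CLASSPATH' .../appcache/<app>/<container>/launch_container.sh
# Given an export line like the hypothetical one below, list entries in order:
LINE='export CLASSPATH="$PWD:$PWD/spark-assembly.jar:/etc/hadoop/conf"'
echo "$LINE" | sed 's/^export CLASSPATH="//; s/"$//' | tr ':' '\n'
```

Whatever comes before the assembly in that ordering is a candidate for the shadowing described above.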

Thanks
Shivaram





anyone using netlib-java with sparkR on yarn spark1.6?

2015-11-06 Thread Tom Graves
I'm trying to use the netlib-java stuff with mllib and sparkR on yarn. I've
compiled with -Pnetlib-lgpl and see the necessary things in the spark assembly
jar.  The nodes have /usr/lib64/liblapack.so.3, /usr/lib64/libblas.so.3, and
/usr/lib/libgfortran.so.3.

Running:

data <- read.df(sqlContext, 'data.csv', 'com.databricks.spark.csv')
mdl = glm(C2~., data, family="gaussian")

But I get the error:

15/11/06 21:17:27 WARN LAPACK: Failed to load implementation from:
com.github.fommil.netlib.NativeSystemLAPACK
15/11/06 21:17:27 WARN LAPACK: Failed to load implementation from:
com.github.fommil.netlib.NativeRefLAPACK
15/11/06 21:17:27 ERROR RBackendHandler: fitRModelFormula on
org.apache.spark.ml.api.r.SparkRWrappers failed
Error in invokeJava(isStatic = TRUE, className, methodName, ...) :
  java.lang.AssertionError: assertion failed: lapack.dpotrs returned 18.
        at scala.Predef$.assert(Predef.scala:179)
        at org.apache.spark.mllib.linalg.CholeskyDecomposition$.solve(CholeskyDecomposition.scala:40)
        at org.apache.spark.ml.optim.WeightedLeastSquares.fit(WeightedLeastSquares.scala:114)
        at org.apache.spark.ml.regression.LinearRegression.train(LinearRegression.scala:166)
        at org.apache.spark.ml.regression.LinearRegression.train(LinearRegression.scala:65)
        at org.apache.spark.ml.Predictor.fit(Predictor.scala:90)
        at org.apache.spark.ml.Predictor.fit(Predictor.scala:71)
        at org.apache.spark.ml.Pipeline$$anonfun$fit$2.apply(Pipeline.scala:138)
        at org.apache.spark.ml.Pipeline$$anonfun$fit$2.apply(Pipeline.scala:134)

Anyone have this working?

Thanks,
Tom
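netlib-java's NativeSystemLAPACK loads through JNI, so those .so.3 files typically need to be resolvable on every node where the driver or executors run (libgfortran included, since Fortran-built reference BLAS/LAPACK commonly link against it). A small presence check, using the paths from the message above -- whether each exists on a given machine obviously varies:

```shell
# Verify the shared libraries the system BLAS/LAPACK loader depends on
for lib in /usr/lib64/liblapack.so.3 /usr/lib64/libblas.so.3 /usr/lib/libgfortran.so.3; do
  if [ -e "$lib" ]; then echo "found   $lib"; else echo "MISSING $lib"; fi
done
```

Running this on each node (e.g. via a YARN distributed shell or pdsh) quickly isolates hosts where the fallback warnings would fire.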