Re: libraryDependencies

2016-07-27 Thread Jacek Laskowski
Hi,

How did you reference "sparksample"? If it ended up in
/Users/studio/.sbt/0.13/staging/42f93875138543b4e1d3/sparksample I
believe it was referenced as a git-based project in sbt. Is that
correct?
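
For reference, a git-based project reference in sbt 0.13 usually looks
something like the sketch below (the URL is only a placeholder):

lazy val sparksample = RootProject(uri("https://github.com/someuser/sparksample.git"))

lazy val root = (project in file(".")).dependsOn(sparksample)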

Also, when you mark the Spark libs as "provided" you won't be able to run
Spark apps from within sbt. See
https://github.com/sbt/sbt-assembly#-provided-configuration. The trick
is to create a test app that executes the main of your standalone app.
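
A minimal sketch of that trick (names are illustrative): "provided"
dependencies are still on the Test classpath, so a tiny forwarder under
src/test/scala can be launched with "sbt test:run" even though "sbt run"
cannot see the Spark classes.

// src/test/scala/MyAppRunner.scala (sketch)
object MyAppRunner {
  // Forward to the standalone app's main; Spark resolves here because
  // "provided" libraries are part of the test classpath.
  def main(args: Array[String]): Unit = MyApp.main(args)
}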

Regards,
Jacek Laskowski

https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Tue, Jul 26, 2016 at 9:18 PM, Martin Somers  wrote:
>
> my build file looks like
>
> libraryDependencies ++= Seq(
>   // other dependencies here
>   "org.apache.spark" %% "spark-core" % "1.6.2" % "provided",
>   "org.apache.spark" %% "spark-mllib_2.11" % "1.6.0",
>   "org.scalanlp" % "breeze_2.11" % "0.7",
>   // native libraries are not included by default. add this if you want them (as of 0.7)
>   // native libraries greatly improve performance, but increase jar sizes.
>   "org.scalanlp" % "breeze-natives_2.11" % "0.7"
> )
>
> Not 100% sure if the version numbers are indeed correct.
> I'm getting this error:
>
> [info] Resolving jline#jline;2.12.1 ...
> [info] Done updating.
> [info] Compiling 1 Scala source to
> /Users/studio/.sbt/0.13/staging/42f93875138543b4e1d3/sparksample/target/scala-2.11/classes...
> [error]
> /Users/studio/.sbt/0.13/staging/42f93875138543b4e1d3/sparksample/src/main/scala/MyApp.scala:2:
> object mllib is not a member of package org.apache.spark
> [error] import org.apache.spark.mllib.linalg.distributed.RowMatrix
> 
> ...
>
>
> I'm trying to import:
>
> import org.apache.spark.mllib.linalg.distributed.RowMatrix
> import org.apache.spark.mllib.linalg.SingularValueDecomposition
>
> import org.apache.spark.mllib.linalg.{Vector, Vectors}
>
>
> import breeze.linalg._
> import breeze.linalg.{ Matrix => B_Matrix }
> import breeze.linalg.{ Vector => B_Vector }
> import breeze.linalg.DenseMatrix
>
> object MyApp {
>   def main(args: Array[String]): Unit = {
>     // code here
>   }
> }
>
>
> It might not be the correct way of doing this.
>
> Anyone got any suggestions?
> tks
> M
>
>
>




Re: libraryDependencies

2016-07-26 Thread Michael Armbrust
libraryDependencies ++= Seq(
  // other dependencies here
  "org.apache.spark" %% "spark-core" % "1.6.2" % "provided",
  "org.apache.spark" %% "spark-mllib" % "1.6.2" % "provided",
  "org.scalanlp" %% "breeze" % "0.12",
  // native libraries are not included by default. add this if you want them (as of 0.7)
  // native libraries greatly improve performance, but increase jar sizes.
  "org.scalanlp" %% "breeze-natives" % "0.12"
)
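
For completeness, a self-contained build.sbt along these lines might look
like the sketch below (the project name and scalaVersion are assumptions);
with %%, sbt appends the Scala binary version itself, so it resolves e.g.
spark-mllib_2.11 and the suffix must not be written by hand:

name := "sparksample"

scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.2" % "provided",
  "org.apache.spark" %% "spark-mllib" % "1.6.2" % "provided",
  "org.scalanlp" %% "breeze" % "0.12",
  "org.scalanlp" %% "breeze-natives" % "0.12"
)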

On Tue, Jul 26, 2016 at 12:49 PM, Martin Somers  wrote:

> cheers - I updated
>
> libraryDependencies ++= Seq(
>   // other dependencies here
>   "org.apache.spark" %% "spark-core" % "1.6.2" % "provided",
>   "org.apache.spark" %% "spark-mllib_2.10" % "1.6.2",
>   "org.scalanlp" %% "breeze" % "0.12",
>   // native libraries are not included by default. add this if you want them (as of 0.7)
>   // native libraries greatly improve performance, but increase jar sizes.
>   "org.scalanlp" %% "breeze-natives" % "0.12"
> )
>
> and getting similar error
>
> Compiling 1 Scala source to
> /Users/studio/.sbt/0.13/staging/42f93875138543b4e1d3/sparksample/target/scala-2.11/classes...
> [error]
> /Users/studio/.sbt/0.13/staging/42f93875138543b4e1d3/sparksample/src/main/scala/MyApp.scala:2:
> object mllib is not a member of package org.apache.spark
> [error] import org.apache.spark.mllib.linalg.distributed.RowMatrix
> [error] ^
> [error]
> /Users/studio/.sbt/0.13/staging/42f93875138543b4e1d3/sparksample/src/main/scala/MyApp.scala:3:
> object mllib is not a member of package org.apache.spark
> [error] import org.apache.spark.mllib.linalg.SingularValueDecomposition
> [error] ^
> [error]
> /Users/studio/.sbt/0.13/staging/42f93875138543b4e1d3/sparksample/src/main/scala/MyApp.scala:5:
> object mllib is not a member of package org.apache.spark
> [error] import org.apache.spark.mllib.linalg.{Vector, Vectors}
> [error] ^
> [error]
> /Users/studio/.sbt/0.13/staging/42f93875138543b4e1d3/sparksample/src/main/scala/MyApp.scala:8:
> not found: object breeze
>
> On Tue, Jul 26, 2016 at 8:36 PM, Michael Armbrust 
> wrote:
>
>> Also, you'll want all of the various spark versions to be the same.
>>
>> On Tue, Jul 26, 2016 at 12:34 PM, Michael Armbrust <
>> mich...@databricks.com> wrote:
>>
>>> If you are using %% (double) then you do not need _2.11.


Re: libraryDependencies

2016-07-26 Thread Martin Somers
cheers - I updated

libraryDependencies ++= Seq(
  // other dependencies here
  "org.apache.spark" %% "spark-core" % "1.6.2" % "provided",
  "org.apache.spark" %% "spark-mllib_2.10" % "1.6.2",
  "org.scalanlp" %% "breeze" % "0.12",
  // native libraries are not included by default. add this if you want them (as of 0.7)
  // native libraries greatly improve performance, but increase jar sizes.
  "org.scalanlp" %% "breeze-natives" % "0.12"
)

and I'm getting a similar error:

Compiling 1 Scala source to
/Users/studio/.sbt/0.13/staging/42f93875138543b4e1d3/sparksample/target/scala-2.11/classes...
[error]
/Users/studio/.sbt/0.13/staging/42f93875138543b4e1d3/sparksample/src/main/scala/MyApp.scala:2:
object mllib is not a member of package org.apache.spark
[error] import org.apache.spark.mllib.linalg.distributed.RowMatrix
[error] ^
[error]
/Users/studio/.sbt/0.13/staging/42f93875138543b4e1d3/sparksample/src/main/scala/MyApp.scala:3:
object mllib is not a member of package org.apache.spark
[error] import org.apache.spark.mllib.linalg.SingularValueDecomposition
[error] ^
[error]
/Users/studio/.sbt/0.13/staging/42f93875138543b4e1d3/sparksample/src/main/scala/MyApp.scala:5:
object mllib is not a member of package org.apache.spark
[error] import org.apache.spark.mllib.linalg.{Vector, Vectors}
[error] ^
[error]
/Users/studio/.sbt/0.13/staging/42f93875138543b4e1d3/sparksample/src/main/scala/MyApp.scala:8:
not found: object breeze

On Tue, Jul 26, 2016 at 8:36 PM, Michael Armbrust 
wrote:

> Also, you'll want all of the various spark versions to be the same.
>
> On Tue, Jul 26, 2016 at 12:34 PM, Michael Armbrust  > wrote:
>
>> If you are using %% (double) then you do not need _2.11.


-- 
M


Re: libraryDependencies

2016-07-26 Thread Michael Armbrust
Also, you'll want all of the various spark versions to be the same.
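
For instance, one way to keep them aligned is to factor the Spark version
out into a single val (a sketch; the version number is only illustrative):

val sparkVersion = "1.6.2"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-mllib" % sparkVersion % "provided"
)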

On Tue, Jul 26, 2016 at 12:34 PM, Michael Armbrust 
wrote:

> If you are using %% (double) then you do not need _2.11.