Re: sbt shenanigans for a Spark-based project

2016-11-15 Thread Marco Mistroni
Uhm, I removed the mvn repo and the ivy folder as well; sbt seems to kick in, but for
some reason it cannot 'see' org.apache.spark:spark-mllib and therefore my
compilation fails.
I have temporarily fixed it by placing the spark-mllib jar in my project's
\lib directory;
perhaps I'll try to create a brand new Spark project and see if that makes
any difference.
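
For reference, dropping the jar into \lib works because sbt treats that
folder as unmanaged dependencies out of the box. A minimal sketch in case
the folder ever needs to live elsewhere (the custom_lib name is just an
illustration):

// sbt already scans <base>/lib for unmanaged jars by default;
// this setting points it at a different folder instead
unmanagedBase := baseDirectory.value / "custom_lib"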

thanks for the assistance, Don!
kr
 marco

On Mon, Nov 14, 2016 at 11:13 PM, Don Drake  wrote:

> I would remove your entire local Maven repo (~/.m2/repository on Linux) and try
> again. I'm able to compile sample code with your build.sbt and sbt
> v0.13.12.
>
> -Don
>
> On Mon, Nov 14, 2016 at 3:11 PM, Marco Mistroni 
> wrote:
>
>> Uhm, sorry.. still the same issues. This is the new version:
>>
>> name := "SparkExamples"
>> version := "1.0"
>> scalaVersion := "2.11.8"
>> val sparkVersion = "2.0.1"
>>
>> // Add a single dependency
>> libraryDependencies += "junit" % "junit" % "4.8" % "test"
>> libraryDependencies ++= Seq("org.slf4j" % "slf4j-api" % "1.7.5",
>> "org.slf4j" % "slf4j-simple" % "1.7.5",
>> "org.clapper" %% "grizzled-slf4j" % "1.0.2")
>> libraryDependencies += "org.apache.spark"%%"spark-core"   % sparkVersion
>> libraryDependencies += "org.apache.spark"%%"spark-streaming"   %
>> sparkVersion
>> libraryDependencies += "org.apache.spark"%%"spark-mllib"   %
>> sparkVersion
>> libraryDependencies += "org.apache.spark"%%"spark-streaming-flume-sink"
>> % "2.0.1"
>> libraryDependencies += "org.apache.spark"%%"spark-sql"   % sparkVersion
>>
>>
>> resolvers += "softprops-maven" at "http://dl.bintray.com/content/softprops/maven"
>>
>> Still seeing these kinds of errors, which seem to suggest that
>> somehow sbt is getting confused..
>>
>> C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:2:
>> object mllib is not a member of package org.apache.spark
>> [error] import org.apache.spark.mllib.linalg.{ Vector, Vectors }
>> [error] ^
>> [error] 
>> C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:3:
>> object mllib is not a member of package org.apache.spark
>> [error] import org.apache.spark.mllib.regression.LabeledPoint
>> [error] ^
>> [error] 
>> C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:4:
>> object classification is not a member of package org.apache.spark.ml
>> [error] import org.apache.spark.ml.classification.{
>> RandomForestClassifier, RandomForestClassificationModel }
>> [error]^
>> [error] 
>> C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:6:
>> object feature is not a member of package org.apache.spark.ml
>> [error] import org.apache.spark.ml.feature.{ StringIndexer,
>> IndexToString, VectorIndexer, VectorAssembler }
>> [error]^
>> [error] 
>> C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:7:
>> object evaluation is not a member of package org.apache.spark.ml
>> [error] import org.apache.spark.ml.evaluation.{ RegressionEvaluator,
>> MulticlassClassificationEvaluator }
>> [error]^
>> [error] 
>> C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:8:
>> object classification is not a member of package org.apache.spark.ml
>> [error] import org.apache.spark.ml.classification._
>> [error]^
>> [error] 
>> C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:9:
>> object tuning is not a member of package org.apache.spark.ml
>> [error] import org.apache.spark.ml.tuning.{ CrossValidator,
>> ParamGridBuilder }
>> [error]^
>> [error] 
>> C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:10:
>> object tuning is not a member of package org.apache.spark.ml
>> [error] import org.apache.spark.ml.tuning.{ ParamGridBuilder,
>> TrainValidationSplit }
>> [error]^
>> [error] 
>> C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:16:
>> object Pipeline is not a member of package org.apache.spark.ml
>> [error] import org.apache.spark.ml.{ Pipeline, PipelineModel }
>>
>> any other hints?
>>
>> thanks and regards
>>  marco
>>
>>
>>
>>
>> On Sun, Nov 13, 2016 at 10:52 PM, Don Drake  wrote:
>>
>>> I would upgrade your Scala version to 2.11.8 as Spark 2.0 uses Scala
>>> 2.11 by default.
>>>
>>> On Sun, Nov 13, 2016 at 3:01 PM, Marco Mistroni 
>>> wrote:
>>>
 Hi all,
  I have a small Spark-based project which at the moment depends on jars
 from Spark 1.6.0.
 The project has a few Spark examples plus one which depends on Flume
 libraries.


 I am attempting to move to Spark 2.0, but I am having issues with
 my dependencies.
 The setup below works fine when compiled against the 1.6.0 dependencies:

 

Re: sbt shenanigans for a Spark-based project

2016-11-14 Thread Don Drake
I would remove your entire local Maven repo (~/.m2/repository on Linux) and try
again. I'm able to compile sample code with your build.sbt and sbt
v0.13.12.
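
If it still fails after that, a few tasks at the sbt prompt usually show
where resolution goes wrong (sbt 0.13; the notes in parentheses are just
descriptions):

  clean         (wipe the compiled output)
  update        (re-run dependency resolution from scratch)
  evicted       (detail the version conflicts behind the [warn] lines)
  show update   (list the artifacts that were actually resolved)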

-Don

On Mon, Nov 14, 2016 at 3:11 PM, Marco Mistroni  wrote:

> Uhm, sorry.. still the same issues. This is the new version:
>
> name := "SparkExamples"
> version := "1.0"
> scalaVersion := "2.11.8"
> val sparkVersion = "2.0.1"
>
> // Add a single dependency
> libraryDependencies += "junit" % "junit" % "4.8" % "test"
> libraryDependencies ++= Seq("org.slf4j" % "slf4j-api" % "1.7.5",
> "org.slf4j" % "slf4j-simple" % "1.7.5",
> "org.clapper" %% "grizzled-slf4j" % "1.0.2")
> libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion
> libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion
> libraryDependencies += "org.apache.spark" %% "spark-mllib" % sparkVersion
> libraryDependencies += "org.apache.spark" %% "spark-streaming-flume-sink" % "2.0.1"
> libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion
>
>
> resolvers += "softprops-maven" at "http://dl.bintray.com/content/softprops/maven"
>
> Still seeing these kinds of errors, which seem to suggest that
> somehow sbt is getting confused..
>
> C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:2:
> object mllib is not a member of package org.apache.spark
> [error] import org.apache.spark.mllib.linalg.{ Vector, Vectors }
> [error] ^
> [error] 
> C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:3:
> object mllib is not a member of package org.apache.spark
> [error] import org.apache.spark.mllib.regression.LabeledPoint
> [error] ^
> [error] 
> C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:4:
> object classification is not a member of package org.apache.spark.ml
> [error] import org.apache.spark.ml.classification.{
> RandomForestClassifier, RandomForestClassificationModel }
> [error]^
> [error] 
> C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:6:
> object feature is not a member of package org.apache.spark.ml
> [error] import org.apache.spark.ml.feature.{ StringIndexer, IndexToString,
> VectorIndexer, VectorAssembler }
> [error]^
> [error] 
> C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:7:
> object evaluation is not a member of package org.apache.spark.ml
> [error] import org.apache.spark.ml.evaluation.{ RegressionEvaluator,
> MulticlassClassificationEvaluator }
> [error]^
> [error] 
> C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:8:
> object classification is not a member of package org.apache.spark.ml
> [error] import org.apache.spark.ml.classification._
> [error]^
> [error] 
> C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:9:
> object tuning is not a member of package org.apache.spark.ml
> [error] import org.apache.spark.ml.tuning.{ CrossValidator,
> ParamGridBuilder }
> [error]^
> [error] C:\Users\marco\SparkExamples\src\main\scala\
> DecisionTreeExampleML.scala:10: object tuning is not a member of package
> org.apache.spark.ml
> [error] import org.apache.spark.ml.tuning.{ ParamGridBuilder,
> TrainValidationSplit }
> [error]^
> [error] C:\Users\marco\SparkExamples\src\main\scala\
> DecisionTreeExampleML.scala:16: object Pipeline is not a member of
> package org.apache.spark.ml
> [error] import org.apache.spark.ml.{ Pipeline, PipelineModel }
>
> any other hints?
>
> thanks and regards
>  marco
>
>
>
>
> On Sun, Nov 13, 2016 at 10:52 PM, Don Drake  wrote:
>
>> I would upgrade your Scala version to 2.11.8 as Spark 2.0 uses Scala 2.11
>> by default.
>>
>> On Sun, Nov 13, 2016 at 3:01 PM, Marco Mistroni 
>> wrote:
>>
>>> Hi all,
>>>  I have a small Spark-based project which at the moment depends on jars
>>> from Spark 1.6.0.
>>> The project has a few Spark examples plus one which depends on Flume
>>> libraries.
>>>
>>>
>>> I am attempting to move to Spark 2.0, but I am having issues with
>>> my dependencies.
>>> The setup below works fine when compiled against the 1.6.0 dependencies:
>>>
>>> name := "SparkExamples"
>>> version := "1.0"
>>> scalaVersion := "2.10.5"
>>> val sparkVersion = "1.6.0"
>>>
>>>
>>> // Add a single dependency
>>> libraryDependencies += "junit" % "junit" % "4.8" % "test"
>>> libraryDependencies ++= Seq("org.slf4j" % "slf4j-api" % "1.7.5",
>>> "org.slf4j" % "slf4j-simple" % "1.7.5",
>>> "org.clapper" %% "grizzled-slf4j" % "1.0.2")
>>> libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion
>>> libraryDependencies += "org.apache.spark" %% "spark-streaming" %
>>> 

Re: sbt shenanigans for a Spark-based project

2016-11-14 Thread Marco Mistroni
Uhm, sorry.. still the same issues. This is the new version:

name := "SparkExamples"
version := "1.0"
scalaVersion := "2.11.8"
val sparkVersion = "2.0.1"

// Add a single dependency
libraryDependencies += "junit" % "junit" % "4.8" % "test"
libraryDependencies ++= Seq("org.slf4j" % "slf4j-api" % "1.7.5",
"org.slf4j" % "slf4j-simple" % "1.7.5",
"org.clapper" %% "grizzled-slf4j" % "1.0.2")
libraryDependencies += "org.apache.spark"%%"spark-core"   % sparkVersion
libraryDependencies += "org.apache.spark"%%"spark-streaming"   %
sparkVersion
libraryDependencies += "org.apache.spark"%%"spark-mllib"   % sparkVersion
libraryDependencies += "org.apache.spark"%%"spark-streaming-flume-sink" %
"2.0.1"
libraryDependencies += "org.apache.spark"%%"spark-sql"   % sparkVersion


resolvers += "softprops-maven" at "http://dl.bintray.com/content/softprops/maven"
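
For what it's worth, an equivalent and slightly tidier way to declare the
same Spark modules (same content; the hard-coded "2.0.1" on flume-sink
matches sparkVersion here anyway):

libraryDependencies ++= Seq(
  "spark-core", "spark-streaming", "spark-mllib",
  "spark-streaming-flume-sink", "spark-sql"
).map(module => "org.apache.spark" %% module % sparkVersion)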

Still seeing these kinds of errors, which seem to suggest that
somehow sbt is getting confused..

C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:2:
object mllib is not a member of package org.apache.spark
[error] import org.apache.spark.mllib.linalg.{ Vector, Vectors }
[error] ^
[error]
C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:3:
object mllib is not a member of package org.apache.spark
[error] import org.apache.spark.mllib.regression.LabeledPoint
[error] ^
[error]
C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:4:
object classification is not a member of package org.apache.spark.ml
[error] import org.apache.spark.ml.classification.{ RandomForestClassifier,
RandomForestClassificationModel }
[error]^
[error]
C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:6:
object feature is not a member of package org.apache.spark.ml
[error] import org.apache.spark.ml.feature.{ StringIndexer, IndexToString,
VectorIndexer, VectorAssembler }
[error]^
[error]
C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:7:
object evaluation is not a member of package org.apache.spark.ml
[error] import org.apache.spark.ml.evaluation.{ RegressionEvaluator,
MulticlassClassificationEvaluator }
[error]^
[error]
C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:8:
object classification is not a member of package org.apache.spark.ml
[error] import org.apache.spark.ml.classification._
[error]^
[error]
C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:9:
object tuning is not a member of package org.apache.spark.ml
[error] import org.apache.spark.ml.tuning.{ CrossValidator,
ParamGridBuilder }
[error]^
[error]
C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:10:
object tuning is not a member of package org.apache.spark.ml
[error] import org.apache.spark.ml.tuning.{ ParamGridBuilder,
TrainValidationSplit }
[error]^
[error]
C:\Users\marco\SparkExamples\src\main\scala\DecisionTreeExampleML.scala:16:
object Pipeline is not a member of package org.apache.spark.ml
[error] import org.apache.spark.ml.{ Pipeline, PipelineModel }
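
Both org.apache.spark.ml and org.apache.spark.mllib ship in the spark-mllib
artifact, so it looks like that one jar never reaches the compile classpath.
From the sbt prompt this can be checked (sbt 0.13; the jar name below is
what I would expect to see, not confirmed):

  show compile:dependencyClasspath   (should list spark-mllib_2.11-2.0.1.jar)
  last update                        (full log of the last resolution run)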

any other hints?

thanks and regards
 marco




On Sun, Nov 13, 2016 at 10:52 PM, Don Drake  wrote:

> I would upgrade your Scala version to 2.11.8 as Spark 2.0 uses Scala 2.11
> by default.
>
> On Sun, Nov 13, 2016 at 3:01 PM, Marco Mistroni 
> wrote:
>
>> Hi all,
>>  I have a small Spark-based project which at the moment depends on jars
>> from Spark 1.6.0.
>> The project has a few Spark examples plus one which depends on Flume
>> libraries.
>>
>>
>> I am attempting to move to Spark 2.0, but I am having issues with
>> my dependencies.
>> The setup below works fine when compiled against the 1.6.0 dependencies:
>>
>> name := "SparkExamples"
>> version := "1.0"
>> scalaVersion := "2.10.5"
>> val sparkVersion = "1.6.0"
>>
>>
>> // Add a single dependency
>> libraryDependencies += "junit" % "junit" % "4.8" % "test"
>> libraryDependencies ++= Seq("org.slf4j" % "slf4j-api" % "1.7.5",
>> "org.slf4j" % "slf4j-simple" % "1.7.5",
>> "org.clapper" %% "grizzled-slf4j" % "1.0.2")
>> libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion
>> libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion
>> libraryDependencies += "org.apache.spark" %% "spark-mllib" % sparkVersion
>> libraryDependencies += "org.apache.spark" %% "spark-streaming-flume" % "1.3.0"
>> libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion
>>
>>
>> resolvers += "softprops-maven" at "http://dl.bintray.com/content/softprops/maven"
>>
>>
>>
>> This is the build.sbt version for using Spark 2 dependencies
>>
>> name := 

Re: sbt shenanigans for a Spark-based project

2016-11-13 Thread Don Drake
I would upgrade your Scala version to 2.11.8 as Spark 2.0 uses Scala 2.11
by default.
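
The reason this matters: the %% operator appends the Scala binary version
to the artifact name, so a build stuck on 2.10 asks for _2.10 artifacts. A
minimal sketch of the expansion:

scalaVersion := "2.11.8"
// with the setting above, "%%" resolves the artifact spark-mllib_2.11:
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "2.0.1"
// which is the same as spelling the suffix out by hand:
libraryDependencies += "org.apache.spark" % "spark-mllib_2.11" % "2.0.1"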

On Sun, Nov 13, 2016 at 3:01 PM, Marco Mistroni  wrote:

> Hi all,
>  I have a small Spark-based project which at the moment depends on jars
> from Spark 1.6.0.
> The project has a few Spark examples plus one which depends on Flume
> libraries.
>
>
> I am attempting to move to Spark 2.0, but I am having issues with
> my dependencies.
> The setup below works fine when compiled against the 1.6.0 dependencies:
>
> name := "SparkExamples"
> version := "1.0"
> scalaVersion := "2.10.5"
> val sparkVersion = "1.6.0"
>
>
> // Add a single dependency
> libraryDependencies += "junit" % "junit" % "4.8" % "test"
> libraryDependencies ++= Seq("org.slf4j" % "slf4j-api" % "1.7.5",
> "org.slf4j" % "slf4j-simple" % "1.7.5",
> "org.clapper" %% "grizzled-slf4j" % "1.0.2")
> libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion
> libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion
> libraryDependencies += "org.apache.spark" %% "spark-mllib" % sparkVersion
> libraryDependencies += "org.apache.spark" %% "spark-streaming-flume" % "1.3.0"
> libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion
>
>
> resolvers += "softprops-maven" at "http://dl.bintray.com/content/softprops/maven"
>
>
>
> This is the build.sbt version for using Spark 2 dependencies
>
> name := "SparkExamples"
> version := "1.0"
> scalaVersion := "2.10.6"
> val sparkVersion = "2.0.1"
>
>
> // Add a single dependency
> libraryDependencies += "junit" % "junit" % "4.8" % "test"
> libraryDependencies ++= Seq("org.slf4j" % "slf4j-api" % "1.7.5",
> "org.slf4j" % "slf4j-simple" % "1.7.5",
> "org.clapper" %% "grizzled-slf4j" % "1.0.2")
> libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion
> libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion
> libraryDependencies += "org.apache.spark" %% "spark-mllib" % sparkVersion
> libraryDependencies += "org.apache.spark" %% "spark-streaming-flume-sink" % "2.0.1"
> libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion
> resolvers += "softprops-maven" at "http://dl.bintray.com/content/softprops/maven"
>
> but the sbt compile fails miserably... below are a few of the errors (it
> actually compiles as if I forgot all the dependencies, as it is complaining
> about all the org.apache.spark.ml and mllib packages):
>
> [warn] Multiple dependencies with the same organization/name but different
> versions. To avoid conflict, pick one version:
> [warn]  * org.apache.spark:spark-core_2.10:(1.6.1, 2.0.1)
> [warn]  * org.apache.spark:spark-streaming_2.10:(1.6.1, 2.0.1)
> [warn]  * org.apache.spark:spark-sql_2.10:(1.6.1, 2.0.1)
> [warn]  * org.apache.spark:spark-mllib_2.10:(1.6.1, 2.0.1)
> [info] Resolving org.scala-lang#scala-library;2.10.6 ...
> .
> [warn] * org.apache.spark:spark-mllib_2.10:1.6.1 -> 2.0.1
> [warn] * org.apache.spark:spark-sql_2.10:1.6.1 -> 2.0.1
> [warn] * org.apache.spark:spark-streaming_2.10:1.6.1 -> 2.0.1
> [warn] * org.apache.spark:spark-core_2.10:1.6.1 -> 2.0.1
> [warn] Run 'evicted' to see detailed eviction warnings
> [info] Compiling 18 Scala sources to C:\Users\marco\SparkExamples\
> target\scala-2.10\classes...
> [error] C:\Users\marco\SparkExamples\src\main\scala\
> AnotherDecisionTreeExample.scala:2: object mllib is not a member of
> package org.apache.spark
> [error] import org.apache.spark.mllib.linalg.{ Vector, Vectors }
> [error] ^
> [error] C:\Users\marco\SparkExamples\src\main\scala\
> AnotherDecisionTreeExample.scala:3: object mllib is not a member of
> package org.apache.spark
> [error] import org.apache.spark.mllib.regression.LabeledPoint
> [error] ^
> [error] C:\Users\marco\SparkExamples\src\main\scala\
> AnotherDecisionTreeExample.scala:4: object classification is not a member
> of package org.apache.spark.ml
> [error] import org.apache.spark.ml.classification._
> [error]^
> [error] C:\Users\marco\SparkExamples\src\main\scala\
> AnotherDecisionTreeExample.scala:5: object mllib is not a member of
> package org.apache.spark
> [error] import org.apache.spark.mllib.tree.DecisionTree
> [error] ^
> [error] C:\Users\marco\SparkExamples\src\main\scala\
> AnotherDecisionTreeExample.scala:6: object mllib is not a member of
> package org.apache.spark
> [error] import org.apache.spark.mllib.tree.model.DecisionTreeModel
> [error] ^
> [error] C:\Users\marco\SparkExamples\src\main\scala\
> AnotherDecisionTreeExample.scala:7: object mllib is not a member of
> package org.apache.spark
> [error] import org.apache.spark.mllib.util.MLUtils
> [error] ^
> [error] C:\Users\marco\SparkExamples\src\main\scala\
> 

sbt shenanigans for a Spark-based project

2016-11-13 Thread Marco Mistroni
Hi all,
 I have a small Spark-based project which at the moment depends on jars
from Spark 1.6.0.
The project has a few Spark examples plus one which depends on Flume libraries.


I am attempting to move to Spark 2.0, but I am having issues with
my dependencies.
The setup below works fine when compiled against the 1.6.0 dependencies:

name := "SparkExamples"
version := "1.0"
scalaVersion := "2.10.5"
val sparkVersion = "1.6.0"


// Add a single dependency
libraryDependencies += "junit" % "junit" % "4.8" % "test"
libraryDependencies ++= Seq("org.slf4j" % "slf4j-api" % "1.7.5",
"org.slf4j" % "slf4j-simple" % "1.7.5",
"org.clapper" %% "grizzled-slf4j" % "1.0.2")
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion
libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion
libraryDependencies += "org.apache.spark" %% "spark-mllib" % sparkVersion
libraryDependencies += "org.apache.spark" %% "spark-streaming-flume" % "1.3.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion


resolvers += "softprops-maven" at "http://dl.bintray.com/content/softprops/maven"



This is the build.sbt version for using Spark 2 dependencies

name := "SparkExamples"
version := "1.0"
scalaVersion := "2.10.6"
val sparkVersion = "2.0.1"


// Add a single dependency
libraryDependencies += "junit" % "junit" % "4.8" % "test"
libraryDependencies ++= Seq("org.slf4j" % "slf4j-api" % "1.7.5",
"org.slf4j" % "slf4j-simple" % "1.7.5",
"org.clapper" %% "grizzled-slf4j" % "1.0.2")
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion
libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion
libraryDependencies += "org.apache.spark" %% "spark-mllib" % sparkVersion
libraryDependencies += "org.apache.spark" %% "spark-streaming-flume-sink" % "2.0.1"
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion
resolvers += "softprops-maven" at "http://dl.bintray.com/content/softprops/maven"
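
Side note: one detail I am not sure about while porting is that the 1.6
build pulled spark-streaming-flume, while this one pulls
spark-streaming-flume-sink; if the Flume example uses the push-based
receiver it may still need the former, which as far as I can tell is also
published for 2.0.1:

libraryDependencies += "org.apache.spark" %% "spark-streaming-flume" % sparkVersion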

but the sbt compile fails miserably... below are a few of the errors (it
actually compiles as if I forgot all the dependencies, as it is complaining
about all the org.apache.spark.ml and mllib packages):

[warn] Multiple dependencies with the same organization/name but different
versions. To avoid conflict, pick one version:
[warn]  * org.apache.spark:spark-core_2.10:(1.6.1, 2.0.1)
[warn]  * org.apache.spark:spark-streaming_2.10:(1.6.1, 2.0.1)
[warn]  * org.apache.spark:spark-sql_2.10:(1.6.1, 2.0.1)
[warn]  * org.apache.spark:spark-mllib_2.10:(1.6.1, 2.0.1)
[info] Resolving org.scala-lang#scala-library;2.10.6 ...
.
[warn] * org.apache.spark:spark-mllib_2.10:1.6.1 -> 2.0.1
[warn] * org.apache.spark:spark-sql_2.10:1.6.1 -> 2.0.1
[warn] * org.apache.spark:spark-streaming_2.10:1.6.1 -> 2.0.1
[warn] * org.apache.spark:spark-core_2.10:1.6.1 -> 2.0.1
[warn] Run 'evicted' to see detailed eviction warnings
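
The odd part: this build.sbt only declares 2.0.1, yet the warnings above
mention 1.6.1 as well, so something else in the build must still be
contributing the old versions (sbt merges every *.sbt file in the base
directory). A minimal sketch, sbt 0.13 syntax, to pin the 2.0.1 versions
while hunting that down:

dependencyOverrides ++= Set(
  "org.apache.spark" %% "spark-core"      % "2.0.1",
  "org.apache.spark" %% "spark-streaming" % "2.0.1",
  "org.apache.spark" %% "spark-sql"       % "2.0.1",
  "org.apache.spark" %% "spark-mllib"     % "2.0.1"
)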
[info] Compiling 18 Scala sources to
C:\Users\marco\SparkExamples\target\scala-2.10\classes...
[error]
C:\Users\marco\SparkExamples\src\main\scala\AnotherDecisionTreeExample.scala:2:
object mllib is not a member of package org.apache.spark
[error] import org.apache.spark.mllib.linalg.{ Vector, Vectors }
[error] ^
[error]
C:\Users\marco\SparkExamples\src\main\scala\AnotherDecisionTreeExample.scala:3:
object mllib is not a member of package org.apache.spark
[error] import org.apache.spark.mllib.regression.LabeledPoint
[error] ^
[error]
C:\Users\marco\SparkExamples\src\main\scala\AnotherDecisionTreeExample.scala:4:
object classification is not a member of package org.apache.spark.ml
[error] import org.apache.spark.ml.classification._
[error]^
[error]
C:\Users\marco\SparkExamples\src\main\scala\AnotherDecisionTreeExample.scala:5:
object mllib is not a member of package org.apache.spark
[error] import org.apache.spark.mllib.tree.DecisionTree
[error] ^
[error]
C:\Users\marco\SparkExamples\src\main\scala\AnotherDecisionTreeExample.scala:6:
object mllib is not a member of package org.apache.spark
[error] import org.apache.spark.mllib.tree.model.DecisionTreeModel
[error] ^
[error]
C:\Users\marco\SparkExamples\src\main\scala\AnotherDecisionTreeExample.scala:7:
object mllib is not a member of package org.apache.spark
[error] import org.apache.spark.mllib.util.MLUtils
[error] ^
[error]
C:\Users\marco\SparkExamples\src\main\scala\AnotherDecisionTreeExample.scala:9:
object mllib is not a member of package org.apache.spark
[error] import org.apache.spark.mllib.stat.{
MultivariateStatisticalSummary, Statistics }
[error] ^
[error]
C:\Users\marco\SparkExamples\src\main\scala\AnotherDecisionTreeExample.scala:10:
object mllib is not a member of package org.apache.spark
[error] import