I changed the import in the sample from

    import org.apache.spark.mllib.linalg.*;

to

    import org.apache.spark.ml.linalg.*;

and the sample now runs.
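
For reference, here is a minimal sketch of what the working code looks like after that change, assuming Spark 2.0 (the class name and data values below are just illustrative, not the actual example source):

    import java.util.Arrays;
    import java.util.List;

    import org.apache.spark.ml.linalg.VectorUDT;
    import org.apache.spark.ml.linalg.Vectors;
    import org.apache.spark.ml.regression.AFTSurvivalRegression;
    import org.apache.spark.ml.regression.AFTSurvivalRegressionModel;
    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.RowFactory;
    import org.apache.spark.sql.SparkSession;
    import org.apache.spark.sql.types.DataTypes;
    import org.apache.spark.sql.types.Metadata;
    import org.apache.spark.sql.types.StructField;
    import org.apache.spark.sql.types.StructType;

    public class AFTSurvivalSketch {
      public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
            .appName("AFTSurvivalSketch").getOrCreate();

        // label, censor (1.0 = event observed, 0.0 = censored), features
        List<Row> data = Arrays.asList(
            RowFactory.create(1.218, 1.0, Vectors.dense(1.560, -0.605)),
            RowFactory.create(2.949, 0.0, Vectors.dense(0.346, 2.158)),
            RowFactory.create(3.627, 0.0, Vectors.dense(1.380, 0.231)),
            RowFactory.create(0.273, 1.0, Vectors.dense(0.520, 1.151)),
            RowFactory.create(4.199, 0.0, Vectors.dense(0.795, -0.226)));
        StructType schema = new StructType(new StructField[]{
            new StructField("label", DataTypes.DoubleType, false, Metadata.empty()),
            new StructField("censor", DataTypes.DoubleType, false, Metadata.empty()),
            // ml.linalg.VectorUDT (not mllib.linalg) is what the schema check expects
            new StructField("features", new VectorUDT(), false, Metadata.empty())
        });
        Dataset<Row> training = spark.createDataFrame(data, schema);

        // Fails with the IllegalArgumentException quoted below if the column is the
        // old mllib.linalg.VectorUDT instead.
        AFTSurvivalRegressionModel model = new AFTSurvivalRegression().fit(training);
        System.out.println("Coefficients: " + model.coefficients()
            + " Intercept: " + model.intercept());

        spark.stop();
      }
    }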

   Thanks
     Bob


On Wed, Jul 27, 2016 at 1:33 PM, Robert Goodman <bsg...@gmail.com> wrote:
> I tried to run the JavaAFTSurvivalRegressionExample on Spark 2.0 and the
> example doesn't work. It looks like the problem is that the example uses the
> MLlib Vector/VectorUDT to create the Dataset, which would need to be
> converted with MLUtils before being passed to the model. I haven't actually
> tried this yet.
>
> When I run the example (/bin/run-example
> ml.JavaAFTSurvivalRegressionExample), I get the following stack trace
>
> Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Column features must be of type org.apache.spark.ml.linalg.VectorUDT@3bfc3ba7 but was actually org.apache.spark.mllib.linalg.VectorUDT@f71b0bce.
> at scala.Predef$.require(Predef.scala:224)
> at org.apache.spark.ml.util.SchemaUtils$.checkColumnType(SchemaUtils.scala:42)
> at org.apache.spark.ml.regression.AFTSurvivalRegressionParams$class.validateAndTransformSchema(AFTSurvivalRegression.scala:106)
> at org.apache.spark.ml.regression.AFTSurvivalRegression.validateAndTransformSchema(AFTSurvivalRegression.scala:126)
> at org.apache.spark.ml.regression.AFTSurvivalRegression.fit(AFTSurvivalRegression.scala:199)
> at org.apache.spark.examples.ml.JavaAFTSurvivalRegressionExample.main(JavaAFTSurvivalRegressionExample.java:67)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
>
> Are you supposed to be able to use the ML version of VectorUDT? The Spark 2.0
> API docs for Java don't show the class, but I was able to import the class
> into a Java program.
>
>    Thanks
>      Bob
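
For what it's worth, the conversion route the original message mentions also exists in Spark 2.0: MLUtils.convertVectorColumnsToML rewrites an mllib-typed vector column into the new ml.linalg type. A rough sketch (the class, method, and variable names here are mine, not from the example):

    import org.apache.spark.mllib.util.MLUtils;
    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;

    public class VectorColumnConversion {
      // Convert the "features" column from org.apache.spark.mllib.linalg.Vector to
      // org.apache.spark.ml.linalg.Vector; other columns are left as-is. The result
      // can then be passed to AFTSurvivalRegression.fit() without the schema error
      // in the stack trace above.
      public static Dataset<Row> toMlVectors(Dataset<Row> oldStyleDF) {
        return MLUtils.convertVectorColumnsToML(oldStyleDF, "features");
      }
    }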

