From the docs
<https://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.package@DataFrame=org.apache.spark.sql.Dataset[org.apache.spark.sql.Row]>,
DataFrame is just Dataset[Row]. There are various converters for subtypes of
Product if you want, using "as[T]", where T <: Product and an implicit
Encoder is in scope, I believe.
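
In Java the same idea looks roughly like the sketch below: pass an Encoder to
Dataset.as() to go from Dataset<Row> to a typed Dataset. The Person bean, its
fields, and the "people.json" path are made up for illustration.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Encoders;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class DataFrameToDataset {
        // Hypothetical Java bean matching the DataFrame's columns.
        public static class Person implements java.io.Serializable {
            private String name;
            private long age;
            public String getName() { return name; }
            public void setName(String name) { this.name = name; }
            public long getAge() { return age; }
            public void setAge(long age) { this.age = age; }
        }

        public static void main(String[] args) {
            SparkSession spark =
                SparkSession.builder().appName("df-to-ds").getOrCreate();

            // In Java, a DataFrame is just Dataset<Row>.
            Dataset<Row> df = spark.read().json("people.json");

            // Convert to a typed Dataset by supplying an Encoder for the bean.
            Dataset<Person> ds = df.as(Encoders.bean(Person.class));

            ds.show();
            spark.stop();
        }
    }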

Also, this is probably a user list question.


On Thu, Aug 18, 2016 at 10:59 AM Minudika Malshan <minudika...@gmail.com>
wrote:

> Hi all,
>
> Most of Spark ML algorithms requires a dataset to train the model.
> I would like to know how to convert a spark *data-frame* to a *dataset*
> using Java.
> Your support is much appreciated.
>
> Thank you!
> Minudika
>
