Please take a look at:
sql/core/src/test/scala/org/apache/spark/sql/DatasetSuite.scala

    val ds1 = Seq(1, 2, 3).toDS().as("a")
    val ds2 = Seq(1, 2).toDS().as("b")

    checkAnswer(
      ds1.joinWith(ds2, $"a.value" === $"b.value", "inner"),
      (1, 1), (2, 2))

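Applied to your Pair example, here is a rough sketch of the same idea (the sample data and the spark session name are made up); the key points are aliasing each Dataset with as(...) and building the condition with === rather than ==:

    import org.apache.spark.sql.Dataset
    import spark.implicits._

    case class Pair(x: Long, y: Long)

    // Alias each Dataset so its columns can be referenced without toDF()
    val a = Seq(Pair(1L, 10L), Pair(2L, 20L)).toDS().as("a")
    val b = Seq(Pair(10L, 1L), Pair(30L, 3L)).toDS().as("b")

    // $"a.x" and $"b.y" resolve against the aliases; === produces a Column condition
    val joined: Dataset[(Pair, Pair)] = a.joinWith(b, $"a.x" === $"b.y", "inner")

    joined.show()
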
On Tue, Feb 9, 2016 at 7:07 AM, Raghava Mutharaju <[email protected]> wrote:

> Hello All,
>
> The joinWith() method on Dataset takes a condition of type Column. Without
> converting a Dataset to a DataFrame, how can we get a specific column?
>
> For example: case class Pair(x: Long, y: Long)
>
> A and B are Datasets of type Pair, and I want to join A.x with B.y
>
> A.joinWith(B, A.toDF().col("x") == B.toDF().col("y"))
>
> Is there a way to avoid using toDF()?
>
> I am having similar issues with the usage of filter(A.x == B.y)
>
> --
> Regards,
> Raghava
>