Hi,
I looked at the links you sent me, but I haven't found any clue about how
to adapt them to my current code.
My code is very simple:
import scala.util.Random

val sc = spark.sparkContext
val productsNum = 100000
println(s"Saving $productsNum products RDD to the space")
val rdd = sc.parallelize(1 to productsNum).map { i =>
  Product(i, "Description of product " + i, Random.nextInt(10),
    Random.nextBoolean())
}
Is it really that simple to use Beam instead of SparkContext? I'm not
familiar with Spark at all, so I have no idea what the Spark runner is or
how I can use it in my case; I just need to make it work :).
Thanks Tal
On Tue, Sep 26, 2017 at 11:57 AM, Aviem Zur <[email protected]> wrote:
> Hi Tal,
>
> Thanks for reaching out!
>
> Please take a look at our documentation:
>
> Quickstart guide (Java): https://beam.apache.org/get-started/quickstart-
> java/
> This guide will show you how to run our wordcount example using any of
> the runners (for example, the direct runner, or the Spark runner in your case).
>
> More reading:
> Programming guide: https://beam.apache.org/documentation/programming-
> guide/
> Spark runner: https://beam.apache.org/documentation/runners/spark/
>
> Please let us know if you have further questions, and good luck with your
> first try of Beam!
>
> Aviem.
>
> On Tue, Sep 26, 2017 at 11:47 AM tal m <[email protected]> wrote:
>
>> Hi,
>> I'm new to Spark and also to Beam.
>> Currently I have Java code that uses Spark for reading some data from a DB.
>> My Spark code uses SparkSession.builder(.....) and also sparkContext.
>> How can I make Beam work similarly to my current code? I just want to
>> make it work for now.
>> Thanks Tal
>>
>