Hi,

You can write code in Beam that will run on Spark, but Beam is not a generic
wrapper for existing Spark code: it cannot wrap a program built directly on
JavaSparkContext.

Beam provides a higher-level abstraction that lets you write a pipeline once
and target different execution engines, for example the same code can run on
Spark and on Flink.
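To make that concrete, here is a minimal word-count-style pipeline sketch. It
assumes the Beam Java SDK is on your classpath, and the file names (input.txt,
counts) are just placeholders; the important part is that nothing in the code
mentions Spark, the runner is picked via the pipeline options.

```java
import java.util.Arrays;

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.transforms.FlatMapElements;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptors;

public class MinimalWordCount {
  public static void main(String[] args) {
    // The runner (DirectRunner, SparkRunner, FlinkRunner, ...) is chosen
    // from the command line, e.g. --runner=SparkRunner. The pipeline code
    // itself stays engine-agnostic.
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
    Pipeline p = Pipeline.create(options);

    p.apply(TextIO.read().from("input.txt"))
        // Split each line into words.
        .apply(FlatMapElements.into(TypeDescriptors.strings())
            .via(line -> Arrays.asList(line.split("\\s+"))))
        // Count occurrences of each word.
        .apply(Count.perElement())
        // Format each (word, count) pair as a line of text.
        .apply(MapElements.into(TypeDescriptors.strings())
            .via(kv -> kv.getKey() + ": " + kv.getValue()))
        .apply(TextIO.write().to("counts"));

    p.run().waitUntilFinish();
  }
}
```

So rather than wrapping your JavaSparkContext code, you would re-express the
transformations in Beam's model and let the Spark runner translate them.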

I'd suggest scanning the quick-start [1], looking at the Spark word count
example [2] and reading the programming guide [3] - you should be able to
run some quick tests in your favourite IDE and then run the same code on
Spark by using the Spark runner.
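Following the quickstart, the switch between a local test and Spark is just a
different runner flag and Maven profile. The class name and arguments below
come from the word count example; your own main class and paths will differ.

```shell
# Quick local test with the direct runner:
mvn compile exec:java \
    -Dexec.mainClass=org.apache.beam.examples.WordCount \
    -Dexec.args="--inputFile=pom.xml --output=counts" \
    -Pdirect-runner

# The same pipeline on Spark, via the Spark runner:
mvn compile exec:java \
    -Dexec.mainClass=org.apache.beam.examples.WordCount \
    -Dexec.args="--runner=SparkRunner --inputFile=pom.xml --output=counts" \
    -Pspark-runner
```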

I hope this helps,
Tim

[1] https://beam.apache.org/get-started/quickstart-java/
[2] https://beam.apache.org/get-started/wordcount-example/
[3] https://beam.apache.org/documentation/programming-guide/

On Sun, Oct 8, 2017 at 4:15 PM, tal m <[email protected]> wrote:

> hi,
> my code use JavaSparkContext can i wrap it with beam ?
> can someone can add a sample code?
> Thanks
>
