Hello all - I have a barebones word_count script that runs locally but I don't have a clue how to run this using SparkRunner.
For example:

Local:
    mvn exec:java -Dexec.mainClass="com.apache.beam.learning.WordCount" -Dexec.args="--runner=DirectRunner"
Spark:
    mvn exec:java -Dexec.mainClass="com.apache.beam.learning.WordCount" -Dexec.args="--runner=SparkRunner"

My code takes the runner from the command-line args:

    PipelineOptions opts = PipelineOptionsFactory.fromArgs(args).create();

I have a local Spark cluster, but what additional parameters need to be given to make the Beam code run on Spark? (Sorry, the documentation for this use case seems sparse, or perhaps I overlooked it?) Thank you for your help.

--
Mahesh Vangala
(Ph) 443-326-1957
(web) mvangala.com <http://mvangala.com>
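For reference, a minimal sketch of one way this is commonly invoked, assuming a pom set up like the Beam quickstart archetype (the `spark-runner` Maven profile name and the `spark://localhost:7077` master URL below are assumptions about the project and cluster, not taken from the message above):

```shell
# Spark's embedded local mode via the Beam SparkRunner
# (assumes a spark-runner profile in the pom that pulls in the
# beam-runners-spark dependency, as in the Beam quickstart archetype)
mvn exec:java -Pspark-runner \
  -Dexec.mainClass="com.apache.beam.learning.WordCount" \
  -Dexec.args="--runner=SparkRunner --sparkMaster=local[4]"

# Against a standalone Spark master (hypothetical host/port):
mvn exec:java -Pspark-runner \
  -Dexec.mainClass="com.apache.beam.learning.WordCount" \
  -Dexec.args="--runner=SparkRunner --sparkMaster=spark://localhost:7077"
```

The key additions over the DirectRunner invocation are making the Spark runner dependency available on the classpath (here via a Maven profile) and passing `--sparkMaster` so the SparkRunner knows which cluster to submit to.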
