I've got a (dirty) use case where I have an existing Spark batch job that produces output I would like to feed into my Beam pipeline (assuming it runs on the SparkRunner). I was trying to run it all as one job (the output is reduced, so it's not big data and it's fine to do something like Create.of(rdd.collect())), but that fails because of the two separate Spark contexts. Is it possible to build the Beam pipeline on an existing Spark context?
thx, Antony
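For concreteness, here is a minimal sketch of what I'm attempting, assuming the Beam Java SDK with the Spark runner on the classpath; the class name, app name and placeholder data are hypothetical stand-ins for the real batch job:

import java.util.Arrays;
import java.util.List;

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkThenBeam {
  public static void main(String[] args) {
    // Existing Spark batch job: creates its own SparkContext.
    JavaSparkContext sc = new JavaSparkContext("local[*]", "existing-batch-job");
    JavaRDD<String> jobOutput =
        sc.parallelize(Arrays.asList("a", "b", "c"));  // stands in for the real job's output

    // The output is small after the reduction, so collecting it to the driver is acceptable.
    List<String> collected = jobOutput.collect();

    // Beam pipeline on the SparkRunner: running it in the same JVM makes the runner
    // try to create a second SparkContext, which is where it currently fails.
    Pipeline p = Pipeline.create(
        PipelineOptionsFactory.fromArgs("--runner=SparkRunner").create());
    PCollection<String> input = p.apply(Create.of(collected));
    // ... rest of the Beam pipeline applied to 'input' ...
    p.run().waitUntilFinish();
  }
}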
