Re: Getting "java.lang.NoSuchFieldError: TRUNCATE_SIZED_RESTRICTION" in Flink cluster

2020-10-06 Thread Praveen K Viswanathan
l/12708 which will be available in > 2.25.0 release (this release is currently underway). > > On Mon, Oct 5, 2020 at 11:28 AM Praveen K Viswanathan < > harish.prav...@gmail.com> wrote: > >> Thanks for sharing the tool Tomo. I will give it a try and let you know. >> >

Re: Getting "java.lang.NoSuchFieldError: TRUNCATE_SIZED_RESTRICTION" in Flink cluster

2020-10-05 Thread Praveen K Viswanathan
it > a try? > > https://github.com/GoogleCloudPlatform/cloud-opensource-java/wiki/Linkage-Checker-Enforcer-Rule > > Regards, > Tomo > > On Fri, Oct 2, 2020 at 21:34 Praveen K Viswanathan < > harish.prav...@gmail.com> wrote: > >> Hi - We have a beam pi

Re: Getting "java.lang.NoSuchFieldError: TRUNCATE_SIZED_RESTRICTION" in Flink cluster

2020-10-05 Thread Praveen K Viswanathan
ollectionView > 2020-10-03 00:42:31,115 INFO > org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator - | > leaveCompositeTransform- View.AsList > 2020-10-03 00:42:31,116 INFO > org.apache.beam.runners.flink.FlinkStreamingPipelineTranslator - | > enterCompositeTransform-

Getting "java.lang.NoSuchFieldError: TRUNCATE_SIZED_RESTRICTION" in Flink cluster

2020-10-02 Thread Praveen K Viswanathan
a profile for flink-runner as shown below: a <profile> with <id>flink-runner</id> containing the dependency <groupId>org.apache.beam</groupId> <artifactId>beam-runners-flink-1.10</artifactId> <version>2.21.0</version>. And the docker image has below flink version: FROM flink:1.10.0-scala_2.12. Both our pipeline and SDF based IO connector are on Beam 2.23.0 version. Appreciate if you can guide us on what is causing this exception. -- Thanks, Praveen K Viswanathan

Re: Output from Window not getting materialized

2020-09-17 Thread Praveen K Viswanathan
> // write PCollection<...> to stream >> .apply(MyIO.write() >> .withStream(outputStream) >> .withConsumerConfig(config)); >> Without the window transform, we can read from the stream and write to it, however, I don’t see output after the Window transform. Could you please help pin down the issue? >> Thank you, >> Gaurav > -- Thanks, Praveen K Viswanathan

Re: Output from Window not getting materialized

2020-09-17 Thread Praveen K Viswanathan
te() >> .withStream(outputStream) >> .withConsumerConfig(config)); >> Without the window transform, we can read from the stream and write to it, however, I don’t see output after the Window transform. Could you please help pin down the issue? >> Thank you, >> Gaurav > -- Thanks, Praveen K Viswanathan
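
A common reason a Window transform appears to produce no output on an unbounded source is that nothing fires downstream until the trigger does. Below is a minimal sketch of making the windowing and trigger explicit; the input name and the one-minute/30-second durations are illustrative, not taken from the thread.

import org.apache.beam.sdk.transforms.windowing.AfterProcessingTime;
import org.apache.beam.sdk.transforms.windowing.AfterWatermark;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;

// Put the unbounded input into one-minute fixed windows and fire early panes
// every 30 seconds of processing time, so results materialize before the
// watermark closes each window.
PCollection<KV<String, String>> windowed = input
    .apply("FixedWindows",
        Window.<KV<String, String>>into(FixedWindows.of(Duration.standardMinutes(1)))
            .triggering(AfterWatermark.pastEndOfWindow()
                .withEarlyFirings(AfterProcessingTime.pastFirstElementInPane()
                    .plusDelayOf(Duration.standardSeconds(30))))
            .withAllowedLateness(Duration.standardMinutes(5))
            .discardingFiredPanes());

If the write only happens after a GroupByKey or similar aggregation, checking that the trigger actually fires (for example with early firings as above) is usually the first thing to verify.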

Re: Conditional branching during pipeline execution time

2020-07-08 Thread Praveen K Viswanathan
No worries, the error was due to the DoFn element not being defined as an instance of SchemaCoder. On Tue, Jul 7, 2020 at 10:46 PM Praveen K Viswanathan < harish.prav...@gmail.com> wrote: > Thanks Luke. I changed the pipeline structure as shown below. I could see > that this flow would
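
For readers hitting the same coder issue: one way to give an element type a SchemaCoder is to let Beam infer a schema from its fields. A small sketch follows, where the Order class and its fields are purely illustrative.

import org.apache.beam.sdk.schemas.JavaFieldSchema;
import org.apache.beam.sdk.schemas.annotations.DefaultSchema;

// With this annotation Beam infers a schema from the public fields and uses a
// SchemaCoder for PCollections of this type, so ParDos emitting it no longer
// fail coder inference.
@DefaultSchema(JavaFieldSchema.class)
public class Order {
  public String id;
  public double amount;
}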

Re: Conditional branching during pipeline execution time

2020-07-07 Thread Praveen K Viswanathan
(Decider) -outA-> PCollection -> ParDo(DoFnA) > \outB-> PCollection -> ParDo(DoFnB) > > See [1] for how to create a DoFn with multiple outputs. > > 1: https://beam.apache.org/documentation/pipelines/design-your-pipeline/#a-single
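
A minimal sketch of that multi-output pattern, with hypothetical element types and a placeholder startsWith condition standing in for the real decision logic:

import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.PCollectionTuple;
import org.apache.beam.sdk.values.TupleTag;
import org.apache.beam.sdk.values.TupleTagList;

final TupleTag<String> outA = new TupleTag<String>() {};
final TupleTag<String> outB = new TupleTag<String>() {};

// One ParDo decides per element which branch it belongs to and tags it.
PCollectionTuple branches = input.apply("Decider",
    ParDo.of(new DoFn<String, String>() {
      @ProcessElement
      public void processElement(@Element String element, MultiOutputReceiver out) {
        if (element.startsWith("A")) {        // placeholder condition
          out.get(outA).output(element);
        } else {
          out.get(outB).output(element);
        }
      }
    }).withOutputTags(outA, TupleTagList.of(outB)));

// Each branch is then processed by its own DoFn.
PCollection<String> branchA = branches.get(outA);
PCollection<String> branchB = branches.get(outB);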

Conditional branching during pipeline execution time

2020-07-07 Thread Praveen K Viswanathan
ching case. -- Thanks, Praveen K Viswanathan

Re: DoFn with SideInput

2020-06-29 Thread Praveen K Viswanathan
t; 1: https://beam.apache.org/documentation/programming-guide/#side-inputs > 2: https://beam.apache.org/documentation/patterns/side-inputs/ > > On Sun, Jun 28, 2020 at 6:24 PM Praveen K Viswanathan < > harish.prav...@gmail.com> wrote: > >> >> Hi All - I am facing an is
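
A minimal sketch of the map-valued side input pattern those links describe; lookupData, mainInput, and the String types are placeholders:

import java.util.Map;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.transforms.View;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.PCollectionView;

// Materialize a keyed PCollection (KV<String, String>) as a map-valued side input.
final PCollectionView<Map<String, String>> lookupView =
    lookupData.apply("AsMap", View.<String, String>asMap());

PCollection<String> enriched = mainInput.apply("Enrich",
    ParDo.of(new DoFn<String, String>() {
      @ProcessElement
      public void processElement(ProcessContext c) {
        // The side input is read per element from the runner-provided view.
        Map<String, String> lookup = c.sideInput(lookupView);
        String value = lookup.getOrDefault(c.element(), "unknown");
        c.output(c.element() + ":" + value);
      }
    }).withSideInputs(lookupView));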

DoFn with SideInput

2020-06-28 Thread Praveen K Viswanathan
@ProcessElement public void processElement(@Element Map>> input, OutputReceiver out) { * Log.of("UpdateFn " + input);* out.output(new CustomObject()); } } -- Thanks, Praveen K Viswanathan

Re: Caching data inside DoFn

2020-06-26 Thread Praveen K Viswanathan
parallelism > during processing (for small pipelines this likely won't matter). > > On Fri, Jun 26, 2020 at 7:29 AM Praveen K Viswanathan < > harish.prav...@gmail.com> wrote: > >> Hi All - I have a DoFn which generates data (KV pair) for each element >> that it i

Caching data inside DoFn

2020-06-26 Thread Praveen K Viswanathan
e value from KV if there is a match in the key -- Thanks, Praveen K Viswanathan
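
The question above is truncated in the archive, but for the general pattern of reusing lookup data across elements, one option is a per-instance cache built in @Setup. A sketch with illustrative names; how the reference data is loaded is left as a placeholder:

import java.util.HashMap;
import java.util.Map;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.values.KV;

// One cache per DoFn instance: created once in @Setup and reused by every
// bundle that instance processes, so repeated lookups for the same key stay
// in memory rather than being recomputed.
public class CachedLookupFn extends DoFn<KV<String, String>, String> {
  private transient Map<String, String> cache;

  @Setup
  public void setup() {
    cache = new HashMap<>();
    // Placeholder: pre-load reference data here if it comes from a file or service.
  }

  @ProcessElement
  public void processElement(@Element KV<String, String> element, OutputReceiver<String> out) {
    String cached = cache.get(element.getKey());
    if (cached != null) {
      out.output(cached);                       // key matched: emit the cached value
    } else {
      cache.put(element.getKey(), element.getValue());
      out.output(element.getValue());
    }
  }
}

A side input (for example View.asMap over the lookup PCollection) is the other common choice when the lookup data is itself produced by the pipeline.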

Unable to commit offset using KafkaIO

2020-06-24 Thread Praveen K Viswanathan
sasl.jaas.config", "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"user\" password=\"pwd\";"; )) .withConsumerConfigUpdates(ImmutableMap.of( ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, true )) -- Thanks, Praveen K Viswanathan

Re: Designing an existing pipeline in Beam

2020-06-24 Thread Praveen K Viswanathan
n / Transform instance multiple times in the > graph or you can follow regular development practices where the common code > is factored into a method and two different DoFn's invoke it. > > On Tue, Jun 23, 2020 at 4:28 PM Praveen K Viswanathan < > harish.prav...@gmail.com&g
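
A small illustrative sketch of that second suggestion, shared logic factored into one method that two DoFns call; the class and method names are made up for the example:

import org.apache.beam.sdk.transforms.DoFn;

// The shared business logic lives in one place...
class Common {
  static String normalize(String raw) {
    return raw.trim().toLowerCase();    // placeholder for the real shared logic
  }
}

// ...and each DoFn in the graph calls it instead of duplicating it.
class FirstStageFn extends DoFn<String, String> {
  @ProcessElement
  public void processElement(@Element String element, OutputReceiver<String> out) {
    out.output("first:" + Common.normalize(element));
  }
}

class SecondStageFn extends DoFn<String, String> {
  @ProcessElement
  public void processElement(@Element String element, OutputReceiver<String> out) {
    out.output("second:" + Common.normalize(element));
  }
}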

Re: Designing an existing pipeline in Beam

2020-06-23 Thread Praveen K Viswanathan
create a cycle though. > > On Mon, Jun 22, 2020 at 6:02 PM Praveen K Viswanathan < > harish.prav...@gmail.com> wrote: > >> Another way to put this question is, how do we write a beam pipeline for >> an existing pipeline (in Java) that has a dozen of custom obje

Re: Designing an existing pipeline in Beam

2020-06-22 Thread Praveen K Viswanathan
objects, getters and setters and HashMap *but inside a DoFn*. Is this the optimal way or does Beam offer something else? On Mon, Jun 22, 2020 at 3:47 PM Praveen K Viswanathan < harish.prav...@gmail.com> wrote: > Hi Luke, > > We can say Map 2 as a kind of a template using which you

Re: Designing an existing pipeline in Beam

2020-06-22 Thread Praveen K Viswanathan
, 2020 at 8:23 AM Luke Cwik wrote: > Who reads map 1? > Can it be stale? > > It is unclear what you are trying to do in parallel and why you wouldn't > stick all this logic into a single DoFn / stateful DoFn. > > On Sat, Jun 20, 2020 at 7:14 PM Praveen K Viswanathan < >

Designing an existing pipeline in Beam

2020-06-20 Thread Praveen K Viswanathan
. These are my initial thoughts/questions and I would like to get some expert advice on how we typically design such an interleaved transformation in Apache Beam. Appreciate your valuable insights on this. -- Thanks, Praveen K Viswanathan

Re: Making RPCs in Beam

2020-06-19 Thread Praveen K Viswanathan
/transforms/java/aggregation/groupintobatches/ > > On Fri, Jun 19, 2020 at 8:20 AM Brian Hulette wrote: > >> Kenn wrote a blog post showing how to do batched RPCs with the state and >> timer APIs: https://beam.apache.org/blog/timely-processing/ >> >> Is that helpful? >> &
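
A minimal sketch of GroupIntoBatches for the batched-RPC case the links above describe; keyedInput, the batch size of 100, and rpcClient are all placeholders:

import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.GroupIntoBatches;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;

// Collect up to 100 values per key into one batch, then make a single RPC
// per batch instead of one call per element.
PCollection<KV<String, Iterable<String>>> batches =
    keyedInput.apply("Batch", GroupIntoBatches.<String, String>ofSize(100));

batches.apply("BatchedRpc", ParDo.of(
    new DoFn<KV<String, Iterable<String>>, String>() {
      @ProcessElement
      public void processElement(
          @Element KV<String, Iterable<String>> batch, OutputReceiver<String> out) {
        // Placeholder: one request for the whole batch, e.g.
        // for (String response : rpcClient.lookupAll(batch.getValue())) {
        //   out.output(response);
        // }
      }
    }));

For the single-RPC-per-element case in the same pipeline, a plain DoFn that opens its client in @Setup and calls it in @ProcessElement is the usual counterpart.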

Re: Making RPCs in Beam

2020-06-19 Thread Praveen K Viswanathan
; Is that helpful? > > Brian > > On Thu, Jun 18, 2020 at 5:29 PM Praveen K Viswanathan < > harish.prav...@gmail.com> wrote: > >> Hello Everyone, >> >> In my pipeline I have to make a *single RPC call* as well as a *Batched >> RPC call* to fetch data for e

Making RPCs in Beam

2020-06-18 Thread Praveen K Viswanathan
and could share a sample code or details on how to do this. -- Thanks, Praveen K Viswanathan

Testing 'Exactly-Once' using Apache Beam + Spark Runner

2020-05-03 Thread Praveen K Viswanathan
-- Thanks, Praveen K Viswanathan