Also look at having backpressure enabled. Both of these settings can be used to limit the ingest rate.
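For the direct Kafka approach these are typically set on the SparkConf; a minimal sketch (the config keys are standard Spark Streaming settings, the app name and rate value are illustrative):

```scala
import org.apache.spark.SparkConf

// Illustrative sketch: cap the per-partition pull rate and let the
// backpressure controller adapt ingest to actual processing speed.
val conf = new SparkConf()
  .setAppName("rate-limited-stream") // hypothetical app name
  // hard upper bound: max records pulled per Kafka partition per second
  .set("spark.streaming.kafka.maxRatePerPartition", "1000")
  // let Spark adjust the receive rate based on recent batch durations
  .set("spark.streaming.backpressure.enabled", "true")
```

With both set, backpressure tunes the rate dynamically while `maxRatePerPartition` acts as a ceiling for the first batches and for any spike the controller has not yet reacted to.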
> On May 10, 2016, at 8:02 AM, chandan prakash
> wrote:
>
> Hi,
> I am using Spark Streaming with Direct kafka approach.
> Want to limit number of event records coming in my batches.
> Have ques
I think before doing a code update you would want to gracefully shut down your
streaming job and checkpoint the processed offsets (and any state that you
maintain) in a database or HDFS.
When you start the job back up, it should read this checkpoint, rebuild the
necessary state, and begin processing.
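The offset-tracking part of that approach can be sketched as follows, assuming `ssc` is the running StreamingContext, `directStream` is a direct Kafka stream, and `process`/`saveOffsets` are your own (hypothetical) processing and persistence routines:

```scala
import org.apache.spark.streaming.kafka.HasOffsetRanges

// Sketch: record the offset ranges of each batch after it is processed,
// so a restart can resume from the last committed position.
directStream.foreachRDD { rdd =>
  val offsets = rdd.asInstanceOf[HasOffsetRanges].offsetRanges
  process(rdd)         // hypothetical: your batch processing logic
  saveOffsets(offsets) // hypothetical: persist topic/partition/untilOffset to DB or HDFS
}

// On a planned code update: let in-flight batches finish before stopping.
ssc.stop(stopSparkContext = true, stopGracefully = true)
```

On restart you would read the saved offsets and pass them to `KafkaUtils.createDirectStream` via its `fromOffsets` parameter, so processing resumes exactly where the old job left off.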
Hi,
I am new to scala/graphx and am having problems converting a tsv file to a
graph. I have a flat tab separated file like below:
n1 P1 n2
n3 P1 n4
n2 P2 n3
n3 P2 n1
n1 P3 n4
n3 P3 n2
where n1, n2, n3, n4 are the nodes of the graph and P1, P2, P3 are the
properties which should form the edges b
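One way to build such a graph is to assign each node name a Long vertex id and carry the property as the edge attribute. A sketch, assuming the file is tab-separated, `sc` is a SparkContext, and the path is illustrative:

```scala
import org.apache.spark.graphx.{Edge, Graph}
import org.apache.spark.rdd.RDD

// Each line is: sourceNode <TAB> property <TAB> destinationNode
val triples = sc.textFile("edges.tsv") // hypothetical path
  .map(_.split("\t"))
  .map { case Array(src, prop, dst) => (src, prop, dst) }

// GraphX needs Long vertex ids, so index the distinct node names.
val vertices: RDD[(Long, String)] = triples
  .flatMap { case (s, _, d) => Seq(s, d) }
  .distinct()
  .zipWithIndex()
  .map { case (name, id) => (id, name) }

// Small lookup table name -> id (fine for modest vertex counts;
// use a join instead if the vertex set is large).
val nameToId = vertices.map { case (id, name) => (name, id) }.collectAsMap()

// The property string becomes the edge attribute.
val edges: RDD[Edge[String]] = triples.map { case (s, p, d) =>
  Edge(nameToId(s), nameToId(d), p)
}

val graph: Graph[String, String] = Graph(vertices, edges)
```

The `collectAsMap` step pulls the name-to-id table to the driver, which keeps the sketch simple; for large graphs the idiomatic alternative is to join the triples against the vertex RDD twice, once for each endpoint.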