Hi guys,
I'm using Kafka + Spark Streaming to do log analysis.
My requirement is that the log alarm rules may change from time to
time.
A rule might look like this:
App=Hadoop,keywords=oom|Exception|error,threshold=10
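
For context, this is roughly how I parse a rule line into a case class
(LogRule and parseRule are just names I made up for this sketch):

case class LogRule(app: String, keywords: Seq[String], threshold: Int)

// Parse a line like "App=Hadoop,keywords=oom|Exception|error,threshold=10"
def parseRule(line: String): LogRule = {
  val fields = line.split(",").map { kv =>
    val Array(k, v) = kv.split("=", 2)
    k -> v
  }.toMap
  LogRule(
    app       = fields("App"),
    keywords  = fields("keywords").split("\\|").toSeq,
    threshold = fields("threshold").toInt
  )
}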
The threshold or keywords may be updated at any time.
What I currently do is:
1. Store the log rules in a Map[app,logrule] variable, defined as a
static member.
2. Use a custom StreamingListener that re-reads the configuration file
in the onBatchStarted event (see the sketch after this list).
3. When I use the variable, I found its value is not updated inside the
windowed stream, so for now I re-read the configuration file at the
point of use.
4. The rule file currently lives on a local path, so I have to copy it
to every worker.
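
Here is a minimal sketch of steps 1 and 2, assuming the parseRule helper
above (RuleStore, RuleReloadListener, and the file path are placeholders
of mine). The listener runs on the driver, which I suspect is why the
workers never see the update:

import scala.io.Source
import org.apache.spark.streaming.scheduler.{StreamingListener, StreamingListenerBatchStarted}

// Static holder for the current rules (step 1).
object RuleStore {
  @volatile var rules: Map[String, LogRule] = Map.empty
}

// Listener that re-reads the rule file at the start of each batch (step 2).
class RuleReloadListener(rulePath: String) extends StreamingListener {
  override def onBatchStarted(batchStarted: StreamingListenerBatchStarted): Unit = {
    val src = Source.fromFile(rulePath)
    try {
      RuleStore.rules = src.getLines()
        .filter(_.trim.nonEmpty)
        .map(parseRule)
        .map(r => r.app -> r)
        .toMap
    } finally src.close()
  }
}

// Registered on the driver:
// ssc.addStreamingListener(new RuleReloadListener("/path/to/rules.conf"))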
What's the best practice for this case?
Thanks for your help. I'm new to Spark Streaming and don't yet fully
understand how it works under the hood.
Best Regards,
Evan Yao