Hi Prabeesh,
Do an `export _JAVA_OPTIONS=-Xmx10g` before starting Shark. You can also
run `ps aux | grep shark` to see how much memory it has actually been
allocated; most likely it is 512 MB, in which case increase the limit.
Thanks
Best Regards
On Fri, May 23, 2014 at 10:22 AM, prabeesh k
I'm trying out 1.0 on a set of small Spark Streaming tests and am running
into problems. Here's one of the little programs I've used for a long
time: it reads a Kafka stream that contains Twitter JSON tweets and does
some simple counting. The program starts OK (it connects to the Kafka
stream
Also, one other thing to try: remove all of the logic from inside of
foreach and just print something. It could be that an exception is
somehow being triggered inside of your foreach block, and as a result
the output goes away.
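A minimal sketch of that test, assuming the original program has a DStream named `stream` (the name and the surrounding StreamingContext setup are assumptions, not taken from this thread):

```scala
// Sketch: replace the real foreachRDD body with a bare print, to see
// whether the block is reached at all. If a swallowed exception was the
// problem, the try/catch will surface it instead of silently dropping
// the output.
stream.foreachRDD { rdd =>
  try {
    println(s"got batch with ${rdd.count()} records")
    // restore the original counting logic here once printing works
  } catch {
    case e: Exception => e.printStackTrace()
  }
}
```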
On Fri, May 23, 2014 at 6:00 PM, Patrick Wendell
A few more suggestions.
1. Check the web UI: is the system running any jobs? If not, you may
need to give the system more nodes. Basically, the system should have
more cores than the number of receivers.
2. Furthermore, there is a streaming-specific web UI which gives more
streaming-specific data.
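The cores-versus-receivers point can be made concrete with a sketch (the app name is hypothetical; the key detail is that each receiver permanently occupies one task slot, so with one Kafka receiver you need at least two cores for any batch processing to happen):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Sketch: running locally, ask for strictly more threads than receivers.
// "local[1]" with one receiver would leave zero cores for processing,
// so jobs never run and no output appears.
val conf = new SparkConf()
  .setAppName("KafkaCountTest")  // hypothetical name
  .setMaster("local[2]")         // 1 receiver + at least 1 processing core
val ssc = new StreamingContext(conf, Seconds(1))
```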