When I launched spark-shell using spark-shell --master local[2], the
behaviour was the same: no output on the console, only the batch timestamps.

When I did lines.saveAsTextFiles("hdfslocation", suffix),
I only get empty 0-byte files on HDFS.
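
For reference, here is a minimal sketch of the full sequence described above, assuming the local[2] launch; the hostname, port and HDFS output prefix are placeholders rather than the real values:

// Launched with: bin/spark-shell --master local[2]
import org.apache.spark.streaming.{Seconds, StreamingContext}

// sc is the SparkContext that spark-shell already provides
val ssc = new StreamingContext(sc, Seconds(1))

// Placeholder host/port; nc -lk 7777 should be listening on that host
val lines = ssc.socketTextStream("hostname", 7777)

// Prints the first 10 records of every batch to the driver console
lines.print()

// Placeholder prefix/suffix; one output directory is written per batch interval
lines.saveAsTextFiles("hdfs:///tmp/streaming/lines", "txt")

ssc.start()
ssc.awaitTermination()

With local[2], one core goes to the socket receiver and one to batch processing, so if the output is still empty the receiver is most likely not getting any data from port 7777.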

On Wed, Apr 15, 2015 at 12:46 PM, Akhil Das <ak...@sigmoidanalytics.com>
wrote:

> Just make sure you have at least 2 cores available for processing. You can
> try launching it in local[2] and make sure it's working fine.
>
> Thanks
> Best Regards
>
> On Tue, Apr 14, 2015 at 11:41 PM, Shushant Arora <
> shushantaror...@gmail.com> wrote:
>
>> Hi
>>
>> I am running a Spark Streaming application, but nothing is getting
>> printed on the console.
>>
>> I am doing
>> 1. bin/spark-shell --master clusterMgrUrl
>> 2. import org.apache.spark.streaming.StreamingContext
>> import org.apache.spark.streaming.StreamingContext._
>> import org.apache.spark.streaming.dstream.DStream
>> import org.apache.spark.streaming.Duration
>> import org.apache.spark.streaming.Seconds
>> val ssc = new StreamingContext(sc, Seconds(1))
>> val lines = ssc.socketTextStream("hostname",7777)
>> lines.print()
>> ssc.start()
>> ssc.awaitTermination()
>>
>> Jobs are getting created when I look at the web UI, but nothing gets
>> printed on the console.
>>
>> I have started an nc script on hostname port 7777, and from another
>> console I can see the messages typed on that port.
>>
>> Please let me know if I am doing something wrong.
>>
>
