Re: Question on Spark shell

2016-07-11 Thread Sivakumaran S
That was my bad with the title. I am getting that output when I run my application, both from the IDE and in the console. I want the server logs themselves displayed in the terminal from where I start the server. Right now, running the command ‘start-master.sh’ just returns the prompt. I want
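Since `start-master.sh` daemonizes the master and immediately returns the prompt, one common workaround (a sketch, assuming the default standalone log layout under `$SPARK_HOME/logs`; the exact file-name pattern is an assumption) is to follow the log file the script writes:

```shell
# start-master.sh backgrounds the master and prints the path of the log
# file it is writing to; tail -f that file to watch INFO/WARN output live.
# The glob below matches the default standalone naming convention.
$SPARK_HOME/sbin/start-master.sh
tail -f $SPARK_HOME/logs/spark-*-org.apache.spark.deploy.master.Master-*.out
```

This keeps the daemonized startup but streams the same log lines to the current terminal.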

Re: Question on Spark shell

2016-07-11 Thread Anthony May
I see. The title of your original email was "Spark Shell", which is a Spark REPL environment based on the Scala shell, which is why I misunderstood you. You should see the same output when starting the application on the console. Are you not seeing any output? On Mon, 11 Jul 2016 at 11:55 Sivakumaran S

Re: Question on Spark shell

2016-07-11 Thread Sivakumaran S
I am running a Spark Streaming application written in Scala in the IntelliJ IDE. I can see the Spark output in the IDE itself (aggregation and so on). I want the Spark server logging (INFO, WARN, etc.) to be displayed on screen when I start the master in the console. For example, when I start a Kafka

Re: Question on Spark shell

2016-07-11 Thread Anthony May
Starting the Spark shell gives you a SparkContext to play with straight away. The output is printed to the console. On Mon, 11 Jul 2016 at 11:47 Sivakumaran S wrote: > Hello, > > Is there a way to start the spark server with the log output piped to > screen? I am currently
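As Anthony describes, the Spark shell logs straight to the console. A minimal sketch of launching it against a local standalone master (the `spark://localhost:7077` URL is an illustrative default, not taken from the thread):

```shell
# Launch the Spark REPL; log lines (INFO, WARN, ...) print directly to
# this terminal, interleaved with the scala> prompt.
$SPARK_HOME/bin/spark-shell --master spark://localhost:7077
```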

Question on Spark shell

2016-07-11 Thread Sivakumaran S
Hello, Is there a way to start the Spark server with the log output piped to the screen? I am currently running Spark in standalone mode on a single machine. Regards, Sivakumaran
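One way to get the behaviour asked for here (a sketch, not an answer given in the thread) is to skip the daemonizing `start-master.sh` wrapper and launch the master class in the foreground with `spark-class`, so its log output stays attached to the terminal:

```shell
# Run the standalone master in the foreground; its log4j output
# (INFO, WARN, ...) goes to this terminal instead of a file under
# $SPARK_HOME/logs. Host and ports are illustrative defaults.
$SPARK_HOME/bin/spark-class org.apache.spark.deploy.master.Master \
  --host localhost --port 7077 --webui-port 8080
```

Stopping it is then just Ctrl-C, since there is no background daemon to shut down.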