I see. The title of your original email was "Spark Shell", which is a Spark
REPL environment based on the Scala shell, so I misunderstood your question.

You should see the same output when starting the application from the console.
Are you not seeing any output?
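
If it is the standalone master's own logging (INFO, WARN, etc.) that you want on
the terminal rather than in $SPARK_HOME/logs, one approach that should work is to
run the master and worker classes in the foreground via spark-class instead of the
start-master.sh/start-slave.sh scripts, which daemonize. A rough sketch (the host
and port below are placeholders for your setup):

  # Master in the foreground; log output goes straight to the terminal
  ./bin/spark-class org.apache.spark.deploy.master.Master --host localhost --port 7077

  # Worker in a second terminal, pointed at that master
  ./bin/spark-class org.apache.spark.deploy.worker.Worker spark://localhost:7077

You can also copy conf/log4j.properties.template to conf/log4j.properties and keep
the default console appender (log4j.rootCategory=INFO, console) so that driver and
executor logging stays on stdout as well.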

On Mon, 11 Jul 2016 at 11:55 Sivakumaran S <siva.kuma...@me.com> wrote:

> I am running a Spark Streaming application using Scala in the IntelliJ
> IDE. I can see the Spark output in the IDE itself (aggregation and stuff).
> I want the Spark server logging (INFO, WARN, etc.) to be displayed on the
> screen when I start the master in the console. For example, when I start a
> Kafka cluster, the prompt is not returned and the debug log is printed to
> the terminal. I want that set up with my Spark server.
>
> I hope that explains my retrograde requirement :)
>
>
>
> On 11-Jul-2016, at 6:49 PM, Anthony May <anthony...@gmail.com> wrote:
>
> Starting the Spark Shell gives you a Spark Context to play with straight
> away. The output is printed to the console.
>
> On Mon, 11 Jul 2016 at 11:47 Sivakumaran S <siva.kuma...@me.com> wrote:
>
>> Hello,
>>
>> Is there a way to start the Spark server with the log output piped to
>> the screen? I am currently running Spark in standalone mode on a single
>> machine.
>>
>> Regards,
>>
>> Sivakumaran
>>
>>
