I'd suggest using the slf4j APIs directly.  They provide a nice stable API
that works with a variety of logging backends.  This is what Spark does
internally.
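
For example, here is a minimal sketch of using slf4j directly inside a custom receiver (the class name, messages, and storage level are illustrative, not from Spark; it assumes `slf4j-api` is on the classpath, which it is whenever you depend on Spark):

```scala
import org.slf4j.LoggerFactory
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

class MyReceiver extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) {

  // Logger instances aren't serializable, so create the logger lazily
  // on the executor instead of capturing it from the driver.
  @transient private lazy val log = LoggerFactory.getLogger(getClass)

  override def onStart(): Unit = {
    log.info("Starting custom receiver")
    // spawn the thread that reads data and calls store(...) here
  }

  override def onStop(): Unit = {
    log.info("Stopping custom receiver")
  }
}
```

Since slf4j is just a facade, these messages are routed to whatever backend is on the classpath at runtime (log4j, in a stock Spark 2.0 deployment), so they show up in the normal executor logs with no extra configuration.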

On Sun, Jun 26, 2016 at 4:02 AM, Paolo Patierno <ppatie...@live.com> wrote:

> Yes ... the same here ... I'd like to know the best way for adding logging
> in a custom receiver for Spark Streaming 2.0
>
> *Paolo Patierno*
>
> *Senior Software Engineer (IoT) @ Red Hat*
> *Microsoft MVP on Windows Embedded & IoT*
> *Microsoft Azure Advisor*
>
> Twitter : @ppatierno <http://twitter.com/ppatierno>
> Linkedin : paolopatierno <http://it.linkedin.com/in/paolopatierno>
> Blog : DevExperience <http://paolopatierno.wordpress.com/>
>
>
> ------------------------------
> From: jonathaka...@gmail.com
> Date: Fri, 24 Jun 2016 20:56:40 +0000
> Subject: Re: Logging trait in Spark 2.0
> To: yuzhih...@gmail.com; ppatie...@live.com
> CC: user@spark.apache.org
>
>
> Ted, how is that thread related to Paolo's question?
>
> On Fri, Jun 24, 2016 at 1:50 PM Ted Yu <yuzhih...@gmail.com> wrote:
>
> See this related thread:
>
>
> http://search-hadoop.com/m/q3RTtEor1vYWbsW&subj=RE+Configuring+Log4J+Spark+1+5+on+EMR+4+1+
>
> On Fri, Jun 24, 2016 at 6:07 AM, Paolo Patierno <ppatie...@live.com>
> wrote:
>
> Hi,
>
> While developing a Spark Streaming custom receiver, I noticed that the
> Logging trait isn't accessible anymore in Spark 2.0:
>
> trait Logging in package internal cannot be accessed in package
> org.apache.spark.internal
>
> For developing a custom receiver, what is the preferred way to do logging?
> Just using a log4j dependency, as in any other Java/Scala
> library/application?
>
> Thanks,
> Paolo
>
>
>
>
