GitHub user devaraj-kavali opened a pull request:

    [SPARK-21962][CORE] Distributed Tracing in Spark

    ## What changes were proposed in this pull request?
    This PR integrates Spark with HTrace: it sends traces for the application and 
its tasks when span receivers are configured. The trace configurations can be 
supplied alongside Spark configurations by adding the prefix 'spark.htrace.' to the 
HTrace configuration keys, for example:
    `spark.htrace.htraced.receiver.address`     IP:PORT
    `spark.htrace.local.file.span.receiver.path`        /path/local-span-file
    `spark.htrace.sampler.classes`      org.apache.htrace.core.AlwaysSampler
    It also provides an additional configuration to receive the parent span 
with the config name ``; if the `` 
configuration exists, it is taken as the parent span, otherwise a new 
span is started for each application.
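    As a rough illustration (not the PR's actual code, and the key names below are examples from this description), the prefix convention amounts to stripping 'spark.htrace.' from matching Spark configuration keys before handing them to HTrace:

```python
# Hypothetical sketch of the 'spark.htrace.' prefix convention described
# above; the real implementation lives in Spark core, not here.

SPARK_HTRACE_PREFIX = "spark.htrace."

def extract_htrace_conf(spark_conf):
    """Return only the entries carrying the prefix, with the prefix removed."""
    return {
        key[len(SPARK_HTRACE_PREFIX):]: value
        for key, value in spark_conf.items()
        if key.startswith(SPARK_HTRACE_PREFIX)
    }

spark_conf = {
    "spark.app.name": "my-app",  # untouched: no spark.htrace. prefix
    "spark.htrace.htraced.receiver.address": "10.0.0.1:9075",
    "spark.htrace.sampler.classes": "org.apache.htrace.core.AlwaysSampler",
}

# Yields {'htraced.receiver.address': ..., 'sampler.classes': ...}
print(extract_htrace_conf(spark_conf))
```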
    ## How was this patch tested?
    I have verified this using the existing tests plus the added test, and also 
verified it manually in each of the deployment modes below, with different tracers 
both individually and together.
    1. Local and local-cluster
    2. Standalone Client and Cluster modes
    3. Yarn Client and Cluster modes
    4. Mesos Client and Cluster modes

You can merge this pull request into a Git repository by running:

    $ git pull SPARK-21962

Alternatively you can review and apply these changes as the patch at:

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #21071
commit 254e4ed38411d45cc8c2ba8cdace069da219c359
Author: Devaraj K <devaraj@...>
Date:   2018-04-14T00:06:36Z

    [SPARK-21962][CORE] Distributed Tracing in Spark


