The Ganglia module has only 2 files.
In addition to dropping it, we could choose one of the following two ways
to keep supporting it partially,
like `kafka-0.8`, which Apache Spark supports in Scala 2.11 only.

   1. We can stick to `dropwizard 3.x` for JDK8 (by default) and use
`dropwizard 4.x` for the `hadoop-3.2` profile only (see the rough POM
sketch after this list).
   2. If we upgrade to `dropwizard 4.x` completely, we can make the
Ganglia module an external package (with `dropwizard 3.x`) for Apache
Spark 3.0 on JDK8.
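
For option 1, the idea would look roughly like the following in the Maven
build. This is only a sketch: the property name `codahale.metrics.version`
and the exact version numbers are assumptions, not verified against the
current pom.xml.

<!-- Default: dropwizard 3.x for JDK8 builds.
     (Property name and versions are illustrative.) -->
<properties>
  <codahale.metrics.version>3.2.6</codahale.metrics.version>
</properties>

<profiles>
  <!-- Override to dropwizard 4.x only when -Phadoop-3.2 is active. -->
  <profile>
    <id>hadoop-3.2</id>
    <properties>
      <codahale.metrics.version>4.1.1</codahale.metrics.version>
    </properties>
  </profile>
</profiles>

For option 2, users would then pull the externalized module in themselves,
e.g. with something like
`--packages org.apache.spark:spark-ganglia-lgpl_2.12:3.0.0`
(coordinates illustrative, not a published artifact).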

$ tree .
.
├── pom.xml
└── src
    └── main
        └── scala
            └── org
                └── apache
                    └── spark
                        └── metrics
                            └── sink
                                └── GangliaSink.scala

-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Scala                            1             20             17             59
Maven                            1              4             17             27
-------------------------------------------------------------------------------
SUM:                             2             24             34             86
-------------------------------------------------------------------------------
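
For reference, that single Scala file essentially boils down to wiring
like this (a simplified sketch, not the exact Spark source). It bridges a
dropwizard `MetricRegistry` to a `GangliaReporter` from the
`metrics-ganglia` artifact, and that artifact is exactly what no longer
exists in `dropwizard 4.x`.

import java.util.concurrent.TimeUnit

import com.codahale.metrics.MetricRegistry
// metrics-ganglia exists only for dropwizard 3.x; it was removed in 4.x.
import com.codahale.metrics.ganglia.GangliaReporter
import info.ganglia.gmetric4j.gmetric.GMetric
import info.ganglia.gmetric4j.gmetric.GMetric.UDPAddressingMode

object GangliaSinkSketch {
  // Build and start a reporter that pushes all registered metrics
  // to a Ganglia gmond endpoint over UDP.
  def start(registry: MetricRegistry, host: String, port: Int): GangliaReporter = {
    val ganglia = new GMetric(host, port, UDPAddressingMode.MULTICAST, 1)
    val reporter = GangliaReporter.forRegistry(registry)
      .convertRatesTo(TimeUnit.SECONDS)
      .convertDurationsTo(TimeUnit.MILLISECONDS)
      .build(ganglia)
    reporter.start(10, TimeUnit.SECONDS)
    reporter
  }
}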

Bests,
Dongjoon.


On Wed, Oct 30, 2019 at 6:18 PM Sean Owen <sro...@gmail.com> wrote:

> I wanted to raise this to dev@.
>
> So, updating dropwizard metrics from 3.2.x to 4.x might be important for
> JDK 11 support. Our tests pass as-is without this update. But we don't test
> some elements of this metrics support, like Ganglia integration. And I have
> heard reports that downstream custom usages of dropwizard 3.2.x don't
> work on JDK 11.
>
> The bad news is that the Ganglia integration doesn't exist anymore in 4.x.
> And we have a whole custom module for that integration with Spark.
>
> My question is: how much do we need to keep Ganglia integration in Spark
> 3.x? I think it does have some users. We can keep it as is and hope it
> works out in JDK 11, or consider dropping this module.
>
>
> ---------- Forwarded message ---------
> From: Apache Spark QA <notificati...@github.com>
> Date: Wed, Oct 30, 2019 at 6:56 PM
> Subject: Re: [apache/spark] [SPARK-29674][CORE] Update dropwizard metrics
> to 4.1.x for JDK 9+ (#26332)
> To: apache/spark <sp...@noreply.github.com>
> Cc: Sean Owen <sro...@gmail.com>, Assign <ass...@noreply.github.com>
>
>
> *Test build #112974 has started
> <https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/112974/testReport>*
> for PR 26332 at commit aefde48
> <https://github.com/apache/spark/commit/aefde48a30942f94670f634cbeab98e23749c283>
> .
>
