FYI, this is happening today. Users may see slowness and paused jobs. We
will send a note when the upgrade is complete.

Thanks,

Nuria

On Thu, Apr 5, 2018 at 1:22 PM, Andrew Otto <o...@wikimedia.org> wrote:

> Hi all!
>
> I just upgraded spark2 across the cluster to Spark 2.3.0
> <https://spark.apache.org/releases/spark-release-2-3-0.html>.  If you are
> using the pyspark2, spark2-*, etc. executables, you will now be using
> Spark 2.3.0.
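>
> If you want to double-check which version a given job picks up, here is a
> minimal sketch using the standard PySpark API (the app name below is just
> a placeholder):
>
>     # In a pyspark2 shell, `spark` is the pre-built SparkSession:
>     print(spark.version)  # should print 2.3.0 after this upgrade
>
>     # Or from a script submitted with spark2-submit:
>     from pyspark.sql import SparkSession
>     spark = SparkSession.builder.appName("version-check").getOrCreate()
>     print(spark.version)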
>
> We are moving towards making Spark 2 the default Spark for all Analytics
> production jobs.  We don’t have a deprecation plan for Spark 1 yet, so you
> should be able to continue using Spark 1 for the time being.  However, in
> order to support large YARN Spark 2 jobs, we need to upgrade the default
> YARN Spark Shuffle Service to Spark 2.  This means that large Spark 1 jobs
> may no longer work properly.  We don’t know of any large productionized
> Spark 1 jobs other than the ones the Analytics team manages, but if you
> have any that you are worried about, please let us know ASAP.
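>
> For context: the shuffle service holds shuffle data on behalf of
> executors so that dynamic allocation can remove them safely, which is why
> a Spark 2 service can break large Spark 1 jobs. As a rough sketch, these
> are the standard upstream settings a Spark 2 job would use (the cluster
> defaults may already set them; the app name is a placeholder):
>
>     from pyspark.sql import SparkSession
>
>     spark = (SparkSession.builder
>              .appName("shuffle-service-example")
>              # Standard Spark config keys for the external shuffle
>              # service and dynamic allocation:
>              .config("spark.shuffle.service.enabled", "true")
>              .config("spark.dynamicAllocation.enabled", "true")
>              .getOrCreate())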
>
> -Andrew & Analytics Engineering
>
