It is not even an old “cluster”. It is a central metastore shared by multiple clusters.
On Wed, Jan 23, 2019 at 10:04 AM Dongjoon Hyun <dongjoon.h...@gmail.com> wrote:

> Got it. Thank you for sharing that, Reynold.
>
> So, you mean they will use `Apache Spark 3.0.0` on the old clusters with
> Hive 0.x, right?
>
> If that happens, there is no problem keeping them.
>
> Bests,
> Dongjoon.
>
>
> On Tue, Jan 22, 2019 at 11:49 PM Xiao Li <gatorsm...@gmail.com> wrote:
>
>> Based on my experience developing Spark SQL, the maintenance cost of
>> supporting different versions of the Hive metastore is very small. Feel
>> free to ping me if we hit any issue with it.
>>
>> Cheers,
>>
>> Xiao
>>
>> On Tue, Jan 22, 2019 at 11:18 PM, Reynold Xin <r...@databricks.com> wrote:
>>
>>> Actually, a non-trivial fraction of the users/customers I interact with
>>> still use very old Hive metastores, because it is very difficult to
>>> upgrade a Hive metastore wholesale (it would require all the production
>>> jobs that access the same metastore to be upgraded at once). This is even
>>> harder than a JVM upgrade, which can be done on a per-job basis, or an OS
>>> upgrade, which can be done on a per-machine basis.
>>>
>>> Is there a high maintenance cost in keeping these? My understanding is
>>> that Michael did such a good job initially with classloader isolation and
>>> modular design that they are very easy to maintain.
>>>
>>> On Jan 22, 2019, at 11:13 PM, Hyukjin Kwon <gurwls...@gmail.com> wrote:
>>>
>>> Yea, I was thinking about that too. They are too old to keep. +1 for
>>> removing them.
>>>
>>> On Wed, Jan 23, 2019 at 11:30 AM, Dongjoon Hyun <dongjoon.h...@gmail.com> wrote:
>>>
>>>> Hi, All.
>>>>
>>>> Currently, Apache Spark supports Hive Metastore (HMS) 0.12 ~ 2.3.
>>>> Among them, the HMS 0.x releases look very old since we are in 2019.
>>>> If these are no longer used in production, can we drop HMS 0.x
>>>> support in 3.0.0?
>>>>
>>>>   hive-0.12.0   2013-10-10
>>>>   hive-0.13.0   2014-04-15
>>>>   hive-0.13.1   2014-11-16
>>>>   hive-0.14.0   2014-11-16
>>>>   ( https://archive.apache.org/dist/hive/ )
>>>>
>>>> In addition, if there is someone who is still using these HMS versions
>>>> and has a plan to install and use Spark 3.0.0 with them, could you
>>>> reply to this email thread? If there is a reason, that would be very
>>>> helpful for me.
>>>>
>>>> Thanks,
>>>> Dongjoon.
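For context on the support matrix discussed above: the Hive metastore version Spark talks to is selected per application through configuration, not at Spark build time, which is why one Spark release can serve clusters pinned to different HMS versions. A minimal sketch of that configuration (the property names `spark.sql.hive.metastore.version` and `spark.sql.hive.metastore.jars` are Spark's real settings; the version value shown is only an illustrative example):

```properties
# spark-defaults.conf (sketch): point a Spark application at a specific
# Hive metastore version. With "maven", Spark downloads the matching
# Hive client jars; a local classpath can be supplied instead.
spark.sql.hive.metastore.version  1.2.1
spark.sql.hive.metastore.jars     maven
```

Because the choice is per application, dropping 0.x support in Spark 3.0.0 would only affect jobs whose `spark.sql.hive.metastore.version` points at a 0.x metastore.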