+1, this was indeed a problem in the past.

On Sun, 15 Jul 2018, 22:56 Reynold Xin, <r...@databricks.com> wrote:

> Makes sense. Thanks for looking into this.
>
> On Sun, Jul 15, 2018 at 1:51 PM Sean Owen <sro...@gmail.com> wrote:
>
>> Yesterday I cleaned out old Spark releases from the mirror system --
>> we're supposed to only keep the latest release from active branches out on
>> mirrors. (All releases are available from the Apache archive site.)
>>
>> Having done so I realized quickly that the
>> HiveExternalCatalogVersionsSuite relies on the versions it downloads being
>> available from mirrors. It has been flaky, as sometimes mirrors are
>> unreliable. I think the suite will now fail for any version except 2.3.1,
>> 2.2.2, and 2.1.3.
>>
>> Because we need to clean those releases out of the mirrors soon anyway,
>> and because mirrors are sometimes unreliable, I propose adding logic to
>> the test to fall back to downloading from the Apache archive site.
>>
>> ... and I'll do that right away to unblock
>> HiveExternalCatalogVersionsSuite runs. I think it needs to be backported to
>> other branches as they will still be testing against potentially
>> non-current Spark releases.
>>
>> Sean
>>
>
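The fallback Sean describes might be sketched roughly as below. This is a minimal illustration only, with hypothetical URL patterns and file names, not the suite's actual Scala code: try the mirror-redirect endpoint first, and fall back to the Apache archive if that download fails.

```shell
# Hypothetical sketch of a mirror-then-archive download fallback.
# The URL patterns and artifact name below are illustrative assumptions.
candidate_urls() {
  version="$1"
  filename="spark-${version}-bin-hadoop2.7.tgz"
  # Preferred: the mirror system (resolves to a nearby mirror).
  echo "https://www.apache.org/dyn/closer.lua/spark/spark-${version}/${filename}?action=download"
  # Fallback: the Apache archive, which keeps every release.
  echo "https://archive.apache.org/dist/spark/spark-${version}/${filename}"
}

download_release() {
  # Try each candidate URL in order; stop at the first success.
  for url in $(candidate_urls "$1"); do
    curl -fsSL -O "$url" && return 0
  done
  return 1
}
```

Since old releases are removed from mirrors but kept on the archive forever, the archive fallback keeps the test working for non-current versions at the cost of a slower download.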
