From a very quick look, I believe that's just an occasional network issue
in AppVeyor. For example, in this case:
  Downloading:
https://repo.maven.apache.org/maven2/org/scala-lang/scala-compiler/2.11.8/scala-compiler-2.11.8.jar
This took roughly 26 mins, and the subsequent jar downloads also seem to
have taken much longer than usual.

FYI, the build usually takes 35 ~ 40 mins and the R tests 25 ~ 30 mins, so
a run usually ends up around 1 hour and 5 mins.
I will take another look at reducing the time if the usual duration
approaches 1 hour and 30 mins (the current AppVeyor limit).
I have done this a few times before - https://github.com/apache/spark/pull/19722
and https://github.com/apache/spark/pull/19816.

The timeout has already been increased from 1 hour to 1 hour and 30 mins.
They still seem unwilling to increase the timeout any further.
I have contacted them a few times and manually requested this.

As a best practice, I believe we usually just rebase rather than merging
the commits in any case, as mentioned in the contribution guide (a rough
sketch of the typical flow is below).
The test failure in the PR should be safe to ignore if it is not directly
related to SparkR.
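
In case it is useful, here is a minimal sketch of that rebase flow,
assuming the apache/spark remote is named 'upstream' and the PR branch is
checked out locally (adjust the remote and branch names to your own setup):

  # Fetch the latest master from apache/spark
  # ('upstream' is an assumed remote name)
  git fetch upstream master

  # Replay the PR's commits on top of the fetched master
  # instead of creating a merge commit
  git rebase upstream/master

  # Rebasing rewrites history, so the PR branch needs a force-push;
  # --force-with-lease is the safer variant
  git push --force-with-lease origin <your-branch>

This keeps the PR's history linear, so the diff does not pick up the whole
batch of commits that came in via the merge.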


Thanks.



2018-05-14 8:45 GMT+08:00 Ilan Filonenko <i...@cornell.edu>:

> Hi dev,
>
> I recently updated an on-going PR [https://github.com/apache/spark/pull/21092]
> with a merge that included a lot of commits from master, and I got the
> following error:
>
> *continuous-integration/appveyor/pr *— AppVeyor build failed
>
> due to:
>
> *Build execution time has reached the maximum allowed time for your plan
> (90 minutes).*
>
> seen here:
> https://ci.appveyor.com/project/ApacheSoftwareFoundation/spark/build/2300-master
>
> As this is the first time I am seeing this, I am wondering whether it is
> related to the large merge and, if so, whether the timeout can be
> increased.
>
> Thanks!
>
> Best,
> Ilan Filonenko
>
