Github user brennonyork commented on the pull request:
https://github.com/apache/spark/pull/5096#issuecomment-91028275
@shivaram a few things after looking at the build code some more...
1. The timeout value comes from the line [here in
`dev/run-tests-jenkins`](https://github.com/apache/spark/blob/master/dev/run-tests-jenkins#L50).
It's currently set at 120 minutes and **doesn't** include the time it takes for
PRs to be tested against the master branch (i.e. for dependencies). We could
certainly raise that value, but since I assume the `dev/run-tests` script on
this PR runs all the new SparkR tests (plus any additional core Spark tests
you've added), I'd ask that you run `dev/run-tests` locally and update the
timeout in `dev/run-tests-jenkins` on this PR by whatever additional time is
needed. The impetus for running locally first is that I'd much rather get a
baseline for how long all the new tests take, then add 15ish minutes of
headroom, than throw a number into the wind.
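As a rough sketch of the adjustment above: the variable name `TESTS_TIMEOUT` matches the one in `dev/run-tests-jenkins`, but the helper below is purely illustrative for computing "local baseline plus headroom", not code from the script itself.

```shell
#!/usr/bin/env bash
# Hypothetical sketch: bump a "<minutes>m"-style timeout by a headroom value.
# TESTS_TIMEOUT mirrors the variable in dev/run-tests-jenkins; bump_timeout
# is an illustrative helper, not part of the actual script.

TESTS_TIMEOUT="120m"  # current Jenkins value; excludes the dependency build

# Strip the trailing "m", add the extra minutes, and re-append the unit.
bump_timeout() {
  local current_minutes="${1%m}"
  local extra_minutes="$2"
  echo "$((current_minutes + extra_minutes))m"
}

bump_timeout "$TESTS_TIMEOUT" 15   # prints "135m"
```

To get the baseline itself, something like `time ./dev/run-tests` locally would give the wall-clock number to add the headroom to.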
2. Completely agree we should get some timing metrics for the various PR
tests (thanks for the idea!). I'll generate a JIRA for that and take a look
soon. That said, just to reiterate, those tests **are not** preventing the
actual Spark test suite from finishing unless Jenkins has some deeper timing
hooks than I'm aware of. I assume the slowdown is merely a factor of the large
corpus of tests likely added in this PR.
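For the timing-metrics idea, a minimal sketch of what per-phase instrumentation could look like in the PR builder; the phase name, the `run_timed` helper, and the `[timing]` report format are all illustrative assumptions, not Spark's actual instrumentation.

```shell
#!/usr/bin/env bash
# Hypothetical sketch: wrap each test phase to report its wall-clock duration.
# run_timed and the "[timing]" output format are illustrative only.

run_timed() {
  local phase="$1"; shift
  local start end
  start=$(date +%s)
  "$@"                     # run the phase's command with its arguments
  local status=$?
  end=$(date +%s)
  echo "[timing] ${phase}: $((end - start))s (exit ${status})"
  return "$status"
}

# Example usage; a real build would wrap phases like the SparkR test run.
run_timed "example-phase" sleep 1
```

Aggregating these lines across builds would give the per-phase metrics without adding any timing hooks inside Jenkins itself.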