Github user holdenk commented on the issue:
https://github.com/apache/spark/pull/15821
@shaneknapp your understanding of what the run-pip-tests code does is pretty
much correct. It's important to note that part of the test is installing the
pyspark package itself to make sure we didn't break the packaging; pyarrow is
only installed because we want to be able to run some pyarrow tests with it --
we don't need that to be part of the packaging tests, and in fact it would be
simpler to have it be part of the normal tests.
So one possible approach to fix this, I think, would be: update conda on the
machines, since the installed version is old; install pyarrow into the py3k
worker env; and then take the pyarrow tests out of the packaging test and run
them in the normal flow instead.
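The steps above might look roughly like this on each worker -- a sketch only, assuming conda manages the worker envs (the env name `py3k` comes from the comment above; exact flags and paths may differ per machine):

```shell
# 1. Update conda itself, since the installed version is old.
conda update -y conda

# 2. Install pyarrow into the py3k worker env so the normal Python
#    tests can exercise the pyarrow code paths.
conda install -y -n py3k pyarrow

# 3. With pyarrow available in the normal env, the pyarrow tests can
#    run as part of the regular python test flow instead of the
#    pip packaging tests.
```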
I'm not super sure this is a cert issue per se; it seems that newer
versions of conda are working fine (it's possible the SSL lib is slightly out
of date and not understanding wildcards, or something else in the cert).
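One quick way to check that theory on a worker would be something like the following (a diagnostic sketch, not part of any Spark script): it reports which SSL library the Python on that machine links against, and confirms the default context actually enforces hostname/certificate checks, which is what a wildcard-cert problem would trip over.

```python
import ssl

# Report the SSL library this Python links against; a very old library
# on the Jenkins workers could explain failing wildcard-cert validation.
print(ssl.OPENSSL_VERSION)

# A default context enforces certificate and hostname verification --
# the checks a bad or mis-handled wildcard cert would fail.
ctx = ssl.create_default_context()
print(ctx.check_hostname, ctx.verify_mode == ssl.CERT_REQUIRED)
```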