Github user ambauma commented on the issue:
https://github.com/apache/spark/pull/19528
I'm unable to duplicate the PySpark failures locally. I assume I need a
specific version of SciPy to duplicate the error. Is there a way I could find out
which package versions the build server is running? Something like:
`sorted(["%s==%s" % (i.key, i.version) for i in
pip.get_installed_distributions()])` (with `import pip`) for Python and Python 3.4?
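(Aside: `pip.get_installed_distributions()` is an internal pip API and was removed in pip 10; on modern Python a sketch of the same "name==version" listing, assuming Python 3.8+ with `importlib.metadata`, would be:

```python
from importlib import metadata

# Build a sorted list of "name==version" strings for every
# installed distribution, like the output of `pip freeze`.
installed = sorted(
    "%s==%s" % (dist.metadata["Name"], dist.version)
    for dist in metadata.distributions()
)
print("\n".join(installed))
```

Running the same snippet under each interpreter on the build server would show both environments.)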
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]