nchammas commented on a change in pull request #27912: [SPARK-31155] Enable pydocstyle tests
URL: https://github.com/apache/spark/pull/27912#discussion_r392639239
##########
File path: dev/requirements.txt
##########
@@ -1,5 +1,8 @@
-flake8==3.5.0
+pycodestyle==2.5.0

Review comment:
   I always work on Spark from within a dedicated Python virtual environment, so any requirements specified there do not affect my work on any other project. Working in independent virtual environments is a habit of mine; perhaps that's why I don't consider pinning versions a disruption to developer environments.

   As for common convention in the Python community, there is a shift among some high-profile projects towards using tools like pip-tools, pipenv, or poetry to completely specify and pin all direct and transitive dependencies. Here are a couple of examples that use pip-sync (a rough sketch of that workflow follows the links):

   * Warehouse (the software behind PyPI): https://github.com/pypa/warehouse/blob/master/requirements/tests.txt
   * Trio (next-gen Python async library): https://github.com/python-trio/trio/blob/master/test-requirements.txt
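   For reference, a minimal sketch of the pip-tools workflow those projects follow. The file names and package list here are illustrative only, not something this PR adds:

       # requirements.in lists only the direct dev dependencies, unpinned
       $ cat requirements.in
       pycodestyle
       pydocstyle

       # pip-compile resolves the dependency tree and writes a fully pinned
       # requirements.txt, including all transitive dependencies
       $ pip-compile --output-file requirements.txt requirements.in

       # pip-sync then installs exactly that pinned set into the active
       # virtual environment, removing anything not listed
       $ pip-sync requirements.txt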
