Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19695#discussion_r149651615
  
    --- Diff: dev/create-release/release-build.sh ---
    @@ -130,6 +130,10 @@ else
       fi
     fi
     
    +LSOF=lsof
    +if ! hash $LSOF 2>/dev/null; then
    --- End diff ---
    
    I am not sure, but I found `hash` is used in a few more places than `command` within Spark:
    
    ```bash
    grep -r "hash" . | grep ">/dev/null"
    ```
    
    ```
    ./dev/run-pip-tests:if hash virtualenv 2>/dev/null && [ ! -n "$USE_CONDA" ]; then
    ./dev/run-pip-tests:  if hash python2 2>/dev/null; then
    ./dev/run-pip-tests:  elif hash python 2>/dev/null; then
    ./dev/run-pip-tests:  if hash python3 2>/dev/null; then
    ./dev/run-pip-tests:elif hash conda 2>/dev/null; then
    ./dev/run-pip-tests:if ! hash pip 2>/dev/null; then
    ./sql/create-docs.sh:if ! hash python 2>/dev/null; then
    ./sql/create-docs.sh:if ! hash mkdocs 2>/dev/null; then
    ```
    
    In contrast, it looks like:
    
    ```bash
    grep -r "command" . | grep ">/dev/null"
    ```
    
    is not used in Spark, but I am not sure.
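
    For reference, a minimal sketch of the two styles of existence check being discussed, using `lsof` as the command to check (as in the diff above); both should behave the same for this purpose, the `echo` bodies are just illustrative:

    ```bash
    # Check with the bash builtin `hash` (also caches the command's path)
    if ! hash lsof 2>/dev/null; then
      echo "lsof not found" >&2
    fi

    # Equivalent check with the POSIX `command -v` (prints the resolved path if found)
    if ! command -v lsof >/dev/null 2>&1; then
      echo "lsof not found" >&2
    fi
    ```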

