GitHub user kiszk opened a pull request:
https://github.com/apache/spark/pull/19998
[SPARK-22813][BUILD] Use lsof or /usr/sbin/lsof in run-tests.py
## What changes were proposed in this pull request?
In [an environment where `/usr/sbin/lsof` does not
exist](https://github.com/apache/spark/pull/19695#issuecomment-342865001),
running `./dev/run-tests.py` for `maven` fails with the error shown below. This is
because the current `./dev/run-tests.py` checks only for `/usr/sbin/lsof` and
aborts immediately when that path does not exist.
This PR changes the script as follows (see the sketch after the error log below):
1. Check whether `lsof` exists on the `PATH` or at `/usr/sbin/lsof`
2. Go forward even if neither of them exists
```
/bin/sh: 1: /usr/sbin/lsof: not found
Usage:
 kill [options] <pid> [...]

Options:
 <pid> [...]            send signal to every <pid> listed
 -<signal>, -s, --signal <signal>
                        specify the <signal> to be sent
 -l, --list=[<signal>]  list all signal names, or convert one to a name
 -L, --table            list all signal names in a nice table
 -h, --help             display this help and exit
 -V, --version          output version information and exit

For more details see kill(1).
Traceback (most recent call last):
  File "./dev/run-tests.py", line 626, in <module>
    main()
  File "./dev/run-tests.py", line 597, in main
    build_apache_spark(build_tool, hadoop_version)
  File "./dev/run-tests.py", line 389, in build_apache_spark
    build_spark_maven(hadoop_version)
  File "./dev/run-tests.py", line 329, in build_spark_maven
    exec_maven(profiles_and_goals)
  File "./dev/run-tests.py", line 270, in exec_maven
    kill_zinc_on_port(zinc_port)
  File "./dev/run-tests.py", line 258, in kill_zinc_on_port
    subprocess.check_call(cmd, shell=True)
  File "/usr/lib/python2.7/subprocess.py", line 541, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '/usr/sbin/lsof -P |grep 3156 | grep LISTEN | awk '{ print $2; }' | xargs kill' returned non-zero exit status 123
```
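The gist of the change, as a minimal sketch (not the PR's exact diff; `find_lsof` is a hypothetical helper, and the port-killing command mirrors the one in the traceback above):
```python
import os
import subprocess


def find_lsof():
    """Return a usable lsof binary: one found on PATH, else /usr/sbin/lsof, else None."""
    for directory in os.environ.get("PATH", "").split(os.pathsep) + ["/usr/sbin"]:
        candidate = os.path.join(directory, "lsof")
        if os.path.isfile(candidate) and os.access(candidate, os.X_OK):
            return candidate
    return None


def kill_zinc_on_port(zinc_port):
    """Kill the zinc server listening on zinc_port; skip the step when lsof is unavailable."""
    lsof_exe = find_lsof()
    if lsof_exe is None:
        return  # neither `lsof` on PATH nor /usr/sbin/lsof exists; go forward anyway
    cmd = ("%s -P | grep %s | grep LISTEN | awk '{ print $2; }' | xargs kill"
           % (lsof_exe, zinc_port))
    subprocess.check_call(cmd, shell=True)
```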
## How was this patch tested?
Manually tested.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/kiszk/spark SPARK-22813
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/19998.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #19998
----
commit 969bc227f255d721044e057da633c5f2becca2af
Author: Kazuaki Ishizaki <[email protected]>
Date: 2017-12-16T02:14:14Z
initial commit
----