[GitHub] spark issue #23218: [SPARK-26266][BUILD] Update to Scala 2.12.8
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23218 it's building w/the proper java now. from the build log:

```
[EnvInject] - Injecting as environment variables the properties content
JENKINS_MASTER_HOSTNAME=amp-jenkins-master
JAVA_HOME=/usr/java/jdk1.8.0_191
JAVA_7_HOME=/usr/java/jdk1.7.0_79
SPARK_TESTING=1
```

--- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #23218: [SPARK-26266][BUILD] Update to Scala 2.12.8
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23218 ok, PRB builds updated w/the new JAVA_HOME
[GitHub] spark issue #23218: [SPARK-26266][BUILD] Update to Scala 2.12.8
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23218 oh crap forgot about the PRB. updating the config now.

On Fri, Dec 7, 2018 at 2:58 PM Dongjoon Hyun wrote:
> @shaneknapp <https://github.com/shaneknapp> .
> The above run is still using old one. Do we need to retrigger to use JDK8_191?
>
> [info] Building Spark (w/Hive 1.2.1) using SBT with these arguments: -Phadoop-2.7 -Pkubernetes -Phive-thriftserver -Pkinesis-asl -Pyarn -Pspark-ganglia-lgpl -Phive -Pmesos test:package streaming-kinesis-asl-assembly/assembly
> Using /usr/java/jdk1.8.0_60 as default JAVA_HOME.
> Note, this will be overridden by -java-home if it is set.
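A tiny sanity check along these lines could have flagged the stale JDK before a full build ran. The sketch below is a hypothetical helper (not part of Spark's build tooling) that pulls the update number out of a JDK path like `/usr/java/jdk1.8.0_191`:

```python
# Hypothetical helper, not part of Spark's build scripts: compare the JDK
# update number embedded in a JAVA_HOME path against the one Jenkins
# should be injecting.
def jdk_update(java_home):
    # "/usr/java/jdk1.8.0_191" -> 191
    return int(java_home.rsplit("_", 1)[1])

expected = 191
actual = jdk_update("/usr/java/jdk1.8.0_60")  # the stale default seen above
if actual != expected:
    print("stale JDK: update %d, wanted %d" % (actual, expected))
```

A check like this at the top of the build wrapper fails fast instead of surfacing the wrong JVM halfway through a multi-hour run.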
[GitHub] spark issue #23218: [SPARK-26266][BUILD] Update to Scala 2.12.8
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23218 okie dokie, java8 update is done! https://issues.apache.org/jira/browse/SPARK-26282
[GitHub] spark issue #23218: [SPARK-26266][BUILD] Update to Scala 2.12.8
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23218 https://issues.apache.org/jira/browse/SPARK-26282
[GitHub] spark issue #23218: [SPARK-26266][BUILD] Update to Scala 2.12.8
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23218 shouldn't be too hard, but it will require some downtime.

On Wed, Dec 5, 2018 at 5:41 AM Sean Owen wrote:
> Ah OK, so all of them were a JVM crash. It would probably be a good idea to update the JVM on all the workers as _60 is over 3 years old. It's probably not as simple as it sounds but WDYT @shaneknapp <https://github.com/shaneknapp>?
[GitHub] spark issue #23148: [SPARK-26177] Automated formatting for Scala code
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23148

> This is already useful in that we can just ask people to run dev/scalafmt (I'll update developer guides) as the output style looks _also_ just fine. I won't try to have this automatically add the formatting to the build.

yeah, i agree... this might not be the right test to run during PR builds, but should be useful for local development.
[GitHub] spark issue #23148: [SPARK-26177] Automated formatting for Scala code
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23148 sgtm, i'll be more than happy to review once i get into the office. :)
[GitHub] spark issue #23148: [SPARK-26177] Automated formatting for Scala code
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23148 i think the best place for this, imo, is in dev/run-tests.py (and called via run-tests-jenkins.py).

On Thu, Nov 29, 2018 at 6:56 AM Sean Owen wrote:
> Merged to master. @shaneknapp <https://github.com/shaneknapp> I think you can update the PR builder to call this now, if we're ready to give it a spin.
[GitHub] spark pull request #23117: [WIP][SPARK-7721][INFRA] Run and generate test co...
Github user shaneknapp commented on a diff in the pull request: https://github.com/apache/spark/pull/23117#discussion_r236488706

--- Diff: dev/run-tests.py ---
```
@@ -434,6 +434,63 @@ def run_python_tests(test_modules, parallelism):
     run_cmd(command)


+def run_python_tests_with_coverage(test_modules, parallelism):
+    set_title_and_block("Running PySpark tests with coverage report", "BLOCK_PYSPARK_UNIT_TESTS")
+
+    command = [os.path.join(SPARK_HOME, "python", "run-tests-with-coverage")]
+    if test_modules != [modules.root]:
+        command.append("--modules=%s" % ','.join(m.name for m in test_modules))
+    command.append("--parallelism=%i" % parallelism)
+    run_cmd(command)
+    post_python_tests_results()
+
+
+def post_python_tests_results():
+    if "SPARK_TEST_KEY" not in os.environ:
+        print("[error] 'SPARK_TEST_KEY' environment variable was not set. Unable to post"
+              "PySpark coverage results.")
+        sys.exit(1)
```
--- End diff --

sure, i can do that tomorrow (currently heading out for the day).
[GitHub] spark pull request #23117: [WIP][SPARK-7721][INFRA] Run and generate test co...
Github user shaneknapp commented on a diff in the pull request: https://github.com/apache/spark/pull/23117#discussion_r236440252

--- Diff: dev/run-tests.py ---
```
@@ -434,6 +434,63 @@ def run_python_tests(test_modules, parallelism):
     run_cmd(command)


+def run_python_tests_with_coverage(test_modules, parallelism):
+    set_title_and_block("Running PySpark tests with coverage report", "BLOCK_PYSPARK_UNIT_TESTS")
+
+    command = [os.path.join(SPARK_HOME, "python", "run-tests-with-coverage")]
+    if test_modules != [modules.root]:
+        command.append("--modules=%s" % ','.join(m.name for m in test_modules))
+    command.append("--parallelism=%i" % parallelism)
+    run_cmd(command)
+    post_python_tests_results()
+
+
+def post_python_tests_results():
+    if "SPARK_TEST_KEY" not in os.environ:
+        print("[error] 'SPARK_TEST_KEY' environment variable was not set. Unable to post"
+              "PySpark coverage results.")
+        sys.exit(1)
```
--- End diff --

actually, i do agree w/you @squito ... we need to make sure that the test running code works both in and out of our jenkins environment.
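One way to make the hunk under discussion behave outside Jenkins is to skip the upload rather than kill the whole test run; this is a sketch of that idea under the assumption that local runs simply don't post coverage, not the code that was ultimately merged:

```python
import os

def post_python_tests_results():
    # Outside Jenkins there is no SPARK_TEST_KEY, so skip the coverage
    # upload instead of aborting the entire test run with sys.exit(1).
    if "SPARK_TEST_KEY" not in os.environ:
        print("[warn] 'SPARK_TEST_KEY' not set; skipping PySpark coverage "
              "upload (expected for local runs).")
        return False
    # ...the real upload logic would go here, guarded by the key...
    return True
```

The trade-off is that a misconfigured Jenkins worker silently skips the upload instead of failing loudly, which is why the env check belongs wherever the Jenkins-only code path is selected.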
[GitHub] spark issue #23117: [WIP][SPARK-7721][INFRA] Run and generate test coverage ...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23117 not yet, but i will carve out some time today and wednesday to look closer.
[GitHub] spark issue #23117: [WIP][SPARK-7721][INFRA] Run and generate test coverage ...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23117 i'll try and take a look at this over the next couple of days, but it's a holiday weekend and i may not be able to get to this until monday.
[GitHub] spark issue #22994: [BUILD] refactor dev/lint-python in to something readabl...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22994 gonna hold off on backporting until i inspect each branch independently.
[GitHub] spark issue #22994: [BUILD] refactor dev/lint-python in to something readabl...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22994 okie dokie... this will be my first official push to the spark repo! :)
[GitHub] spark issue #22994: [BUILD] refactor dev/lint-python in to something readabl...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22994 howdy howdy! i opened this nearly 2 weeks ago, and was wondering if i could get another set of eyeballs on it... @holdenk @srowen @felixcheung
[GitHub] spark issue #23061: [SPARK-26095][build] Disable parallelization in make-dis...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23061 thanks @vanzin
[GitHub] spark issue #23061: [SPARK-26095][build] Disable parallelization in make-dis...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23061

> BTW noticed this:
>
> ```
> Step 6/9 : COPY R ${SPARK_HOME}/R
> COPY failed: stat /var/lib/docker/tmp/docker-builder004084591/R: no such file or directory
> Failed to build SparkR Docker image, please refer to Docker build output for details.
> ```
>
> Are the R profile and the R integration tests intentionally disabled? `KubernetesSuite` doesn't mix in `RTestsSuite`, which is why it hasn't failed...

yeah, intentionally disabled.
[GitHub] spark issue #23061: [SPARK-26095][build] Disable parallelization in make-dis...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23061 edit the subject, add [k8s], then ask jenkins to trigger a build.
[GitHub] spark issue #23017: [SPARK-26015][K8S] Set a default UID for Spark on K8S Im...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23017 btw i wiped all of my .ivy2 and .m2 dirs before building, just in case we're looking at a poisoned artifact.
[GitHub] spark issue #23017: [SPARK-26015][K8S] Set a default UID for Spark on K8S Im...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23017

> manually on a different machine or just on the workers?

on a different machine (your laptop, something local, etc). i'm trying on a couple of different workers and it's always hanging @ that exact step. :\ nothing has been updated, system-wise, on ANY of the workers. i think this might be a problem w/making the dist and am not looking forward to any git archaeology to find the broken change. :(
[GitHub] spark issue #23017: [SPARK-26015][K8S] Set a default UID for Spark on K8S Im...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23017 @vanzin @ifilonenko can you guys try and build the dist locally on your dev laptops? here's a little wrapper script to make it easier: you'll need to update your PATH to have some version of python3 installed (tho i don't actually think it's necessary), as well as JAVA_HOME... might also need zinc in there as well.

```
#!/bin/bash

rm -f spark-*.tgz

export DATE=`date "+%Y%m%d"`
export REVISION=`git rev-parse --short HEAD`
export AMPLAB_JENKINS=1
export PATH="$PATH:/home/anaconda/envs/py3k/bin"

# Prepend JAVA_HOME/bin to fix issue where Zinc's embedded SBT incremental compiler seems to
# ignore our JAVA_HOME and use the system javac instead.
export PATH="$JAVA_HOME/bin:$PATH:/usr/local/bin"

# Generate random port for Zinc
export ZINC_PORT
ZINC_PORT=$(python -S -c "import random; print random.randrange(3030,4030)")

export SBT_OPTS="-Duser.home=$HOME -Dsbt.ivy.home=$HOME/.ivy2"
export SPARK_VERSIONS_SUITE_IVY_PATH="$HOME/.ivy2"

./dev/make-distribution.sh --name ${DATE}-${REVISION} --pip --tgz -DzincPort=${ZINC_PORT} \
  -Phadoop-2.7 -Pkubernetes -Pkinesis-asl -Phive -Phive-thriftserver

retcode=$?
exit $retcode
```
[GitHub] spark issue #23017: [SPARK-26015][K8S] Set a default UID for Spark on K8S Im...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23017 yep it gets to the same spot when i try and build manually, and fails:

```
[INFO] --- maven-source-plugin:3.0.1:test-jar-no-fork (create-source-jar) @ spark-mllib-local_2.12 ---
[INFO] Building jar: /home/eecs/sknapp/src/spark/mllib-local/target/spark-mllib-local_2.12-3.0.0-SNAPSHOT-test-sources.jar
```
[GitHub] spark issue #23017: [SPARK-26015][K8S] Set a default UID for Spark on K8S Im...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23017 @vanzin i'm really not sure what's going on w/this. i noticed it happening on research-jenkins-worker-07 yesterday, so i rebooted the box and that seemed to fix it. now it's back, and happening on ALL of the workers. at least we know it's not something up w/the systems, but when it builds the dist. i'll try and build the dist manually on the workers and see what happens as well.
[GitHub] spark issue #23026: [SPARK-25960][k8s] Support subpath mounting with Kuberne...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23026

> > if such a list exists it should be the same list that triggers regular tests.
>
> I defer that to @shaneknapp

no, @vanzin is right. i'll update that tomorrow. @vanzin for historical knowledge: once i get spark ported to ubuntu (literally down to one or two troublesome builds! such closeness!), the k8s prb will be merged in to the regular spark prb.
[GitHub] spark issue #22911: [SPARK-25815][k8s] Support kerberos in client mode, keyt...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22911 test this please
[GitHub] spark issue #22967: [SPARK-25956] Make Scala 2.12 as default Scala version i...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22967 @dongjoon-hyun not a problem, i'll need to update the build config(s). what branches will need 2.12 vs 2.11?
[GitHub] spark issue #23012: [SPARK-26014][R] Deprecate R prior to version 3.4 in Spa...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23012 @felixcheung @HyukjinKwon yes: deprecation in this case means we test against R-3.1.1
[GitHub] spark issue #23012: [SPARK-26014][R] Deprecate R prior to version 3.4 in Spa...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23012

TL;DR: let's go w/deprecation.

still TL;DR: if i never have to install or manage R again, i will be a happy person!

@HyukjinKwon upgrading R is easy. getting the right mix of R and all of the associated packages working "as expected" is a nightmare. the biggest problem i foresee is if we upgrade R (and all other packages) on the workers, every version of spark will be tested against this... and there will be bugs, test failures, and other time consuming (and obtuse) problems to debug. multiply this by every branch, and you can see the rabbit hole you've just entered.

for example, a month ago when i finally had time to dive back in to the ubuntu port, after finally figuring out how to install R+friends on ubuntu in an identical way to the centos workers, i STILL was finding problems w/lintr (see: https://github.com/apache/spark/pull/22896).

anyways: i'm more than happy to upgrade R and all the packages to something much more recent, but i will definitely appreciate some help in the game of test-failure whack-a-mole.
[GitHub] spark issue #22994: [BUILD] refactor dev/lint-python in to something readabl...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22994 nah, it's not urgent at all. i also tested this locally by breaking various python scripts and confirming that it caught errors as expected.

On Tue, Nov 13, 2018 at 5:03 PM Hyukjin Kwon wrote:
> I haven't taken a look super closely but the idea looks itself okay. Is it urgent? if yes, yup. I don't object to go ahead right away. Otherwise, might be good to leave open for few days for review comments ..
>
> Let me leave some cc's for @srowen <https://github.com/srowen>, @felixcheung <https://github.com/felixcheung>, @holdenk <https://github.com/holdenk> ..
[GitHub] spark issue #22994: [BUILD] refactor dev/lint-python in to something readabl...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22994 (weird, github ate my last comment) @HyukjinKwon i think we're g2g for merging this into master and backports. you want to do this, or should i?
[GitHub] spark issue #23019: [SPARK-26025][k8s] Speed up docker image build on dev re...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23019 test this please
[GitHub] spark issue #23012: [SPARK-26014][R] Deprecate R prior to version 3.4 in Spa...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23012 howdy howdy! unless we dockerize spark builds (someday!), we're going to be stuck w/testing against one version of R on the jenkins workers... i've been looking into packrat to help manage packages, but having more than one version of R will require me manually building and disting it out. and i really, truly, don't want to do that. let me know how you think i should proceed.
[GitHub] spark issue #23019: [SPARK-26025][k8s] Speed up docker image build on dev re...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23019 ok, i think i fixed the problem w/the jenkins config. test this please
[GitHub] spark issue #23019: [SPARK-26025][k8s] Speed up docker image build on dev re...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23019 test this please
[GitHub] spark issue #23019: [SPARK-26025][k8s] Speed up docker image build on dev re...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/23019 test this please
[GitHub] spark issue #22994: [BUILD] refactor dev/lint-python in to something readabl...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22994 alright, i think we're g2g. i'll squash my commits now before merging.
[GitHub] spark issue #22994: [BUILD] refactor dev/lint-python in to something readabl...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22994 test this please
[GitHub] spark issue #22994: [BUILD] refactor dev/lint-python in to something readabl...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22994 test this please
[GitHub] spark issue #22994: [BUILD] refactor dev/lint-python in to something readabl...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22994 test this please
[GitHub] spark issue #22963: [SPARK-25962][BUILD][PYTHON] Specify minimum versions fo...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22963 re https://github.com/apache/spark/pull/22963#issuecomment-437133365 i checked, and the only one we can seemingly download independently is pycodestyle.
[GitHub] spark issue #22994: [BUILD] refactor dev/lint-python in to something readabl...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22994 i have no idea why the tests aren't passing btw. :\
[GitHub] spark issue #22994: [BUILD] refactor dev/lint-python in to something readabl...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22994 test this please
[GitHub] spark issue #22994: [BUILD] refactor dev/lint-python in to something readabl...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22994 test this please
[GitHub] spark issue #22994: [BUILD] refactor dev/lint-python in to something readabl...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22994 test this please
[GitHub] spark issue #22994: [BUILD] refactor dev/lint-python in to something readabl...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22994 see the output from the following build to get the gory details of what's happening: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/98664/console i removed `set -x` from the top to clean up the output in the build logs.
[GitHub] spark issue #22994: [BUILD] refactor dev/lint-python in to something readabl...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22994 test this please
[GitHub] spark issue #22994: [BUILD] refactor dev/lint-python in to something readabl...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22994 test this please
[GitHub] spark issue #22994: [BUILD] refactor dev/lint-python in to something readabl...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22994 test this please
[GitHub] spark issue #22963: [SPARK-25962][BUILD][PYTHON] Specify minimum versions fo...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22963 orthogonal to this PR, but just as an FYI: https://github.com/apache/spark/pull/22994
[GitHub] spark pull request #22994: this is serious refactor
GitHub user shaneknapp opened a pull request: https://github.com/apache/spark/pull/22994 this is serious refactor

## What changes were proposed in this pull request?

`dev/lint-python` is a mess of nearly unreadable bash. i would like to fix that as best as i can.

## How was this patch tested?

the build system will test this. DO NOT MERGE UNTIL I GIVE THE ALL-CLEAR AS I WILL HAVE DEBUGGING CODE IN THE SCRIPT!

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/shaneknapp/spark lint-python-refactor

Alternatively you can review and apply these changes as the patch at: https://github.com/apache/spark/pull/22994.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #22994

commit 7977751e1a474968da63aea6b0616e9ab083cb5c
Author: shane knapp
Date: 2018-11-09T19:18:37Z

    this is serious refactor
[GitHub] spark issue #22963: [SPARK-25962][BUILD][PYTHON] Specify minimum versions fo...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22963

> I don't have the context here to have a strong opinion, but, it seems like we should make the test env setup self-contained if possible, to avoid dependencies on and maintenance of the build env. After all others need to run these tests too. To that end, downloading and installing particular versions seems reasonable (if they're not already installed at the right version?) Is that hard?

it *should* be reasonable to download a binary as we do for pycodestyle. i'll look around and see what i can find.

> Whatever makes this more robust and needs less maintenance from you sounds good. I think you're welcome to clean up the script as you like too.

i will absolutely be taking a long, hard look at lint-python...
[GitHub] spark issue #22963: [SPARK-25962][BUILD][PYTHON] Specify minimum versions fo...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22963 pydocstyle tests passed w/o issue btw: https://amplab.cs.berkeley.edu/jenkins/job/ubuntuSparkPRB/134/consoleFull this is on ubuntu w/python 3.5, flake8 3.6.0, pydocstyle 3.0.0 and pycodestyle 2.4.0. i also updated all of the centos jenkins workers to have flake8 3.6.0 and pycodestyle 2.4.0.
[GitHub] spark issue #22963: [SPARK-25962][BUILD][PYTHON] Specify minimum versions fo...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22963

interesting. from the flake8 webpage:

```
It is very important to install Flake8 on the correct version of Python for your needs.
If you want Flake8 to properly parse new language features in Python 3.5 (for example),
you need it to be installed on 3.5 for Flake8 to understand those features.
In many ways, Flake8 is tied to the version of Python on which it runs.
```

we're testing flake8 w/python 3.4.5 on the centos workers, and soon to be python 3.5 everywhere. this means that flake8 probably hasn't been behaving properly since the get-go. also, this PR sets the flake8 version to 3.6, which means we should be testing against python 3.6... i was only planning on bumping the python version from 3.4 -> 3.5, however. what's *also* confusing is that there are versions of flake8 3.6.0 for python 2.7, 3.6 and 3.7. i think we have some serious, and heretofore unknown, version incompatibilities in our python testing environment.

what i'm going to do/test:
* pin flake8 to version 3.5.0, and the python2.7 testing environment.
* pin pycodestyle to 3.6.0, again in the python2.7 testing environment.
* leave sphinx-build in the python 3.{4,5} environment (as it always has been)
* install pydocstyle in a test env and see what breaks
[GitHub] spark issue #22963: [SPARK-25962][BUILD][PYTHON] Specify minimum versions fo...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22963 ok, i missed that previous comment about flake8 and pycodestyle version incompatibility. since we're running flake8 3.5.0, i agree that this could be causing problems w/pycodestyle running properly:

```
flake8 3.5.0 has requirement pycodestyle<2.4.0,>=2.0.0, but you'll have pycodestyle 2.4.0 which is incompatible.
```

i'll bump flake8 to 3.6 on all of the workers, and then bump pycodestyle to 2.4.0
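The clash here is mechanical: flake8 3.5.0 declares `pycodestyle<2.4.0,>=2.0.0`, and 2.4.0 falls outside that range. A toy checker (nothing like pip's real resolver, just tuple comparison over the two operators involved) makes the point:

```python
def parse_version(s):
    # "2.4.0" -> (2, 4, 0), so versions compare correctly as tuples
    return tuple(int(p) for p in s.split("."))

def satisfies(version, spec):
    # Check a version against a pip-style spec like "<2.4.0,>=2.0.0".
    # Only the two operators needed here are handled; a toy, not pip.
    v = parse_version(version)
    for clause in spec.split(","):
        if clause.startswith(">="):
            if not v >= parse_version(clause[2:]):
                return False
        elif clause.startswith("<"):
            if not v < parse_version(clause[1:]):
                return False
    return True

# flake8 3.5.0's requirement rejects pycodestyle 2.4.0 but accepts 2.3.1:
print(satisfies("2.4.0", "<2.4.0,>=2.0.0"))  # False
print(satisfies("2.3.1", "<2.4.0,>=2.0.0"))  # True
```

So the two consistent pairings are flake8 3.5.0 + pycodestyle 2.3.1, or flake8 3.6.x + pycodestyle 2.4.0, which is the combination chosen above.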
[GitHub] spark issue #22963: [SPARK-25962][BUILD][PYTHON] Specify minimum versions fo...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22963 sorry to jump in late on this, but i just wanted to check in on some of this stuff... `dev/lint-python` is a nightmare and i just wanted to discuss a couple of things:

1) the script downloads pycodestyle if it's not the right version. however, flake8, pycodestyle and sphinx are all installed and managed by the build system (see versions @ the end of this comment). i'll update all of the workers (and the ansible) to install the right version(s) of things as found in the script today.
2) in addition to (1), i'd like to do one of two things: either remove the pycodestyle download step, OR add installation steps for flake8, pydocstyle and sphinx.
3) we currently don't have pydocstyle installed on any of the workers. i am more than happy to install it immediately, but am concerned about build breakages as we've never tested this before.
4) holy crap `dev/lint-python` makes my eyes bleed!

currently installed packages + versions:

```
-bash-4.1$ which pycodestyle && pycodestyle --version
/home/anaconda/bin/pycodestyle
2.3.1
-bash-4.1$ which flake8 && flake8 --version
/home/anaconda/bin/flake8
3.5.0 (mccabe: 0.6.1, pycodestyle: 2.3.1, pyflakes: 1.6.0) CPython 2.7.13 on Linux
-bash-4.1$ which sphinx-build && sphinx-build --version
/home/anaconda/envs/py3k/bin/sphinx-build
Sphinx (sphinx-build) 1.2.3
```
[GitHub] spark issue #22931: [SPARK-25930][K8s] Fix scala string detection in k8s tes...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22931 sorry, ignore amplab's report. the build passed, but my hacking on the integration test reports was what caused the failure.
[GitHub] spark issue #22931: [SPARK-25930][K8s] Fix scala string detection in k8s tes...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22931 btw next k8s test that runs will actually have logs! here's the integration test log from this run, which wasn't archived... [integration-tests.log](https://github.com/apache/spark/files/2544075/integration-tests.log)
[GitHub] spark issue #22931: [SPARK-25930][K8s] Fix scala string detection in k8s tes...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22931 test this please
[GitHub] spark issue #22145: [SPARK-25152][K8S] Enable SparkR Integration Tests for K...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22145 closer. just started working again on the ubuntu port about a week ago. hopefully before EOY. On Thu, Nov 1, 2018 at 10:41 AM mccheah wrote: > Just wanted to ping on this - how close are we to getting this working on > CI? > > — > You are receiving this because you were mentioned. > Reply to this email directly, view it on GitHub > <https://github.com/apache/spark/pull/22145#issuecomment-435119399>, or mute > the thread > <https://github.com/notifications/unsubscribe-auth/ABiDrOTtfONuGh4jmTX5b_sOx7zSwKpzks5uqzJWgaJpZM4WDIaa> > . >
[GitHub] spark issue #22896: [SPARKR]found some extra whitespace in the R tests
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22896 actually can we merge this? it's causing spurious lintr errors.
[GitHub] spark issue #22896: [SPARKR]found some extra whitespace in the R tests
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22896 thanks @HyukjinKwon !
[GitHub] spark issue #22896: [SPARKR]found some extra whitespace in the R tests
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22896 appveyor timeout #fml
[GitHub] spark issue #22896: [SPARKR]found some extra whitespace in the R tests
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22896 interesting... still get the `StartTag` error in the build log:

```
Error: StartTag: invalid element name [68]
Execution halted
```

it's orthogonal to this PR, but i'll put some time in today and see if i can't figure out what's up.
[GitHub] spark issue #22896: [SPARKR]found some extra whitespace in the R tests
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22896 i am confused, though, as to why this wasn't caught during the PR that added this file: https://github.com/apache/spark/pull/22455 on the centos workers, there was an error message in the PRB builds for that PR, but it didn't stop execution:

```
Error: StartTag: invalid element name [68]
```

however, it *was* caught on the ubuntu workers. both centos and ubuntu are running the same version of lintr and testthat. centos:

```
[sknapp@amp-jenkins-worker-04 ~]$ Rscript -e "packageVersion('lintr')"
[1] ‘1.0.0.9001’
[sknapp@amp-jenkins-worker-04 ~]$ Rscript -e "packageVersion('testthat')"
[1] ‘1.0.2’
```

ubuntu:

```
sknapp@amp-jenkins-staging-worker-02:~$ Rscript -e "packageVersion('lintr')"
[1] ‘1.0.0.9001’
sknapp@amp-jenkins-staging-worker-02:~$ Rscript -e "packageVersion('testthat')"
[1] ‘1.0.2’
```

i'll investigate further, but ¯\_(ツ)_/¯
[GitHub] spark pull request #22896: found some extra whitespace in the R tests
GitHub user shaneknapp opened a pull request: https://github.com/apache/spark/pull/22896 found some extra whitespace in the R tests

## What changes were proposed in this pull request?

during my ubuntu-port testing, i found some extra whitespace that for some reason wasn't getting caught on the centos lint-r build step.

## How was this patch tested?

the build system will test this! i used one of my ubuntu testing builds and scped over the modified file. before my fix: https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-sbt-hadoop-2.7-ubuntu-testing/22/console after my fix: https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-sbt-hadoop-2.7-ubuntu-testing/23/console

You can merge this pull request into a Git repository by running: $ git pull https://github.com/shaneknapp/spark remove-extra-whitespace Alternatively you can review and apply these changes as the patch at: https://github.com/apache/spark/pull/22896.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #22896

commit f0fea61d56cbafe57a720b2a3a5f9601d80abea3 Author: shane knapp Date: 2018-10-30T17:56:41Z found some extra whitespace in the R tests
[GitHub] spark issue #22146: [SPARK-24434][K8S] pod template files
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22146 test this please
[GitHub] spark issue #22146: [SPARK-24434][K8S] pod template files
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22146 @mccheah fixed. for some reason, and on this build node only, that dir owner was set to root:root. this is fixed.
[GitHub] spark issue #22824: [SPARK-25834] [Structured Streaming]Update Mode should n...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22824 ok to test
[GitHub] spark issue #22824: [SPARK-25834] [Structured Streaming]Update Mode should n...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22824 test this please
[GitHub] spark issue #22854: [SPARK-25854][BUILD] fix `build/mvn` not to fail during ...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22854 yeah... wasn't worried about the PRB failing. thanks for the merge/backport!
[GitHub] spark issue #22854: [SPARK-25854][BUILD] fix `build/mvn` not to fail during ...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22854 build is green, and everything looks to be behaving normally! https://amplab.cs.berkeley.edu/jenkins/job/sknapp-testing-spark-branch-2.4-test-maven-hadoop-2.7/12/console ready to merge and backport (@srowen could you help out w/this one?)
[GitHub] spark issue #22854: [SPARK-25854][BUILD] fix `build/mvn` not to fail during ...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22854 > Merged build finished. Test FAILed. that was me killing the 1st PRB build
[GitHub] spark pull request #22854: [SPARK-25854][BUILD] fix `build/mvn` not to fail ...
Github user shaneknapp commented on a diff in the pull request: https://github.com/apache/spark/pull/22854#discussion_r228616782 --- Diff: build/mvn --- @@ -163,8 +163,14 @@ export MAVEN_OPTS=${MAVEN_OPTS:-"$_COMPILE_JVM_OPTS"} echo "Using \`mvn\` from path: $MVN_BIN" 1>&2 -# Last, call the `mvn` command as usual +# call the `mvn` command as usual "${MVN_BIN}" -DzincPort=${ZINC_PORT} "$@" +MVN_RETCODE=$? -# Try to shut down zinc explicitly -"${ZINC_BIN}" -shutdown -port ${ZINC_PORT} +# SPARK-25854 +# Try to shut down zinc explicitly if the server is still running. if it's not running, +# it's timed out and we'll still need to exit the script w/a 0 to keep the build from +# failing. +"${ZINC_BIN}" -shutdown -port ${ZINC_PORT} || true --- End diff -- alright, i tested the script w/a `false` call in the `mvn` helper script and nothing broke. pushed a change removing `|| true`.
[GitHub] spark issue #22854: [SPARK-25854][BUILD] fix `build/mvn` not to fail during ...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22854 let's not merge until this build passes: https://amplab.cs.berkeley.edu/jenkins/job/sknapp-testing-spark-branch-2.4-test-maven-hadoop-2.7/12/
[GitHub] spark pull request #22854: [SPARK-25854] fix mvn to not always exit 1
Github user shaneknapp commented on a diff in the pull request: https://github.com/apache/spark/pull/22854#discussion_r228607477 --- Diff: build/mvn --- @@ -163,8 +163,14 @@ export MAVEN_OPTS=${MAVEN_OPTS:-"$_COMPILE_JVM_OPTS"} echo "Using \`mvn\` from path: $MVN_BIN" 1>&2 -# Last, call the `mvn` command as usual +# call the `mvn` command as usual "${MVN_BIN}" -DzincPort=${ZINC_PORT} "$@" +MVN_RETCODE=$? -# Try to shut down zinc explicitly -"${ZINC_BIN}" -shutdown -port ${ZINC_PORT} +# SPARK-25854 +# Try to shut down zinc explicitly if the server is still running. if it's not running, +# it's timed out and we'll still need to exit the script w/a 0 to keep the build from +# failing. +"${ZINC_BIN}" -shutdown -port ${ZINC_PORT} || true --- End diff -- we probably don't need it, but i am more than comfortable keeping it in there #justincase
[GitHub] spark issue #22854: [SPARK-25854] fix mvn to not always exit 1
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22854 `mvn clean package` script logic worked!

```
[INFO]
[INFO] BUILD SUCCESS
[INFO]
[INFO] Total time: 15:11 min
[INFO] Finished at: 2018-10-26T10:01:13-07:00
[INFO]
+ MVN_RETCODE=0
+ /home/jenkins/workspace/sknapp-testing-spark-branch-2.4-test-maven-hadoop-2.7/build/zinc-0.3.15/bin/zinc -shutdown -port 3087
+ exit 0 <-- this is from "exit $MVN_RETCODE"
+ retcode1=0
```
[GitHub] spark issue #22854: [SPARK-25854] fix mvn to not always exit 1
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22854 here's the test build run that's actually testing the changes to mvn: https://amplab.cs.berkeley.edu/jenkins/job/sknapp-testing-spark-branch-2.4-test-maven-hadoop-2.7/12/
[GitHub] spark issue #22854: [SPARK-25854] fix mvn to not always exit 1
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22854 test this please
[GitHub] spark pull request #22854: [SPARK-25854] fix mvn to not always exit 1
Github user shaneknapp commented on a diff in the pull request: https://github.com/apache/spark/pull/22854#discussion_r228591941 --- Diff: build/mvn --- @@ -163,8 +163,19 @@ export MAVEN_OPTS=${MAVEN_OPTS:-"$_COMPILE_JVM_OPTS"} echo "Using \`mvn\` from path: $MVN_BIN" 1>&2 -# Last, call the `mvn` command as usual +# call the `mvn` command as usual "${MVN_BIN}" -DzincPort=${ZINC_PORT} "$@" -# Try to shut down zinc explicitly -"${ZINC_BIN}" -shutdown -port ${ZINC_PORT} +# check to see if zinc server is still running post-build +"${ZINC_BIN}" -status -port ${ZINC_PORT} &> /dev/null +ZINC_STATUS=$? + +# Try to shut down zinc explicitly if the server is still running --- End diff -- now that i'm a couple cups of coffee in to my morning, i'm actually back to thinking that this might be the most elegant way of dealing w/this. i'll update the PR to do just this, and include a comment describing why we're doing it.
[GitHub] spark pull request #22854: [SPARK-25854] fix mvn to not always exit 1
Github user shaneknapp commented on a diff in the pull request: https://github.com/apache/spark/pull/22854#discussion_r228584294 --- Diff: build/mvn --- @@ -163,8 +163,19 @@ export MAVEN_OPTS=${MAVEN_OPTS:-"$_COMPILE_JVM_OPTS"} echo "Using \`mvn\` from path: $MVN_BIN" 1>&2 -# Last, call the `mvn` command as usual +# call the `mvn` command as usual "${MVN_BIN}" -DzincPort=${ZINC_PORT} "$@" -# Try to shut down zinc explicitly -"${ZINC_BIN}" -shutdown -port ${ZINC_PORT} +# check to see if zinc server is still running post-build +"${ZINC_BIN}" -status -port ${ZINC_PORT} &> /dev/null +ZINC_STATUS=$? + +# Try to shut down zinc explicitly if the server is still running +if [ $ZINC_STATUS -eq 0 ]; then + # zinc is still running! + "${ZINC_BIN}" -shutdown -port ${ZINC_PORT} + exit 0 --- End diff -- i put the else in there for clarity, and in case we want to ever do something (like report that zinc timed out etc) if the exit code on the status is 1. ¯\_(ツ)_/¯ i'm also not a fan of putting `exit 0` at the end of any bash script, anywhere, ever.
[GitHub] spark pull request #22854: [SPARK-25854] fix mvn to not always exit 1
GitHub user shaneknapp opened a pull request: https://github.com/apache/spark/pull/22854 [SPARK-25854] fix mvn to not always exit 1

## What changes were proposed in this pull request?

the final line in the mvn helper script in build/ attempts to shut down the zinc server. due to the zinc server being set up w/a 30min timeout, by the time the mvn test instantiation finishes, the server times out. this means that when the mvn script tries to shut down zinc, it returns w/an exit code of 1. this will then automatically fail the entire build (even if the build passes).

## How was this patch tested?

i set up a test build: https://amplab.cs.berkeley.edu/jenkins/job/sknapp-testing-spark-branch-2.4-test-maven-hadoop-2.7/

You can merge this pull request into a Git repository by running: $ git pull https://github.com/shaneknapp/spark fix-mvn-helper-script Alternatively you can review and apply these changes as the patch at: https://github.com/apache/spark/pull/22854.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #22854

commit 2f0a91b64ee312bb5fb33885118afac4b7d0dc91 Author: shane knapp Date: 2018-10-26T16:01:30Z fix mvn to not always exit 1
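The failure mode described in the PR body boils down to a shell script's exit status being taken from its last command (the zinc shutdown) rather than from maven itself. A stripped-down sketch of the pattern, with stand-in functions in place of the real `mvn` and zinc binaries:

```shell
#!/usr/bin/env bash
# stand-ins for the real commands in build/mvn (illustrative only):
run_mvn() { return 0; }        # the maven build itself -- succeeds here
zinc_shutdown() { return 1; }  # zinc timed out 30 minutes in, so shutdown fails

run_mvn
MVN_RETCODE=$?                 # capture the build result before anything else runs

# attempt the shutdown, but never let its failure mask the build result
zinc_shutdown || true

echo "build exit code: $MVN_RETCODE"
```

without the `|| true` (or an explicit `exit $MVN_RETCODE` at the end, which is what the final patch used), the failed shutdown becomes the script's exit status and jenkins marks a green build red.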
[GitHub] spark issue #22501: [SPARK-25492][TEST] Refactor WideSchemaBenchmark to use ...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22501 @cloud-fan -- pip isn't broken... the actual error is found right above what you cut and pasted: `UnicodeDecodeError: 'ascii' codec can't decode byte 0xc2 in position 2719: ordinal not in range(128)` i won't be able to look any deeper into this until at least tomorrow at the earliest.
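For context on that error class: it shows up when non-ascii bytes (0xc2 is the lead byte of a utf-8 sequence) are decoded with python's default ascii codec. A tiny reproduction of the error shape only; the byte string below is made up and is not the file pip actually choked on:

```shell
# illustrative repro of the UnicodeDecodeError shape from the build log
python3 - <<'EOF'
data = b"caf\xc3\xa9"          # utf-8 bytes for "cafe" + accent; 0xc3 is non-ascii
try:
    data.decode("ascii")
except UnicodeDecodeError as e:
    print("ascii codec failed at byte position", e.start)
EOF
```

the usual fixes are forcing a utf-8 locale (`LC_ALL`/`LANG`) in the build environment or decoding with an explicit encoding instead of relying on the ascii default.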
[GitHub] spark issue #22748: [SPARK-25745][K8S] Improve docker-image-tool.sh script
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22748 k8s tests are triggered by the subject of the PR. if the test isn't being triggered (which appears to be the case here), then it will require a jenkins restart.
[GitHub] spark issue #22615: [SPARK-25016][BUILD][CORE] Remove support for Hadoop 2.6
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22615 also, i will be deleting the following jobs:

* https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test/job/spark-master-test-sbt-hadoop-2.6/
* https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test/job/spark-master-test-maven-hadoop-2.6/
* https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Compile/job/spark-master-compile-maven-hadoop-2.6/
[GitHub] spark issue #22615: [SPARK-25016][BUILD][CORE] Remove support for Hadoop 2.6
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22615 ok just to revisit this: i'm going to push out the new jenkins jobs configs now, and not gate on moving these to the spark repo.
[GitHub] spark issue #22615: [SPARK-25016][BUILD][CORE] Remove support for Hadoop 2.6
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22615 i haven't had a chance to do any of the jenkins stuff... after being sidetracked by the conversation to move the configs to the spark repo, plus planning for our big event that starts tomorrow, plus zomgmeetings all day today, work won't be able to start until early next week.
[GitHub] spark issue #22615: [SPARK-25016][BUILD][CORE] Remove support for Hadoop 2.6
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22615 > I want to see the configurations .. they're just some absolutely breath-taking bits of yaml to define all of the spark jenkins jobs. aka: really nothing that exciting. ;)
[GitHub] spark issue #22615: [SPARK-25016][BUILD][CORE] Remove support for Hadoop 2.6
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22615 @srowen fair 'nuf... i'll create a jira for this tomorrow and we can hash out final design shite there (rather than overloading this PR). :)
[GitHub] spark issue #22615: [SPARK-25016][BUILD][CORE] Remove support for Hadoop 2.6
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22615 @vanzin i'm not opposed to hosting these configs somewhere else. @JoshRosen did this a few years back just to "get shit done"... i'd be leery of putting this into the main spark repo, however, as only a very, very, very small subset of people (consisting mostly of myself) should actually ever touch this stuff.
[GitHub] spark issue #22615: [SPARK-25016][BUILD][CORE] Remove support for Hadoop 2.6
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22615 https://github.com/databricks/spark-jenkins-configurations/pull/47
[GitHub] spark issue #22615: [SPARK-25016][BUILD][CORE] Remove support for Hadoop 2.6
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22615 @srowen sure, manually removing the failing jobs is one option... but since we auto-generate the job configs, any time we add a new branch they'll come back. i'd much rather do this the right way. :)
[GitHub] spark issue #22615: [SPARK-25016][BUILD][CORE] Remove support for Hadoop 2.6
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22615 consider me pinged. ;) i will need to do some refactoring in the jenkins job builder configs for this, so we'll definitely need to coordinate before this is merged. most likely i won't have much time until next week (risecamp will be taking all of my time wed-fri), but i'll see if i can't at least get an initial PR on this stuff by EOD tomorrow (oct 9th). @JoshRosen for a heads up on the forthcoming job config builder changes.
[GitHub] spark issue #22145: [SPARK-25152][K8S] Enable SparkR Integration Tests for K...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22145 yes, hopefully soon. i won't be able to start on this for at least another week due to our lab having a big event this coming week. On Sat, Oct 6, 2018 at 5:55 PM Felix Cheung wrote: > @shaneknapp <https://github.com/shaneknapp> could we do this soon? > > — > You are receiving this because you were mentioned. > Reply to this email directly, view it on GitHub > <https://github.com/apache/spark/pull/22145#issuecomment-427617039>, or mute > the thread > <https://github.com/notifications/unsubscribe-auth/ABiDrLNFewd2BvHoSWBI4X791ITbTu_Zks5uiVEegaJpZM4WDIaa> > . >
[GitHub] spark issue #22061: [SPARK-25079][PYTHON] preparing for python 3.5 bump
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22061 no, not yet. i'll give a bigger update later today on the status of this, but the TL;DR is that it's a bigger project than i expected. :\ On Mon, Sep 24, 2018 at 9:18 AM, Sean Owen wrote: > @shaneknapp <https://github.com/shaneknapp> should this go in now? > replacing 3.4? > > — > You are receiving this because you were mentioned. > Reply to this email directly, view it on GitHub > <https://github.com/apache/spark/pull/22061#issuecomment-424033759>, or mute > the thread > <https://github.com/notifications/unsubscribe-auth/ABiDrI0nMUTlFHKimrukHjoO-NM5kiAEks5ueQXXgaJpZM4V2HgI> > . >
[GitHub] spark issue #22333: [SPARK-25335][BUILD] Skip Zinc downloading if it's insta...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22333 moving any parts of the spark build infrastructure to use docker is a big project and not happening in the next few months.
[GitHub] spark pull request #22266: [SPARK-25270] lint-python: Add flake8 to find syn...
Github user shaneknapp commented on a diff in the pull request: https://github.com/apache/spark/pull/22266#discussion_r214117901 --- Diff: dev/lint-python --- @@ -82,6 +82,26 @@ else rm "$PYCODESTYLE_REPORT_PATH" fi +python -m pip install flake8 --- End diff -- flake8 is installed on all centos and ubuntu workers.
[GitHub] spark pull request #22266: [SPARK-25270] lint-python: Add flake8 to find syn...
Github user shaneknapp commented on a diff in the pull request: https://github.com/apache/spark/pull/22266#discussion_r214114395 --- Diff: dev/lint-python --- @@ -82,6 +82,26 @@ else rm "$PYCODESTYLE_REPORT_PATH" fi +python -m pip install flake8 --- End diff -- yep. please remove this line, and i can ensure that it's installed on the workers.
[GitHub] spark issue #22257: [SPARK-25264][K8S] Fix comma-delineated arguments passed...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22257 test this please
[GitHub] spark issue #22257: [SPARK-25264][K8S] Fix comma-delineated arguments passed...
Github user shaneknapp commented on the issue: https://github.com/apache/spark/pull/22257 the network was disabled, so i fixed that:

```
virsh # net-list --all
 Name                 State      Autostart   Persistent
--------------------------------------------------------
 default              active     yes         yes
 docker-machines      active     yes         yes
 minikube-net         inactive   no          yes

virsh # net-start minikube-net
Network minikube-net started
```

and just for sanity, i checked the rest of the workers:

```
-bash-4.1$ pssh -h ubuntu_workers.txt -t 0 -i "virsh net-list --all"
[1] 10:49:25 [SUCCESS] research-jenkins-worker-07
 Name                 State      Autostart   Persistent
--------------------------------------------------------
 default              active     yes         yes
 minikube-net         active     yes         yes
[2] 10:49:25 [SUCCESS] amp-jenkins-staging-worker-01
 Name                 State      Autostart   Persistent
--------------------------------------------------------
 default              active     yes         yes
 docker-machines      active     yes         yes
 minikube-net         active     yes         yes
[3] 10:49:25 [SUCCESS] research-jenkins-worker-08
 Name                 State      Autostart   Persistent
--------------------------------------------------------
 default              active     yes         yes
 minikube-net         active     yes         yes
[4] 10:49:25 [SUCCESS] amp-jenkins-staging-worker-02
 Name                 State      Autostart   Persistent
--------------------------------------------------------
 default              active     yes         yes
 docker-machines      active     yes         yes
 minikube-net         active     yes         yes
```

and noticed that a couple weren't set to autostart (TODO: find out why). so, i fixed that:

```
-bash-4.1$ pssh -h ubuntu_workers.txt -t 0 -i "virsh net-autostart --network minikube-net"
[1] 10:49:22 [SUCCESS] research-jenkins-worker-07
Network minikube-net marked as autostarted
[2] 10:49:22 [SUCCESS] research-jenkins-worker-08
Network minikube-net marked as autostarted
[3] 10:49:22 [SUCCESS] amp-jenkins-staging-worker-01
Network minikube-net marked as autostarted
[4] 10:49:22 [SUCCESS] amp-jenkins-staging-worker-02
Network minikube-net marked as autostarted

-bash-4.1$ pssh -h ubuntu_workers.txt -t 0 -i "virsh net-list --all"
[1] 10:49:25 [SUCCESS] research-jenkins-worker-07
 Name                 State      Autostart   Persistent
--------------------------------------------------------
 default              active     yes         yes
 minikube-net         active     yes         yes
[2] 10:49:25 [SUCCESS] amp-jenkins-staging-worker-01
 Name                 State      Autostart   Persistent
--------------------------------------------------------
 default              active     yes         yes
 docker-machines      active     yes         yes
 minikube-net         active     yes         yes
[3] 10:49:25 [SUCCESS] research-jenkins-worker-08
 Name                 State      Autostart   Persistent
--------------------------------------------------------
 default              active     yes         yes
 minikube-net         active     yes         yes
[4] 10:49:25 [SUCCESS] amp-jenkins-staging-worker-02
 Name                 State      Autostart   Persistent
--------------------------------------------------------
 default              active     yes         yes
 docker-machines      active     yes         yes
 minikube-net         active     yes         yes
```

you should be g2g.