[GitHub] [spark] skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests
skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests URL: https://github.com/apache/spark/pull/23514#issuecomment-477325660 @shaneknapp thanks for your patience; the remaining task is to add this knowledge, acquired the hard way, to the docs so people can run the tests easily (I think you were planning to do so). In another PR I saw new contributors struggling to run the tests. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] [spark] skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests
skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests URL: https://github.com/apache/spark/pull/23514#issuecomment-477311319 @shaneknapp Let's see: I reverted the defaults to `/tmp`.
[GitHub] [spark] skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests
skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests URL: https://github.com/apache/spark/pull/23514#issuecomment-477306801

> once https://github.com/apache/spark/pull/23514/files#r269703958 is dealt with + tests pass i'll merge.

@shaneknapp what do you see so far? What are the issues?
[GitHub] [spark] skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests
skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests URL: https://github.com/apache/spark/pull/23514#issuecomment-477178833 @shaneknapp if this is merged, I will add integration tests for the other PR (https://github.com/apache/spark/pull/23546) using https://github.com/ceph/cn, since I need to use the local storage class there as well.
[GitHub] [spark] skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests
skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests URL: https://github.com/apache/spark/pull/23514#issuecomment-476711154 @shaneknapp things run smoothly on Ubuntu with kvm2; what is the bug you are referring to? I pasted the runs above. Try using `--gid=0` so you can create the files (I didn't use `--uid=1001 --gid=1001`; see https://unix.stackexchange.com/questions/44077/what-does-it-mean-to-be-in-group-0). Please follow what I did above.
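As a local illustration of why group 0 matters (not the minikube mount itself; the directory here is a throwaway): a group-writable path is writable by any process whose groups include that gid, and the Spark image runs as uid 185 with gid 0, so mounting with `--gid=0` grants it write access without needing a matching host uid.

```shell
# A directory that is group-writable can be written to by any process whose
# supplementary groups include its gid; this is the mechanism --gid=0 relies on.
dir=$(mktemp -d)        # mktemp -d creates the directory with mode 700
chmod 770 "$dir"        # add group rwx
perms=$(stat -c '%A' "$dir")
echo "$perms"           # -> drwxrwx---
rm -rf "$dir"
```

(`stat -c '%A'` is the GNU coreutils form; on macOS/BSD use `stat -f '%Sp'` instead.)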
[GitHub] [spark] skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests
skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests URL: https://github.com/apache/spark/pull/23514#issuecomment-473355130 @shaneknapp do the default paths exist (should I trigger the tests)?

```
val HOST_PATH = sys.env.getOrElse("PVC_TESTS_HOST_PATH", "/home/jenkins/src/spark/tmp")
val VM_PATH = sys.env.getOrElse("PVC_TESTS_VM_PATH", "/tmp/spark-k8s-integration-tests")
```
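For anyone overriding these from the shell before a run: `sys.env.getOrElse` mirrors the standard `${VAR:-default}` fallback, so exporting `PVC_TESTS_HOST_PATH` / `PVC_TESTS_VM_PATH` replaces the defaults. A tiny self-contained sketch of the semantics (`DEMO_PATH` and the paths below are illustrative):

```shell
# ${VAR:-default} takes the env var if set and non-empty, else the default.
unset DEMO_PATH
with_default="${DEMO_PATH:-/tmp/default}"     # var unset -> default used
DEMO_PATH=/data/override
with_override="${DEMO_PATH:-/tmp/default}"    # var set   -> override used
echo "$with_default $with_override"           # -> /tmp/default /data/override
```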
[GitHub] [spark] skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests
skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests URL: https://github.com/apache/spark/pull/23514#issuecomment-469669773 @shaneknapp I updated the PR and created a JIRA [here](https://issues.apache.org/jira/browse/SPARK-27058) for supporting host mounting. For now I have:

```
val HOST_PATH = sys.env.getOrElse("PVC_TESTS_HOST_PATH", "/home/jenkins/src/spark/tmp")
val VM_PATH = sys.env.getOrElse("PVC_TESTS_VM_PATH", "/tmp/spark-k8s-integration-tests")
```

but the default values should be `/tmp` once things are ready, so users can run the tests out of the box (I set these values just to trigger the tests here with the right paths). I also restored the fabric8io client version to 4.1.2 for now, although there is another effort underway for that.
[GitHub] [spark] skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests
skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests URL: https://github.com/apache/spark/pull/23514#issuecomment-469523470 @shaneknapp by the way, this PR needs an update, since I removed the fabric8io client upgrade (as you know there was another PR for that and it got reverted). So if you run the tests, just use the latest client, 4.1.2, in the integration tests POM. I will update shortly based on the state of master.
[GitHub] [spark] skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests
skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests URL: https://github.com/apache/spark/pull/23514#issuecomment-469507435 @shaneknapp running on my machine (Ubuntu 18.04 with kvm2) works fine as well:

```
CONTAINER ID   IMAGE   COMMAND   CREATED   STATUS   PORTS   NAMES
⚠️  These changes will take effect upon a minikube delete and then a minikube start
⚠️  These changes will take effect upon a minikube delete and then a minikube start
minikube v0.34.1 on linux (amd64)
Creating kvm2 VM (CPUs=4, Memory=4096MB, Disk=2MB) ...
"minikube" IP address is 192.168.39.211
Configuring Docker as the container runtime ...
✨ Preparing Kubernetes environment ...
  ▪ kubelet.resolv-conf=/run/systemd/resolve/resolv.conf
Downloading kubeadm v1.13.3
Downloading kubelet v1.13.3
Pulling images required by Kubernetes v1.13.3 ...
Launching Kubernetes v1.13.3 using kubeadm ...
Configuring cluster permissions ...
樂 Verifying component health .
kubectl is now configured to use "minikube"
Done! Thank you for using minikube!
Kubernetes master is running at https://192.168.39.211:8443
KubeDNS is running at https://192.168.39.211:8443/api/v1/namespaces/kube-system/services/kube-dns:dns/proxy
To further debug and diagnose cluster problems, use 'kubectl cluster-info dump'.
Mounting /tmp/test into /tmp/test on the minikube VM
This daemon process needs to stay alive for the mount to be accessible ...
ufs starting
```

```
$ ls /tmp/test -al
total 1
-rw-rw-r-- 1 185 root    8 Mar  5 02:12 tmp6799698580179143603.txt
$ ls /tmp/test -al
total 4
drwxr-xr-x 1 185 root 4096 Mar  5 02:13 dfs_read_write_test

kubectl get pods -n spark
NAME                                              READY   STATUS      RESTARTS   AGE
spark-test-app-e8e751753741409fbb9ad2417d4d4aa3   0/1     Completed   0          88s

[INFO] --- scalatest-maven-plugin:1.0:test (integration-test) @ spark-kubernetes-integration-tests_2.12 ---
Discovery starting.
Discovery completed in 205 milliseconds.
Run starting.
Expected test count is: 1
KubernetesSuite:
- Test PVs with local storage
Run completed in 48 seconds, 152 milliseconds.
Total number of tests run: 1
Suites: completed 2, aborted 0
Tests: succeeded 1, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
[INFO]
[INFO] Reactor Summary for Spark Project Parent POM 3.0.0-SNAPSHOT:
[INFO]
[INFO] Spark Project Parent POM ...................... SUCCESS [  3.272 s]
[INFO] Spark Project Tags ............................ SUCCESS [  2.798 s]
[INFO] Spark Project Local DB ........................ SUCCESS [  2.055 s]
[INFO] Spark Project Networking ...................... SUCCESS [  3.162 s]
[INFO] Spark Project Shuffle Streaming Service ....... SUCCESS [  1.870 s]
[INFO] Spark Project Unsafe .......................... SUCCESS [  2.094 s]
[INFO] Spark Project Launcher ........................ SUCCESS [  2.486 s]
[INFO] Spark Project Core ............................ SUCCESS [ 18.521 s]
[INFO] Spark Project Kubernetes Integration Tests .... SUCCESS [ 52.234 s]
[INFO]
[INFO] BUILD SUCCESS
[INFO]
[INFO] Total time: 01:29 min
[INFO] Finished at: 2019-03-05T04:13:42+02:00
[INFO]
```

spark log:

```
kubectl logs spark-test-app-e8e751753741409fbb9ad2417d4d4aa3 -n spark
++ id -u
+ myuid=185
++ id -g
+ mygid=0
+ set +e
++ getent passwd 185
+ uidentry=
+ set -e
+ '[' -z '' ']'
+ '[' -w /etc/passwd ']'
+ echo '185:x:185:0:anonymous uid:/opt/spark:/bin/false'
+ SPARK_CLASSPATH=':/opt/spark/jars/*'
+ env
+ grep SPARK_JAVA_OPT_
+ sort -t_ -k4 -n
+ sed 's/[^=]*=\(.*\)/\1/g'
+ readarray -t SPARK_EXECUTOR_JAVA_OPTS
+ '[' -n '' ']'
+ '[' -n '' ']'
+ '[' '' == 2 ']'
+ '[' '' == 3 ']'
+ '[' -z ']'
+ case "$1" in
+ shift 1
+ CMD=("$SPARK_HOME/bin/spark-submit" --conf "spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@")
+ exec /sbin/tini -s -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=172.17.0.4 --deploy-mode client --properties-file /opt/spark/conf/spark.properties --class org.apache.spark.examples.DFSReadWriteTest spark-internal /opt/spark/pv-tests/tmp6799698580179143603.txt /opt/spark/pv-tests
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/03/05 02:13:03 INFO
```
[GitHub] [spark] skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests
skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests URL: https://github.com/apache/spark/pull/23514#issuecomment-469497166 @shaneknapp works fine with VirtualBox:

```
./minikube mount --9p-version=9p2000.L --uid=185 --gid=0 /tmp/test:/tmp/test
ssh -i ~/.minikube/machines/minikube/id_rsa docker@$(minikube ip)
$ ls /tmp -al
...
drwxrwxr-x 1 185 root 4096 Mar  5 01:22 test
$ ls /tmp/test -al
total 5
drwxr-xr-x 1 185 root 4096 Mar  5 01:23 dfs_read_write_test
-rw-rw-r-- 1 185 root    8 Mar  5 01:21 tmp1583665121256859192.txt

kubectl get pods -n spark
NAME                                              READY   STATUS      RESTARTS   AGE
spark-test-app-5ed2764629864476b13d10c76f74cdb7   0/1     Completed   0          2m49s
```

spark log:

```
kubectl logs spark-test-app-5ed2764629864476b13d10c76f74cdb7 -n spark
++ id -u
+ myuid=185
++ id -g
+ mygid=0
+ set +e
++ getent passwd 185
+ uidentry=
+ set -e
+ '[' -z '' ']'
+ '[' -w /etc/passwd ']'
+ echo '185:x:185:0:anonymous uid:/opt/spark:/bin/false'
+ SPARK_CLASSPATH=':/opt/spark/jars/*'
+ env
+ grep SPARK_JAVA_OPT_
+ sort -t_ -k4 -n
+ sed 's/[^=]*=\(.*\)/\1/g'
+ readarray -t SPARK_EXECUTOR_JAVA_OPTS
+ '[' -n '' ']'
+ '[' -n '' ']'
+ '[' '' == 2 ']'
+ '[' '' == 3 ']'
+ '[' -z ']'
+ case "$1" in
+ shift 1
+ CMD=("$SPARK_HOME/bin/spark-submit" --conf "spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@")
+ exec /sbin/tini -s -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=172.17.0.4 --deploy-mode client --properties-file /opt/spark/conf/spark.properties --class org.apache.spark.examples.DFSReadWriteTest spark-internal /opt/spark/pv-tests/tmp1583665121256859192.txt /opt/spark/pv-tests
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/03/05 01:22:23 INFO SecurityManager: Changing view acls to: 185,stavros
19/03/05 01:22:23 INFO SecurityManager: Changing modify acls to: 185,stavros
19/03/05 01:22:23 INFO SecurityManager: Changing view acls groups to:
19/03/05 01:22:23 INFO SecurityManager: Changing modify acls groups to:
19/03/05 01:22:23 INFO SecurityManager: SecurityManager: authentication enabled; ui acls disabled; users with view permissions: Set(185, stavros); groups with view permissions: Set(); users with modify permissions: Set(185, stavros); groups with modify permissions: Set()
19/03/05 01:22:24 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Performing local word count
Creating SparkSession
19/03/05 01:22:24 INFO SparkContext: Running Spark version 3.0.0-SNAPSHOT
19/03/05 01:22:24 INFO SparkContext: Submitted application: DFS Read Write Test
19/03/05 01:22:24 INFO SecurityManager: Changing view acls to: 185,stavros
19/03/05 01:22:24 INFO SecurityManager: Changing modify acls to: 185,stavros
19/03/05 01:22:24 INFO SecurityManager: Changing view acls groups to:
19/03/05 01:22:24 INFO SecurityManager: Changing modify acls groups to:
19/03/05 01:22:24 INFO SecurityManager: SecurityManager: authentication enabled; ui acls disabled; users with view permissions: Set(185, stavros); groups with view permissions: Set(); users with modify permissions: Set(185, stavros); groups with modify permissions: Set()
19/03/05 01:22:24 INFO Utils: Successfully started service 'sparkDriver' on port 7078.
19/03/05 01:22:24 INFO SparkEnv: Registering MapOutputTracker
19/03/05 01:22:24 INFO SparkEnv: Registering BlockManagerMaster
19/03/05 01:22:24 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/03/05 01:22:24 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/03/05 01:22:24 INFO DiskBlockManager: Created local directory at /var/data/spark-ddaf8ace-85e7-4bc9-8ceb-2795ab03c8c4/blockmgr-c82635ff-8ba0-47f4-b7a6-131b8944e42b
19/03/05 01:22:24 INFO MemoryStore: MemoryStore started with capacity 593.9 MiB
19/03/05 01:22:24 INFO SparkEnv: Registering OutputCommitCoordinator
19/03/05 01:22:25 INFO Utils: Successfully started service 'SparkUI' on port 4040.
19/03/05 01:22:25 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://spark-test-app-1551748865574-driver-svc.spark.svc:4040
19/03/05 01:22:25 INFO SparkContext: Added JAR local:///opt/spark/examples/jars/spark-examples_2.12-3.0.0-SNAPSHOT.jar at file:/opt/spark/examples/jars/spark-examples_2.12-3.0.0-SNAPSHOT.jar with timestamp 1551748945067
19/03/05 01:22:25 INFO SparkKubernetesClientFactory: Auto-configuring K8S client using current context from users K8S config file
19/03/05 01:22:26 INFO ExecutorPodsAllocator: Going to request 2 executors from Kubernetes.
19/03/05 01:22:26 INFO Utils:
```
[GitHub] [spark] skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests
skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests URL: https://github.com/apache/spark/pull/23514#issuecomment-469497166 @shaneknapp works fine with virtualbox: ``` ./minikube mount --9p-version=9p2000.L --uid=185 --gid=0 /tmp/test:/tmp/test ssh -i ~/.minikube/machines/minikube/id_rsa docker@$(minikube ip) $ls /tmp -al ... drwxrwxr-x 1 185 root 4096 Mar 5 01:22 test $ ls /tmp/test -al total 5 drwxr-xr-x 1 185 root 4096 Mar 5 01:23 dfs_read_write_test -rw-rw-r-- 1 185 root8 Mar 5 01:21 tmp1583665121256859192.txt kubectl get pods -n spark NAME READY STATUS RESTARTS AGE spark-test-app-5ed2764629864476b13d10c76f74cdb7 0/1 Completed 0 2m49s ``` spark log: ``` kubectl logs spark-test-app-5ed2764629864476b13d10c76f74cdb7 -n spark ++ id -u + myuid=185 ++ id -g + mygid=0 + set +e ++ getent passwd 185 + uidentry= + set -e + '[' -z '' ']' + '[' -w /etc/passwd ']' + echo '185:x:185:0:anonymous uid:/opt/spark:/bin/false' + SPARK_CLASSPATH=':/opt/spark/jars/*' + env + grep SPARK_JAVA_OPT_ + sort -t_ -k4 -n + sed 's/[^=]*=\(.*\)/\1/g' + readarray -t SPARK_EXECUTOR_JAVA_OPTS + '[' -n '' ']' + '[' -n '' ']' + '[' '' == 2 ']' + '[' '' == 3 ']' + '[' -z ']' + case "$1" in + shift 1 + CMD=("$SPARK_HOME/bin/spark-submit" --conf "spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@") + exec /sbin/tini -s -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=172.17.0.4 --deploy-mode client --properties-file /opt/spark/conf/spark.properties --class org.apache.spark.examples.DFSReadWriteTest spark-internal /opt/spark/pv-tests/tmp1583665121256859192.txt /opt/spark/pv-tests Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 19/03/05 01:22:23 INFO SecurityManager: Changing view acls to: 185,stavros 19/03/05 01:22:23 INFO SecurityManager: Changing modify acls to: 185,stavros 19/03/05 01:22:23 INFO SecurityManager: Changing view acls groups to: 19/03/05 01:22:23 INFO SecurityManager: Changing 
modify acls groups to: 19/03/05 01:22:23 INFO SecurityManager: SecurityManager: authentication enabled; ui acls disabled; users with view permissions: Set(185, stavros); groups with view permissions: Set(); users with modify permissions: Set(185, stavros); groups with modify permissions: Set() 19/03/05 01:22:24 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable Performing local word count Creating SparkSession 19/03/05 01:22:24 INFO SparkContext: Running Spark version 3.0.0-SNAPSHOT 19/03/05 01:22:24 INFO SparkContext: Submitted application: DFS Read Write Test 19/03/05 01:22:24 INFO SecurityManager: Changing view acls to: 185,stavros 19/03/05 01:22:24 INFO SecurityManager: Changing modify acls to: 185,stavros 19/03/05 01:22:24 INFO SecurityManager: Changing view acls groups to: 19/03/05 01:22:24 INFO SecurityManager: Changing modify acls groups to: 19/03/05 01:22:24 INFO SecurityManager: SecurityManager: authentication enabled; ui acls disabled; users with view permissions: Set(185, stavros); groups with view permissions: Set(); users with modify permissions: Set(185, stavros); groups with modify permissions: Set() 19/03/05 01:22:24 INFO Utils: Successfully started service 'sparkDriver' on port 7078. 
19/03/05 01:22:24 INFO SparkEnv: Registering MapOutputTracker 19/03/05 01:22:24 INFO SparkEnv: Registering BlockManagerMaster 19/03/05 01:22:24 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information 19/03/05 01:22:24 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up 19/03/05 01:22:24 INFO DiskBlockManager: Created local directory at /var/data/spark-ddaf8ace-85e7-4bc9-8ceb-2795ab03c8c4/blockmgr-c82635ff-8ba0-47f4-b7a6-131b8944e42b 19/03/05 01:22:24 INFO MemoryStore: MemoryStore started with capacity 593.9 MiB 19/03/05 01:22:24 INFO SparkEnv: Registering OutputCommitCoordinator 19/03/05 01:22:25 INFO Utils: Successfully started service 'SparkUI' on port 4040. 19/03/05 01:22:25 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://spark-test-app-1551748865574-driver-svc.spark.svc:4040 19/03/05 01:22:25 INFO SparkContext: Added JAR local:///opt/spark/examples/jars/spark-examples_2.12-3.0.0-SNAPSHOT.jar at file:/opt/spark/examples/jars/spark-examples_2.12-3.0.0-SNAPSHOT.jar with timestamp 1551748945067 19/03/05 01:22:25 INFO SparkKubernetesClientFactory: Auto-configuring K8S client using current context from users K8S config file 19/03/05 01:22:26 INFO ExecutorPodsAllocator: Going to request 2 executors from Kubernetes. 19/03/05 01:22:26 INFO Utils:
[GitHub] [spark] skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests
skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests URL: https://github.com/apache/spark/pull/23514#issuecomment-469497166

@shaneknapp works fine with VirtualBox:

```
./minikube mount --9p-version=9p2000.L --uid=185 --gid=0 /tmp/test:/tmp/test
ssh -i ~/.minikube/machines/minikube/id_rsa docker@$(minikube ip)
$ ls /tmp -al
...
drwxrwxr-x 1 185 root 4096 Mar  5 01:22 test
$ ls /tmp/test -al
total 5
drwxr-xr-x 1 185 root 4096 Mar  5 01:23 dfs_read_write_test
-rw-rw-r-- 1 185 root    8 Mar  5 01:21 tmp1583665121256859192.txt

kubectl get pods -n spark
NAME                                              READY   STATUS      RESTARTS   AGE
spark-test-app-5ed2764629864476b13d10c76f74cdb7   0/1     Completed   0          2m49s
```

Spark log:

```
kubectl logs spark-test-app-5ed2764629864476b13d10c76f74cdb7 -n spark
++ id -u
+ myuid=185
++ id -g
+ mygid=0
+ set +e
++ getent passwd 185
+ uidentry=
+ set -e
+ '[' -z '' ']'
+ '[' -w /etc/passwd ']'
+ echo '185:x:185:0:anonymous uid:/opt/spark:/bin/false'
+ SPARK_CLASSPATH=':/opt/spark/jars/*'
+ env
+ grep SPARK_JAVA_OPT_
+ sort -t_ -k4 -n
+ sed 's/[^=]*=\(.*\)/\1/g'
+ readarray -t SPARK_EXECUTOR_JAVA_OPTS
+ '[' -n '' ']'
+ '[' -n '' ']'
+ '[' '' == 2 ']'
+ '[' '' == 3 ']'
+ '[' -z ']'
+ case "$1" in
+ shift 1
+ CMD=("$SPARK_HOME/bin/spark-submit" --conf "spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@")
+ exec /sbin/tini -s -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=172.17.0.4 --deploy-mode client --properties-file /opt/spark/conf/spark.properties --class org.apache.spark.examples.DFSReadWriteTest spark-internal /opt/spark/pv-tests/tmp1583665121256859192.txt /opt/spark/pv-tests
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/03/05 01:22:23 INFO SecurityManager: Changing view acls to: 185,stavros
19/03/05 01:22:23 INFO SecurityManager: Changing modify acls to: 185,stavros
19/03/05 01:22:23 INFO SecurityManager: Changing view acls groups to:
19/03/05 01:22:23 INFO SecurityManager: Changing modify acls groups to:
19/03/05 01:22:23 INFO SecurityManager: SecurityManager: authentication enabled; ui acls disabled; users with view permissions: Set(185, stavros); groups with view permissions: Set(); users with modify permissions: Set(185, stavros); groups with modify permissions: Set()
19/03/05 01:22:24 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Performing local word count
Creating SparkSession
19/03/05 01:22:24 INFO SparkContext: Running Spark version 3.0.0-SNAPSHOT
19/03/05 01:22:24 INFO SparkContext: Submitted application: DFS Read Write Test
19/03/05 01:22:24 INFO SecurityManager: Changing view acls to: 185,stavros
19/03/05 01:22:24 INFO SecurityManager: Changing modify acls to: 185,stavros
19/03/05 01:22:24 INFO SecurityManager: Changing view acls groups to:
19/03/05 01:22:24 INFO SecurityManager: Changing modify acls groups to:
19/03/05 01:22:24 INFO SecurityManager: SecurityManager: authentication enabled; ui acls disabled; users with view permissions: Set(185, stavros); groups with view permissions: Set(); users with modify permissions: Set(185, stavros); groups with modify permissions: Set()
19/03/05 01:22:24 INFO Utils: Successfully started service 'sparkDriver' on port 7078.
19/03/05 01:22:24 INFO SparkEnv: Registering MapOutputTracker
19/03/05 01:22:24 INFO SparkEnv: Registering BlockManagerMaster
19/03/05 01:22:24 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/03/05 01:22:24 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/03/05 01:22:24 INFO DiskBlockManager: Created local directory at /var/data/spark-ddaf8ace-85e7-4bc9-8ceb-2795ab03c8c4/blockmgr-c82635ff-8ba0-47f4-b7a6-131b8944e42b
19/03/05 01:22:24 INFO MemoryStore: MemoryStore started with capacity 593.9 MiB
19/03/05 01:22:24 INFO SparkEnv: Registering OutputCommitCoordinator
19/03/05 01:22:25 INFO Utils: Successfully started service 'SparkUI' on port 4040.
19/03/05 01:22:25 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://spark-test-app-1551748865574-driver-svc.spark.svc:4040
19/03/05 01:22:25 INFO SparkContext: Added JAR local:///opt/spark/examples/jars/spark-examples_2.12-3.0.0-SNAPSHOT.jar at file:/opt/spark/examples/jars/spark-examples_2.12-3.0.0-SNAPSHOT.jar with timestamp 1551748945067
19/03/05 01:22:25 INFO SparkKubernetesClientFactory: Auto-configuring K8S client using current context from users K8S config file
19/03/05 01:22:26 INFO ExecutorPodsAllocator: Going to request 2 executors from Kubernetes.
19/03/05 01:22:26 INFO Utils:
```
[GitHub] [spark] skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests
skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests URL: https://github.com/apache/spark/pull/23514#issuecomment-469485430

@shaneknapp I updated the ticket link above; it's [this one](https://github.com/kubernetes/minikube/issues/2290), which is where I got the flags from. There is a [ticket](https://github.com/kubernetes/minikube/issues/3327) for adding this to `start`.

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org

With regards,
Apache Git Services

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] [spark] skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests
skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests URL: https://github.com/apache/spark/pull/23514#issuecomment-469266727

@shaneknapp mounts do work on my machine with VirtualBox and the latest minikube (v0.34.1). Is it broken for KVM? In my case I just need to test the file permissions, because I hit [this issue](https://github.com/kubernetes/minikube/issues/2290): in the VM the rkt user is used, while in the container we run as user 185, and user 185 can't write to that mounted folder in the VM because by default only the host-machine user can write to it. But there is a workaround: set `--gid`, `--uid`, and `--9p-version=9p2000.L`. Hopefully I will shortly be able to report back here with a fully working example, and maybe we can reproduce it on the test nodes (at least this adventure reveals enough about minikube and improves our knowledge).
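The uid/gid workaround described in this comment can be sketched as the following command sequence. This is an illustrative, environment-dependent sketch, not part of the test harness: the host path `/tmp/test` and UID 185 (the default user in the Spark container images) follow the example used in this thread; adjust them to your setup.

```shell
# Mount a host directory into the minikube VM so that the container
# user (UID 185, GID 0) owns it inside the VM. The 9p2000.L protocol
# version plus explicit --uid/--gid is the workaround for the
# permission problem tracked in minikube issue #2290.
# `minikube mount` blocks, so run it in the background (or in a
# separate terminal).
minikube mount --9p-version=9p2000.L --uid=185 --gid=0 /tmp/test:/tmp/test &

# Verify from inside the VM that the mounted directory is owned by
# 185:root and therefore writable by the Spark test pods.
minikube ssh -- ls -al /tmp/test
```

Without the `--uid`/`--gid` flags the mount is owned by the host user's IDs, and the Spark pods running as user 185 fail to create files (the `mkdirs` failure mentioned below).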
[GitHub] [spark] skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests
skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests URL: https://github.com/apache/spark/pull/23514#issuecomment-469482216

The mkdirs issue is the one I'm hitting; I will test and let you know ;)

> now that's an understatement. ;)

If it's broken, it's broken :) There is a positive side to most things though ;)
[GitHub] [spark] skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests
skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests URL: https://github.com/apache/spark/pull/23514#issuecomment-469266727 @shaneknapp mounts do work on my machine with virtualbox and latest 0.34.1. I just need to test the file rights because I hit this one (in vm rkt user is used while in the container we have user 185, but 185 cant write to that mounted folder): https://github.com/kubernetes/minikube/issues/2290 but there is a workaround `--9p-version=9p2000.L` to test and hopefully I can report back here a fully working example. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] [spark] skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests
skonto edited a comment on issue #23514: [SPARK-24902][K8s] Add PV integration tests URL: https://github.com/apache/spark/pull/23514#issuecomment-469266727 @shaneknapp mounts do work on my machine with virtualbox and latest 0.34.1. I just need to test the file rights because I hit this one (in vm rkt user is used while in the container we have user 185, but 185 cant write to that mounted folder): https://github.com/kubernetes/minikube/issues/2290 but there is a workaround `--9p-version=9p2000.L` to test. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org