[jira] [Commented] (SPARK-12426) Docker JDBC integration tests are failing again
[ https://issues.apache.org/jira/browse/SPARK-12426?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15094445#comment-15094445 ] Mark Grover commented on SPARK-12426: - Thanks Sean, if you could add this, that'd be great.

h2. Running docker integration tests

In order to run the [docker integration tests|https://github.com/apache/spark/tree/master/docker-integration-tests], you have to install the docker engine on your box. Installation instructions can be found at https://docs.docker.com/engine/installation/. Once installed, the docker service needs to be started, if it is not already running. On Linux, this can be done with {{sudo service docker start}}. These integration tests run as part of a regular Spark unit test run, so the docker engine must be installed and running if you want all Spark tests to pass.

> Docker JDBC integration tests are failing again > --- > > Key: SPARK-12426 > URL: https://issues.apache.org/jira/browse/SPARK-12426 > Project: Spark > Issue Type: Bug > Components: SQL, Tests > Affects Versions: 1.6.0 > Reporter: Mark Grover > > The Docker JDBC integration tests were fixed in SPARK-11796 but they seem to > be failing again on my machine (Ubuntu Precise). This was the same box that I > tested my previous commit on. Also, I am not confident this failure has much > to do with Spark, since a well-known commit where the tests were passing > fails now, in the same environment. > [~sowen] mentioned on the Spark 1.6 voting thread that the tests were failing > on his Ubuntu 15 box as well. > Here's the error, fyi: > {code} > 15/12/18 10:12:50 INFO SparkContext: Successfully stopped SparkContext > 15/12/18 10:12:50 INFO RemoteActorRefProvider$RemotingTerminator: Shutting > down remote daemon. > 15/12/18 10:12:50 INFO RemoteActorRefProvider$RemotingTerminator: Remote > daemon shut down; proceeding with flushing remote transports. 
> *** RUN ABORTED *** > com.spotify.docker.client.DockerException: > java.util.concurrent.ExecutionException: > com.spotify.docker.client.shaded.javax.ws.rs.ProcessingException: > java.io.IOException: No such file or directory > at > com.spotify.docker.client.DefaultDockerClient.propagate(DefaultDockerClient.java:1141) > at > com.spotify.docker.client.DefaultDockerClient.request(DefaultDockerClient.java:1082) > at > com.spotify.docker.client.DefaultDockerClient.ping(DefaultDockerClient.java:281) > at > org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.beforeAll(DockerJDBCIntegrationSuite.scala:76) > at > org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187) > at > org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.beforeAll(DockerJDBCIntegrationSuite.scala:58) > at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253) > at > org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.run(DockerJDBCIntegrationSuite.scala:58) > at org.scalatest.Suite$class.callExecuteOnSuite$1(Suite.scala:1492) > at org.scalatest.Suite$$anonfun$runNestedSuites$1.apply(Suite.scala:1528) > ... 
> Cause: java.util.concurrent.ExecutionException: > com.spotify.docker.client.shaded.javax.ws.rs.ProcessingException: > java.io.IOException: No such file or directory > at > jersey.repackaged.com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:299) > at > jersey.repackaged.com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:286) > at > jersey.repackaged.com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:116) > at > com.spotify.docker.client.DefaultDockerClient.request(DefaultDockerClient.java:1080) > at > com.spotify.docker.client.DefaultDockerClient.ping(DefaultDockerClient.java:281) > at > org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.beforeAll(DockerJDBCIntegrationSuite.scala:76) > at > org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187) > at > org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.beforeAll(DockerJDBCIntegrationSuite.scala:58) > at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253) > at > org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.run(DockerJDBCIntegrationSuite.scala:58) > ... > Cause: com.spotify.docker.client.shaded.javax.ws.rs.ProcessingException: > java.io.IOException: No such file or directory > at > org.glassfish.jersey.apache.connector.ApacheConnector.apply(ApacheConnector.java:481) > at > org.glassfish.jersey.apache.connector.ApacheConnector$1.run(ApacheConnector.java:491) > at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) > at java.util.concurrent.FutureTask.run(FutureTask.java:262) > at >
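The pre-flight check described in the wiki text above can be sketched as a small Python probe. This is a minimal sketch, not what the suite actually runs: it assumes the default Linux unix-socket path {{/var/run/docker.sock}} and ignores remote daemons configured via {{DOCKER_HOST}}.

```python
import os
import socket


def docker_daemon_reachable(sock_path="/var/run/docker.sock"):
    """Return True if something accepts connections on the docker unix socket.

    The default path is an assumed Linux default; it will not exist on a box
    where the docker engine is not installed or not running.
    """
    if not os.path.exists(sock_path):
        return False
    s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    try:
        s.connect(sock_path)
        return True
    except OSError:
        # Socket file exists but no daemon is listening on it.
        return False
    finally:
        s.close()


if __name__ == "__main__":
    if not docker_daemon_reachable():
        print("docker engine not reachable; on Linux try: sudo service docker start")
```

Running a check like this before {{build/sbt docker-integration-tests/test}} would turn the opaque abort below into an actionable message.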
[jira] [Commented] (SPARK-12426) Docker JDBC integration tests are failing again
[ https://issues.apache.org/jira/browse/SPARK-12426?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15095047#comment-15095047 ] Sean Owen commented on SPARK-12426: --- Done.
[jira] [Commented] (SPARK-12426) Docker JDBC integration tests are failing again
[ https://issues.apache.org/jira/browse/SPARK-12426?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15090534#comment-15090534 ] Sean Owen commented on SPARK-12426: --- I'm not sure how to give edit access -- don't think I'm an admin -- but I do have edit access. If you send me your edits I can apply them.
[jira] [Commented] (SPARK-12426) Docker JDBC integration tests are failing again
[ https://issues.apache.org/jira/browse/SPARK-12426?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15090179#comment-15090179 ] Mark Grover commented on SPARK-12426: - [~sowen]/[~joshrosen] Just a reminder about this, I'd appreciate your response. Thanks!
[jira] [Commented] (SPARK-12426) Docker JDBC integration tests are failing again
[ https://issues.apache.org/jira/browse/SPARK-12426?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15083482#comment-15083482 ] Mark Grover commented on SPARK-12426: - Sean and Josh, I got to the bottom of this. The failure happens because docker does a poor job of bubbling up the real error: the docker engine is not running on the machine running the unit tests. The instructions for installing the docker engine on various OSs are at https://docs.docker.com/engine/installation/. Once installed, the docker service needs to be started if it's not already running. On Linux, this is simply {{sudo service docker start}}, and then our docker integration tests pass. Sorry that I didn't get a chance to look into it around 1.6 rc time, holidays got in the way. I am thinking of adding this info on [this wiki page|https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-IntelliJ]. Please let me know if you think there is a better place; that's the best I could find. I don't seem to have access to edit that page, can one of you please give me access? Also, I was searching the code for any puppet recipes we maintain for setting up the build slaves. In other words, if our Jenkins infra were wiped out, how do we make sure docker-engine is installed and running? How do we keep track of build dependencies? Thanks in advance!
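The opaque {{java.io.IOException: No such file or directory}} in the trace is consistent with this diagnosis: {{DefaultDockerClient.ping()}} ultimately opens the docker unix socket, and when the daemon is not running the socket file is absent, so the connect fails with errno 2. A minimal Python reproduction of that underlying OS error (the socket path here is deliberately nonexistent and hypothetical):

```python
import socket

# Connecting a unix-domain socket to a path that does not exist fails with
# errno 2, the same "No such file or directory" that surfaces through the
# shaded Jersey/docker-client stack in the aborted run above.
s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
try:
    s.connect("/var/run/docker.sock.does-not-exist")  # hypothetical missing path
except FileNotFoundError as e:
    print(e)  # [Errno 2] No such file or directory
finally:
    s.close()
```

The Java client wraps this low-level error in several layers of ExecutionException/ProcessingException, which is why the report reads as a mysterious I/O failure rather than "docker daemon not running".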
[jira] [Commented] (SPARK-12426) Docker JDBC integration tests are failing again
[ https://issues.apache.org/jira/browse/SPARK-12426?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15070882#comment-15070882 ] Sean Owen commented on SPARK-12426: --- Hm, I get this on OSX now, or I did once. This always happens on stock Ubuntu 15. Yeah it must be the presence of some package or something, but I don't know what. I don't have a resolution, though it seems like a reasonably valid issue if it reliably fails on a mainstream OS. Maybe worth keeping open and seeing if something becomes clear as the cause.
[jira] [Commented] (SPARK-12426) Docker JDBC integration tests are failing again
[ https://issues.apache.org/jira/browse/SPARK-12426?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15070064#comment-15070064 ] Josh Rosen commented on SPARK-12426: This is probably an environment / configuration issue. These tests are running successfully in AMPLab Jenkins and they also run within an experimental Docker in Docker solution that I'm prototyping.