See 
<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/2664/display/redirect?page=changes>

Changes:

[heejong] [BEAM-11520] Stage extra PyPI dependencies with generated requirements

[heejong] raise exception for non-file type artifacts


------------------------------------------
[...truncated 301.38 KB...]
Watching 1458 directories to track changes
Watching 1458 directories to track changes
Watching 1458 directories to track changes
Watching 1508 directories to track changes
Watching 1518 directories to track changes
Watching 1520 directories to track changes
Loaded cache entry for task 
':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' with cache 
key a14a540052ab634974aed61ab2bd178c
:runners:google-cloud-dataflow-java:****:legacy-****:compileJava 
(Thread[Execution **** for ':' Thread 10,5,main]) completed. Took 0.393 secs.
:runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution 
**** for ':' Thread 10,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:classes UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:****:legacy-****:classes' as 
it has no actions.
:runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution 
**** for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:****:legacy-****:shadowJar 
(Thread[Execution **** for ':' Thread 7,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testJar
Watching 1406 directories to track changes
Watching 1407 directories to track changes
Caching disabled for task ':sdks:java:io:google-cloud-platform:testJar' because:
  Caching has not been enabled for the task
Task ':sdks:java:io:google-cloud-platform:testJar' is not up-to-date because:
  No history is available.
Watching 1407 directories to track changes
Watching 1520 directories to track changes
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution **** for ':' 
Thread 6,5,main]) completed. Took 0.333 secs.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** for 
':' Thread 6,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:shadowJar FROM-CACHE
Watching 1520 directories to track changes
Watching 1520 directories to track changes
Watching 1520 directories to track changes
Watching 1521 directories to track changes
Custom actions are attached to task 
':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar'.
Build cache key for task 
':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' is 
b87295ec85f0d7917b826a7f2daae570
Task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' is not 
up-to-date because:
  No history is available.
Watching 1521 directories to track changes
Watching 1532 directories to track changes
Loaded cache entry for task 
':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' with cache key 
b87295ec85f0d7917b826a7f2daae570
:runners:google-cloud-dataflow-java:****:legacy-****:shadowJar 
(Thread[Execution **** for ':' Thread 7,5,main]) completed. Took 0.223 secs.

> Task :runners:google-cloud-dataflow-java:compileTestJava FROM-CACHE
Watching 1520 directories to track changes
Watching 1520 directories to track changes
Watching 1520 directories to track changes
Watching 1531 directories to track changes
Custom actions are attached to task 
':runners:google-cloud-dataflow-java:compileTestJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' 
is c12e4bb2f0cbc1c51ca69bc123e69085
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date 
because:
  No history is available.
Watching 1531 directories to track changes
Watching 1531 directories to track changes
Watching 1531 directories to track changes
Watching 1543 directories to track changes
Watching 1544 directories to track changes
Watching 1545 directories to track changes
Loaded cache entry for task 
':runners:google-cloud-dataflow-java:compileTestJava' with cache key 
c12e4bb2f0cbc1c51ca69bc123e69085
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** for 
':' Thread 6,5,main]) completed. Took 0.22 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** for ':' 
Thread 6,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no 
actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** for ':' 
Thread 6,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** for ':' 
Thread 8,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testJar
Watching 1545 directories to track changes
Watching 1545 directories to track changes
Watching 1546 directories to track changes
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
Watching 1546 directories to track changes
file or directory 
'<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/runners/google-cloud-dataflow-java/build/resources/test>', not found
Watching 1546 directories to track changes
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** for ':' 
Thread 8,5,main]) completed. Took 0.048 secs.
:sdks:java:io:hadoop-format:compileTestJava (Thread[Execution **** for ':' 
Thread 8,5,main]) started.

> Task :sdks:java:io:hadoop-format:compileTestJava FROM-CACHE
Watching 1546 directories to track changes
Watching 1546 directories to track changes
Watching 1546 directories to track changes
Watching 1546 directories to track changes
Watching 1554 directories to track changes
Custom actions are attached to task 
':sdks:java:io:hadoop-format:compileTestJava'.
Build cache key for task ':sdks:java:io:hadoop-format:compileTestJava' is 
1fea9ef8d553138c7207f6258e0dfd38
Task ':sdks:java:io:hadoop-format:compileTestJava' is not up-to-date because:
  No history is available.
Watching 1554 directories to track changes
Watching 1554 directories to track changes
Watching 1554 directories to track changes
Watching 1562 directories to track changes
Watching 1563 directories to track changes
Watching 1564 directories to track changes
Loaded cache entry for task ':sdks:java:io:hadoop-format:compileTestJava' with 
cache key 1fea9ef8d553138c7207f6258e0dfd38
:sdks:java:io:hadoop-format:compileTestJava (Thread[Execution **** for ':' 
Thread 8,5,main]) completed. Took 0.615 secs.
:sdks:java:io:hadoop-format:testClasses (Thread[Execution **** for ':' Thread 
8,5,main]) started.

> Task :sdks:java:io:hadoop-format:testClasses
Skipping task ':sdks:java:io:hadoop-format:testClasses' as it has no actions.
:sdks:java:io:hadoop-format:testClasses (Thread[Execution **** for ':' Thread 
8,5,main]) completed. Took 0.0 secs.
:sdks:java:io:hadoop-format:integrationTest (Thread[Daemon ****,5,main]) 
started.

> Task :sdks:java:io:hadoop-format:integrationTest
Watching 1564 directories to track changes
Watching 1564 directories to track changes
Watching 1564 directories to track changes
Custom actions are attached to task 
':sdks:java:io:hadoop-format:integrationTest'.
Build cache key for task ':sdks:java:io:hadoop-format:integrationTest' is 
9e27d32ff5753ca5e338d5ebcf8365a2
Task ':sdks:java:io:hadoop-format:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Watching 1564 directories to track changes
Watching 1564 directories to track changes
Starting process 'Gradle Test Executor 1'. Working directory: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format>
 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java 
-DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--runner=DataflowRunner","--numberOfRecords=600000","--bigQueryDataset=beam_performance","--bigQueryTable=hadoopformatioit_results","--influxMeasurement=hadoopformatioit_results","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--postgresUsername=postgres","--postgresPassword=uuinkks","--postgresDatabaseName=postgres","--postgresServerName=35.223.51.95","--postgresSsl=false","--postgresPort=5432","--numWorkers=5","--autoscalingAlgorithm=NONE","--****HarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.29.0-SNAPSHOT.jar","--region=us-central1";]>
 
-Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/6.8/****Main/gradle-****.jar 
****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'
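
For context on the long command line above: the -DbeamTestPipelineOptions JSON is how the flags (postgres host, record count, runner, etc.) reach the test JVM. The following is a minimal, illustrative Java sketch of how a Beam integration test typically picks those up via TestPipeline.testingPipelineOptions(); the options interface and its fields here are hypothetical stand-ins for two of the flags shown above, not Beam's actual HadoopFormatIOIT options.

import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.testing.TestPipeline;

public class PipelineOptionsSketch {

  // Hypothetical options interface mirroring two of the flags passed above.
  public interface PostgresOptionsSketch extends PipelineOptions {
    @Description("Postgres server name")
    String getPostgresServerName();
    void setPostgresServerName(String value);

    @Description("Number of records to write")
    Integer getNumberOfRecords();
    void setNumberOfRecords(Integer value);
  }

  public static void main(String[] args) {
    // testingPipelineOptions() parses the JSON array supplied via the
    // beamTestPipelineOptions system property shown in the command above.
    PostgresOptionsSketch options =
        TestPipeline.testingPipelineOptions().as(PostgresOptionsSketch.class);
    System.out.println("postgresServerName=" + options.getPostgresServerName());
    System.out.println("numberOfRecords=" + options.getNumberOfRecords());
  }
}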

Gradle Test Executor 1 started executing tests.

> Task :sdks:java:io:hadoop-format:integrationTest

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in 
[jar:<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.29.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in 
[jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in 
[jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/ch.qos.logback/logback-classic/1.1.3/d90276fff414f06cb375f2057f6778cd63c6082f/logback-classic-1.1.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in 
[jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-log4j12/1.7.30/c21f55139d8141d2231214fb1feaf50a1edca95e/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
    Feb 13, 2021 12:19:57 AM org.apache.beam.sdk.io.common.IOITHelper 
executeWithRetry
    WARNING: Attempt #1 of 3 failed: The connection attempt failed..
    Feb 13, 2021 12:19:57 AM org.apache.beam.sdk.io.common.IOITHelper 
executeWithRetry
    WARNING: Retrying in 2000 ms.
    Feb 13, 2021 12:20:09 AM org.apache.beam.sdk.io.common.IOITHelper 
executeWithRetry
    WARNING: Attempt #2 of 3 failed: The connection attempt failed..
    Feb 13, 2021 12:20:09 AM org.apache.beam.sdk.io.common.IOITHelper 
executeWithRetry
    WARNING: Retrying in 4000 ms.
    Feb 13, 2021 12:20:23 AM org.apache.beam.sdk.io.common.IOITHelper 
executeWithRetry
    WARNING: Attempt #3 of 3 failed: The connection attempt failed..
    Feb 13, 2021 12:20:33 AM org.apache.beam.sdk.io.common.IOITHelper 
executeWithRetry
    WARNING: Attempt #1 of 3 failed: The connection attempt failed..
    Feb 13, 2021 12:20:33 AM org.apache.beam.sdk.io.common.IOITHelper 
executeWithRetry
    WARNING: Retrying in 2000 ms.
    Feb 13, 2021 12:20:45 AM org.apache.beam.sdk.io.common.IOITHelper 
executeWithRetry
    WARNING: Attempt #2 of 3 failed: The connection attempt failed..
    Feb 13, 2021 12:20:45 AM org.apache.beam.sdk.io.common.IOITHelper 
executeWithRetry
    WARNING: Retrying in 4000 ms.
    Feb 13, 2021 12:20:59 AM org.apache.beam.sdk.io.common.IOITHelper 
executeWithRetry
    WARNING: Attempt #3 of 3 failed: The connection attempt failed..
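
The three attempts and the 2000 ms / 4000 ms waits above come from Beam's IOITHelper.executeWithRetry. As a rough illustrative sketch only (constants and structure are inferred from this log, not copied from Beam's source), the retry-with-doubling-backoff pattern looks like this:

public final class RetrySketch {

  @FunctionalInterface
  public interface ThrowingRunnable {
    void run() throws Exception;
  }

  public static void executeWithRetry(ThrowingRunnable action) throws Exception {
    int maxAttempts = 3;   // "Attempt #N of 3" in the log above
    long waitMs = 2000L;   // first backoff, doubled on each retry
    for (int attempt = 1; attempt <= maxAttempts; attempt++) {
      try {
        action.run();
        return;
      } catch (Exception e) {
        System.err.printf(
            "Attempt #%d of %d failed: %s%n", attempt, maxAttempts, e.getMessage());
        if (attempt == maxAttempts) {
          throw e; // give up after the final attempt, as the test did here
        }
        System.err.printf("Retrying in %d ms.%n", waitMs);
        Thread.sleep(waitMs);
        waitMs *= 2; // 2000 ms, then 4000 ms, matching the warnings above
      }
    }
  }
}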

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:hadoop-format:integrationTest FAILED

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > classMethod FAILED
    org.postgresql.util.PSQLException: The connection attempt failed.
        at 
org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:315)
        at 
org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51)
        at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:225)
        at org.postgresql.Driver.makeConnection(Driver.java:465)
        at org.postgresql.Driver.connect(Driver.java:264)
        at java.sql.DriverManager.getConnection(DriverManager.java:664)
        at java.sql.DriverManager.getConnection(DriverManager.java:247)
        at 
org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:103)
        at 
org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:87)
        at 
org.apache.beam.sdk.io.common.DatabaseTestHelper.createTable(DatabaseTestHelper.java:45)
        at 
org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.createTable(HadoopFormatIOIT.java:154)
        at 
org.apache.beam.sdk.io.common.IOITHelper.executeWithRetry(IOITHelper.java:89)
        at 
org.apache.beam.sdk.io.common.IOITHelper.executeWithRetry(IOITHelper.java:69)
        at 
org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.setUp(HadoopFormatIOIT.java:149)

        Caused by:
        java.net.SocketTimeoutException: connect timed out
            at java.net.PlainSocketImpl.socketConnect(Native Method)
            at 
java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
            at 
java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
            at 
java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
            at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
            at java.net.Socket.connect(Socket.java:607)
            at org.postgresql.core.PGStream.createSocket(PGStream.java:231)
            at org.postgresql.core.PGStream.<init>(PGStream.java:95)
            at 
org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:98)
            at 
org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:213)
            ... 13 more
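
The trace above runs from BaseDataSource.getConnection into DatabaseTestHelper.createTable. As an illustrative sketch only (an assumption about the general shape of that helper, not Beam's actual code), the failing path is roughly:

import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;
import org.postgresql.ds.PGSimpleDataSource;

public class CreateTableSketch {

  public static void createTable(PGSimpleDataSource dataSource, String tableName)
      throws SQLException {
    // getConnection() is where the PSQLException above is thrown when the
    // server cannot be reached before the connect timeout expires.
    try (Connection connection = dataSource.getConnection();
        Statement statement = connection.createStatement()) {
      statement.execute(
          "CREATE TABLE " + tableName + " (id INT, name VARCHAR(255))");
    }
  }
}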

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > classMethod FAILED
    org.postgresql.util.PSQLException: The connection attempt failed.
        at 
org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:315)
        at 
org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51)
        at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:225)
        at org.postgresql.Driver.makeConnection(Driver.java:465)
        at org.postgresql.Driver.connect(Driver.java:264)
        at java.sql.DriverManager.getConnection(DriverManager.java:664)
        at java.sql.DriverManager.getConnection(DriverManager.java:247)
        at 
org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:103)
        at 
org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:87)
        at 
org.apache.beam.sdk.io.common.DatabaseTestHelper.deleteTable(DatabaseTestHelper.java:63)
        at 
org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.deleteTable(HadoopFormatIOIT.java:197)
        at 
org.apache.beam.sdk.io.common.IOITHelper.executeWithRetry(IOITHelper.java:89)
        at 
org.apache.beam.sdk.io.common.IOITHelper.executeWithRetry(IOITHelper.java:69)
        at 
org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.tearDown(HadoopFormatIOIT.java:190)

        Caused by:
        java.net.SocketTimeoutException: connect timed out
            at java.net.PlainSocketImpl.socketConnect(Native Method)
            at 
java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
            at 
java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
            at 
java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
            at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
            at java.net.Socket.connect(Socket.java:607)
            at org.postgresql.core.PGStream.createSocket(PGStream.java:231)
            at org.postgresql.core.PGStream.<init>(PGStream.java:95)
            at 
org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:98)
            at 
org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:213)
            ... 13 more
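
Both failures reduce to a SocketTimeoutException while opening a TCP connection to the Postgres endpoint configured above (35.223.51.95:5432). A minimal standalone probe like the following, using only the JDK, can help confirm whether the worker can reach that endpoint at all; the 10-second timeout is an arbitrary choice for illustration.

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PostgresProbe {
  public static void main(String[] args) {
    String host = "35.223.51.95"; // --postgresServerName from the options above
    int port = 5432;              // --postgresPort from the options above
    try (Socket socket = new Socket()) {
      // A short connect timeout mirrors the driver behaviour in the trace.
      socket.connect(new InetSocketAddress(host, port), 10_000);
      System.out.println("TCP connect succeeded");
    } catch (IOException e) {
      System.out.println("TCP connect failed: " + e);
    }
  }
}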

2 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest>
Watching 1566 directories to track changes
Watching 1572 directories to track changes
Watching 1573 directories to track changes
:sdks:java:io:hadoop-format:integrationTest (Thread[Daemon ****,5,main]) 
completed. Took 1 mins 17.728 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:hadoop-format:integrationTest'.
> There were failing tests. See the report at: 
> <https://ci-beam.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 53s
123 actionable tasks: 75 executed, 48 from cache
Watching 1573 directories to track changes

Publishing build scan...
https://gradle.com/s/5poo36g7shy5s

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
