See 
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/2048/display/redirect?page=changes>

Changes:

[samuelw] [BEAM-11910] Increase the bag page limit for continuation pages

[Ismaël Mejía] [BEAM-9282] Move structured streaming runner into Spark 2 
specific

[Ismaël Mejía] [BEAM-9282] Separate modules for Spark 2/3

[Ismaël Mejía] [BEAM-9282] Separate modules for Spark 2/3 job-server

[Ismaël Mejía] [BEAM-9282] Separate modules for Spark 2/3 job-server container

[Ismaël Mejía] [BEAM-7092] Run PostCommit tests for Spark 3 module too

[Ismaël Mejía] [BEAM-7092] Update tests invocation for Spark 2 module

[Ismaël Mejía] [BEAM-9283] Add Spark 3 test jobs to the CI (Java 11)

[Ismaël Mejía] [BEAM-11654] Publish Spark 2 and 3 specific Job-Server containers

[Ismaël Mejía] [BEAM-7092] Add paranamer 2.8 license to container (Spark 3 / 
Avro)


------------------------------------------
[...truncated 286.87 KB...]
:runners:google-cloud-dataflow-java:****:legacy-****:shadowJar (Thread[Daemon 
****,5,main]) completed. Took 0.285 secs.

> Task :runners:google-cloud-dataflow-java:compileTestJava FROM-CACHE
Watching 1329 directories to track changes
Watching 1341 directories to track changes
Custom actions are attached to task 
':runners:google-cloud-dataflow-java:compileTestJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' 
is 87485af66c2ed94d2cca847aefc75c06
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date 
because:
  No history is available.
Watching 1341 directories to track changes
Watching 1352 directories to track changes
Watching 1353 directories to track changes
Watching 1354 directories to track changes
Loaded cache entry for task 
':runners:google-cloud-dataflow-java:compileTestJava' with cache key 
87485af66c2ed94d2cca847aefc75c06
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** for 
':',5,main]) completed. Took 0.3 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** for 
':',5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no 
actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** for 
':',5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** for 
':',5,main]) started.

> Task :runners:google-cloud-dataflow-java:testJar
Watching 1354 directories to track changes
Watching 1355 directories to track changes
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
Watching 1355 directories to track changes
file or directory 
'<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/build/resources/test',>
 not found
Watching 1355 directories to track changes
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** for 
':',5,main]) completed. Took 0.052 secs.

> Task :sdks:java:io:synthetic:compileJava
Note: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/synthetic/src/main/java/org/apache/beam/sdk/io/synthetic/SyntheticBoundedSource.java>
 uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Created classpath snapshot for incremental compilation in 1.768 secs. 1 
duplicate classes found in classpath (see all with --debug).
Watching 1364 directories to track changes
Watching 1366 directories to track changes
Watching 1368 directories to track changes
Stored cache entry for task ':sdks:java:io:synthetic:compileJava' with cache 
key bbc524c5d2cf0c8e4d1b8c0ce55fa5ac
:sdks:java:io:synthetic:compileJava (Thread[Execution **** for ':' Thread 
7,5,main]) completed. Took 15.725 secs.
:sdks:java:io:synthetic:classes (Thread[Execution **** for ':' Thread 
7,5,main]) started.

> Task :sdks:java:io:synthetic:classes
Skipping task ':sdks:java:io:synthetic:classes' as it has no actions.
:sdks:java:io:synthetic:classes (Thread[Execution **** for ':' Thread 
7,5,main]) completed. Took 0.0 secs.
:sdks:java:io:synthetic:jar (Thread[Execution **** for ':' Thread 3,5,main]) 
started.

> Task :sdks:java:io:synthetic:jar
Watching 1368 directories to track changes
Watching 1369 directories to track changes
Caching disabled for task ':sdks:java:io:synthetic:jar' because:
  Caching has not been enabled for the task
Task ':sdks:java:io:synthetic:jar' is not up-to-date because:
  No history is available.
Watching 1369 directories to track changes
file or directory 
'<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/synthetic/build/resources/main',>
 not found
Watching 1370 directories to track changes
:sdks:java:io:synthetic:jar (Thread[Execution **** for ':' Thread 3,5,main]) 
completed. Took 0.02 secs.
:sdks:java:io:kafka:compileTestJava (Thread[Execution **** for ':' Thread 
3,5,main]) started.

> Task :sdks:java:io:kafka:compileTestJava
Watching 1370 directories to track changes
Watching 1378 directories to track changes
Custom actions are attached to task ':sdks:java:io:kafka:compileTestJava'.
Build cache key for task ':sdks:java:io:kafka:compileTestJava' is 
4b02179db99d0c26615ee1db692718c3
Task ':sdks:java:io:kafka:compileTestJava' is not up-to-date because:
  No history is available.
Watching 1378 directories to track changes
The input changes require a full rebuild for incremental task 
':sdks:java:io:kafka:compileTestJava'.
Full recompilation is required because no incremental change information is 
available. This is usually caused by clean builds or changing compiler 
arguments.
Compiling with JDK Java compiler API.
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Created classpath snapshot for incremental compilation in 1.023 secs. 910 
duplicate classes found in classpath (see all with --debug).
Watching 1385 directories to track changes
Watching 1386 directories to track changes
Watching 1387 directories to track changes
Stored cache entry for task ':sdks:java:io:kafka:compileTestJava' with cache 
key 4b02179db99d0c26615ee1db692718c3
:sdks:java:io:kafka:compileTestJava (Thread[Execution **** for ':' Thread 
3,5,main]) completed. Took 5.7 secs.
:sdks:java:io:kafka:testClasses (Thread[Execution **** for ':' Thread 
3,5,main]) started.

> Task :sdks:java:io:kafka:testClasses
Skipping task ':sdks:java:io:kafka:testClasses' as it has no actions.
:sdks:java:io:kafka:testClasses (Thread[Execution **** for ':' Thread 
3,5,main]) completed. Took 0.0 secs.
:sdks:java:io:kafka:integrationTest (Thread[Execution **** for ':' Thread 
3,5,main]) started.
Gradle Test Executor 2 started executing tests.

> Task :sdks:java:io:kafka:integrationTest
Watching 1387 directories to track changes
Custom actions are attached to task ':sdks:java:io:kafka:integrationTest'.
Build cache key for task ':sdks:java:io:kafka:integrationTest' is 
96b7d3797a9c334357ddf5d015004c3b
Task ':sdks:java:io:kafka:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Watching 1387 directories to track changes
Starting process 'Gradle Test Executor 2'. Working directory: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka>
 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java 
-DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--runner=DataflowRunner","--sourceOptions={\"numRecords\":\"100000000\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"90\"}","--bigQueryDataset=beam_performance","--bigQueryTable=kafkaioit_results","--influxMeasurement=kafkaioit_results","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--kafkaBootstrapServerAddresses=34.122.229.207:32400,34.68.151.196:32401,35.224.118.99:32402","--kafkaTopic=beam","--readTimeout=900","--numWorkers=5","--autoscalingAlgorithm=NONE","--****HarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.30.0-SNAPSHOT.jar","--region=us-central1";]>
 
-Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/6.8/****Main/gradle-****.jar 
****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 2'
Successfully started process 'Gradle Test Executor 2'

org.apache.beam.sdk.io.kafka.KafkaIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in 
[jar:<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.30.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in 
[jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.io.kafka.KafkaIOIT > 
testKafkaIOReadsAndWritesCorrectlyInBatch STANDARD_ERROR
    Mar 13, 2021 12:31:42 PM org.apache.beam.sdk.coders.SerializableCoder 
checkEqualsMethodDefined
    WARNING: Can't verify serialized elements of type Shard have well defined 
equals method. This may produce incorrect results on some PipelineRunner
    Mar 13, 2021 12:31:43 PM 
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory 
tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: 
dataflow-staging-us-central1-844138762903
    Mar 13, 2021 12:31:44 PM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to 
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP 
framework says request can be retried, (caller responsible for retrying): 
https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
 
    Mar 13, 2021 12:31:44 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Mar 13, 2021 12:31:44 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 225 files. Enable logging at DEBUG level to see 
which files will be staged.
    Mar 13, 2021 12:31:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Mar 13, 2021 12:31:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Mar 13, 2021 12:31:48 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <103499 bytes, hash 
713218ad2982e90af8d5c8a4d4919494f9e4c9454a7ce8216fe0df7c3846a585> to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-cTIYrSmC6Qr41cik1JGUlPnkyUVKfOghb-DffDhGpYU.pb
    Mar 13, 2021 12:31:50 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Mar 13, 2021 12:31:50 PM 
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes 
forFileToStage
    INFO: Staging custom dataflow-****.jar as 
beam-runners-google-cloud-dataflow-java-legacy-****-2.30.0-SNAPSHOT-uXgcIsOvpjCXaws8FmMEfFc_TCNwHEALQP9hQSyI4iI.jar
    Mar 13, 2021 12:31:50 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading /tmp/test5389926878165105445.zip to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/test-1NzzbBIFfUxEmUMmCXCnIxjndY1aMNkfGi6EJGZZ73o.jar
    Mar 13, 2021 12:31:50 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading /tmp/main549428641942857610.zip to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/main-q8ANfbaFkpLcsoH-ne00xSwyhhlw4ZmfeSDJ4mZEHvY.jar

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:io:kafka:integrationTest

org.apache.beam.sdk.io.kafka.KafkaIOIT > 
testKafkaIOReadsAndWritesCorrectlyInBatch FAILED
    java.lang.RuntimeException: Error while staging packages
        at 
org.apache.beam.runners.dataflow.util.PackageUtil.stageClasspathElements(PackageUtil.java:372)
        at 
org.apache.beam.runners.dataflow.util.PackageUtil.stageClasspathElements(PackageUtil.java:238)
        at 
org.apache.beam.runners.dataflow.util.GcsStager.stageFiles(GcsStager.java:53)
        at 
org.apache.beam.runners.dataflow.DataflowRunner.stageArtifacts(DataflowRunner.java:900)
        at 
org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:995)
        at 
org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:202)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:398)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:334)
        at 
org.apache.beam.sdk.io.kafka.KafkaIOIT.testKafkaIOReadsAndWritesCorrectlyInBatch(KafkaIOIT.java:192)

        Caused by:
        java.io.IOException: Error executing batch GCS request
            at 
org.apache.beam.sdk.extensions.gcp.util.GcsUtil.executeBatches(GcsUtil.java:585)
            at 
org.apache.beam.sdk.extensions.gcp.util.GcsUtil.getObjects(GcsUtil.java:314)
            at 
org.apache.beam.sdk.extensions.gcp.storage.GcsFileSystem.matchNonGlobs(GcsFileSystem.java:249)
            at 
org.apache.beam.sdk.extensions.gcp.storage.GcsFileSystem.match(GcsFileSystem.java:102)
            at org.apache.beam.sdk.io.FileSystems.match(FileSystems.java:125)
            at 
org.apache.beam.sdk.io.FileSystems.matchSingleFileSpec(FileSystems.java:189)
            at 
org.apache.beam.runners.dataflow.util.PackageUtil.alreadyStaged(PackageUtil.java:128)
            at 
org.apache.beam.runners.dataflow.util.PackageUtil.stagePackageSynchronously(PackageUtil.java:152)
            at 
org.apache.beam.runners.dataflow.util.PackageUtil.lambda$stagePackage$1(PackageUtil.java:142)
            at 
org.apache.beam.sdk.util.MoreFutures.lambda$supplyAsync$0(MoreFutures.java:107)
            at 
java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1640)
            at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
            at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
            at java.lang.Thread.run(Thread.java:748)

            Caused by:
            java.util.concurrent.ExecutionException: java.io.IOException: 
Premature EOF
                at 
java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
                at 
java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
                at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:60)
                at 
org.apache.beam.sdk.extensions.gcp.util.GcsUtil.executeBatches(GcsUtil.java:577)
                ... 13 more

                Caused by:
                java.io.IOException: Premature EOF
                    at 
sun.net.www.http.ChunkedInputStream.readAheadBlocking(ChunkedInputStream.java:565)
                    at 
sun.net.www.http.ChunkedInputStream.readAhead(ChunkedInputStream.java:609)
                    at 
sun.net.www.http.ChunkedInputStream.read(ChunkedInputStream.java:696)
                    at 
java.io.FilterInputStream.read(FilterInputStream.java:133)
                    at 
sun.net.www.protocol.http.HttpURLConnection$HttpInputStream.read(HttpURLConnection.java:3454)
                    at 
com.google.api.client.http.javanet.NetHttpResponse$SizeValidatingInputStream.read(NetHttpResponse.java:164)
                    at 
java.io.FilterInputStream.read(FilterInputStream.java:133)
                    at 
java.io.FilterInputStream.read(FilterInputStream.java:107)
                    at 
com.google.common.io.ByteStreams.exhaust(ByteStreams.java:274)
                    at 
com.google.api.client.http.ConsumingInputStream.close(ConsumingInputStream.java:40)
                    at 
java.util.zip.InflaterInputStream.close(InflaterInputStream.java:227)
                    at 
java.util.zip.GZIPInputStream.close(GZIPInputStream.java:136)
                    at 
com.google.api.client.googleapis.batch.BatchUnparsedResponse.checkForFinalBoundary(BatchUnparsedResponse.java:303)
                    at 
com.google.api.client.googleapis.batch.BatchUnparsedResponse.parseNextResponse(BatchUnparsedResponse.java:164)
                    at 
com.google.api.client.googleapis.batch.BatchRequest.execute(BatchRequest.java:267)
                    at 
org.apache.beam.sdk.extensions.gcp.util.GcsUtil.lambda$executeBatches$0(GcsUtil.java:573)
                    at 
org.apache.beam.sdk.util.MoreFutures.lambda$runAsync$2(MoreFutures.java:140)
                    ... 4 more

1 test completed, 1 failed
Finished generating test XML results (0.024 secs) into: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>
Watching 1389 directories to track changes
Watching 1395 directories to track changes
Watching 1396 directories to track changes

> Task :sdks:java:io:kafka:integrationTest FAILED
:sdks:java:io:kafka:integrationTest (Thread[Execution **** for ':' Thread 
3,5,main]) completed. Took 21.8 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: 
> file://<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 7s
107 actionable tasks: 66 executed, 41 from cache
Watching 1396 directories to track changes

Publishing build scan...
https://gradle.com/s/yvezqjcrfmqio

Stopped 1 **** daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

