See <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Flink_Batch/301/display/redirect?page=changes>

Changes:

[noreply] Remove permitAll flag from seed & dependency check jenkins jobs (#12319)

[noreply] [BEAM-7390] Add combineperkey code snippets (#12277)

[noreply] Move more files to impl sub-directory (#12302)

[Robert Bradshaw] Update portability status and add some more documentation.

[noreply] [BEAM-10411] Adds an example that uses Python cross-language Kafka

[noreply] [BEAM-10274] Fix translation of json pipeline options. (#12333)

[noreply] [BEAM-10545] Initialize an empty extension (#12327)

[Kenneth Knowles] Add analyzer-friendly checkArgumentNotNull

[Kenneth Knowles] Fix typo in error message in RowWithGetters

[Kenneth Knowles] Improve error message in ApiSurface tests

[Kenneth Knowles] Skip nullness analysis of AutoValue_ classes

[Kenneth Knowles] [BEAM-10547][BEAM-10548] Schema support for all sorts of Nullable and on

[Kenneth Knowles] Migrate to checkerframework nullness annotations

[Kenneth Knowles] [BEAM-10540] Fix nullability in equals methods globally

[noreply] [BEAM-10551] Implement Navigation Functions FIRST_VALUE and LAST_VALUE


------------------------------------------
[...truncated 61.19 KB...]
> Task :sdks:java:core:processResources
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :sdks:java:build-tools:jar
> Task :sdks:java:extensions:sql:copyFmppTemplatesFromCalciteCore
> Task :sdks:java:extensions:sql:copyFmppTemplatesFromSrc
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :sdks:java:extensions:sql:generateFmppSources
> Task :model:pipeline:jar

> Task :sdks:java:extensions:sql:compileJavacc
Java Compiler Compiler Version 4.0 (Parser Generator)
(type "javacc" with no arguments for help)
Reading from file <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Flink_Batch/ws/src/sdks/java/extensions/sql/build/generated/fmpp/javacc/Parser.jj> . . .
Note: UNICODE_INPUT option is specified. Please make sure you create the parser/lexer using a Reader with the correct character encoding.
Warning: Lookahead adequacy checking not being performed since option LOOKAHEAD is more than 1. Set option FORCE_LA_CHECK to true to force checking.

> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes

> Task :sdks:java:extensions:sql:compileJavacc
File "TokenMgrError.java" does not exist.  Will create one.
File "ParseException.java" does not exist.  Will create one.
File "Token.java" does not exist.  Will create one.
File "SimpleCharStream.java" does not exist.  Will create one.
Parser generated with 0 errors and 1 warning.

> Task :sdks:java:extensions:sql:processResources
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :sdks:java:extensions:join-library:compileJava FROM-CACHE
> Task :sdks:java:extensions:join-library:classes UP-TO-DATE
> Task :sdks:java:extensions:join-library:jar
> Task :sdks:java:io:mongodb:compileJava FROM-CACHE
> Task :sdks:java:io:mongodb:classes UP-TO-DATE
> Task :sdks:java:io:hadoop-common:compileJava FROM-CACHE
> Task :sdks:java:io:hadoop-common:classes UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :sdks:java:io:hadoop-common:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:io:mongodb:jar
> Task :runners:local-java:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:io:parquet:compileJava FROM-CACHE
> Task :sdks:java:io:parquet:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:fn-execution:jar
> Task :sdks:java:io:parquet:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :runners:core-construction-java:jar
> Task :sdks:java:core:jar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:java-job-service:compileJava FROM-CACHE
> Task :runners:java-job-service:classes UP-TO-DATE
> Task :runners:java-job-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :runners:flink:1.10:compileJava FROM-CACHE
> Task :runners:flink:1.10:classes
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :runners:flink:1.10:jar
> Task :runners:flink:1.10:job-server:compileJava NO-SOURCE
> Task :runners:flink:1.10:job-server:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:direct-java:shadowJar
> Task :sdks:java:extensions:sql:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:extensions:sql:classes
> Task :sdks:java:extensions:sql:jar

> Task :sdks:java:extensions:sql:zetasql:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:extensions:sql:zetasql:classes
> Task :sdks:java:extensions:sql:zetasql:jar
> Task :sdks:java:extensions:sql:expansion-service:compileJava
> Task :sdks:java:extensions:sql:expansion-service:classes
> Task :sdks:java:extensions:sql:expansion-service:jar
> Task :sdks:java:extensions:sql:expansion-service:shadowJar
> Task :runners:flink:1.10:job-server:shadowJar
> Task :runners:flink:1.10:job-server-container:copyDockerfileDependencies
> Task :runners:flink:1.10:job-server-container:dockerPrepare
> Task :runners:flink:1.10:job-server-container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
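As an illustration, the suggested flag would be appended to the same Gradle invocation that produced this build; the task path below is taken from this log but chosen for illustration only:

    # Re-run with individual deprecation warnings shown (Gradle 5.x syntax):
    ./gradlew :runners:flink:1.10:job-server-container:docker --warning-mode all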

BUILD SUCCESSFUL in 6m 10s
88 actionable tasks: 62 executed, 25 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/6bnqlkq4h726q

[beam_LoadTests_Python_ParDo_Flink_Batch] $ /bin/bash -xe /tmp/jenkins4633510291781088567.sh
+ echo 'Tagging image...'
Tagging image...
[beam_LoadTests_Python_ParDo_Flink_Batch] $ /bin/bash -xe /tmp/jenkins6542301416464777577.sh
+ docker tag gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
[beam_LoadTests_Python_ParDo_Flink_Batch] $ /bin/bash -xe /tmp/jenkins4901552284734032523.sh
+ echo 'Pushing image...'
Pushing image...
[beam_LoadTests_Python_ParDo_Flink_Batch] $ /bin/bash -xe /tmp/jenkins5686840938951589903.sh
+ docker push gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
The push refers to repository [gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server]
b09b5ca020f6: Preparing
e664edcc6285: Preparing
7348ba05c81d: Preparing
c61c572a31b4: Preparing
1323b50ec3b6: Preparing
dec5058e1e26: Preparing
68bb2d422178: Preparing
f5181c7ef902: Preparing
2e5b4ca91984: Preparing
527ade4639e0: Preparing
c2c789d2d3c5: Preparing
8803ef42039d: Preparing
dec5058e1e26: Waiting
68bb2d422178: Waiting
f5181c7ef902: Waiting
2e5b4ca91984: Waiting
527ade4639e0: Waiting
c2c789d2d3c5: Waiting
8803ef42039d: Waiting
b09b5ca020f6: Pushed
e664edcc6285: Pushed
7348ba05c81d: Pushed
dec5058e1e26: Layer already exists
2e5b4ca91984: Layer already exists
68bb2d422178: Layer already exists
f5181c7ef902: Layer already exists
527ade4639e0: Layer already exists
8803ef42039d: Layer already exists
c2c789d2d3c5: Layer already exists
1323b50ec3b6: Pushed
c61c572a31b4: Pushed
latest: digest: sha256:cb4e5d0bcb8faf382c8a21dd860512acb1942fcea5756a60e13f3cb6d18e1e9c size: 2841
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
CLUSTER_NAME=beam-loadtests-python-pardo-flink-batch-301
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/beam_python3.7_sdk:latest
FLINK_NUM_WORKERS=5
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-python-pardo-flink-batch-301
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
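A minimal sketch of reproducing the next build step by hand, assuming the injected properties are exported as ordinary environment variables (variable names and values are copied from the list above; the export form is an assumption about how EnvInject exposes them):

    export CLUSTER_NAME=beam-loadtests-python-pardo-flink-batch-301
    export GCLOUD_ZONE=us-central1-a
    export FLINK_NUM_WORKERS=5
    export FLINK_TASKMANAGER_SLOTS=1
    # flink_cluster.sh reads these variables, as the trace below shows:
    cd src/.test-infra/dataproc && ./flink_cluster.sh create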
[beam_LoadTests_Python_ParDo_Flink_Batch] $ /bin/bash -xe /tmp/jenkins6833731207758399780.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Python_ParDo_Flink_Batch] $ /bin/bash -xe /tmp/jenkins5409566013515308562.sh
+ cd <https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-python-pardo-flink-batch-301-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.7 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_python3.7_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_python3.7_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-python-pardo-flink-batch-301 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_python3.7_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
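For readability, the same cluster-create call reflowed with one flag per line (all values are copied from the trace above; treat this as an illustrative sketch, not a verbatim replay):

    gcloud dataproc clusters create beam-loadtests-python-pardo-flink-batch-301 \
      --region=global \
      --num-workers=6 \
      --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh \
      --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_python3.7_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest \
      --image-version=1.2 \
      --zone=us-central1-a \
      --quiet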
Waiting on operation [projects/apache-beam-testing/regions/global/operations/1d40edcd-2005-3df1-92ea-4ed6f57fd615].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
...done.
ERROR: (gcloud.dataproc.clusters.create) Operation [projects/apache-beam-testing/regions/global/operations/1d40edcd-2005-3df1-92ea-4ed6f57fd615] failed: Initialization action failed. Failed action 'gs://beam-flink-cluster/init-actions/docker.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/d7adb09c-a6a3-40af-99ab-9319085e3758/beam-loadtests-python-pardo-flink-batch-301-w-5/dataproc-initialization-script-0_output.
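The root cause lives in the init-action output the error points to; a minimal sketch of retrieving it (the path is copied verbatim from the error above, and assumes read access to the bucket):

    gsutil cat gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/d7adb09c-a6a3-40af-99ab-9319085e3758/beam-loadtests-python-pardo-flink-batch-301-w-5/dataproc-initialization-script-0_output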
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
