See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Chicago_Taxi_Flink/182/display/redirect?page=changes>

Changes:

[piotr.szuberski] [BEAM-10136] Add JdbcIO Write Cross-language transform

[piotr.szuberski] [BEAM-10135] Add JdbcIO Read Cross-language transform

[piotr.szuberski] [BEAM-10135][BEAM-10136] Add Python wrapper for Cross-language JdbcIO

[piotr.szuberski] [BEAM-10135][BEAM-10136] Add integration tests for JdbcIO python wrapper

[piotr.szuberski] [BEAM-10135][BEAM-10136] Add JdbcIO python wrapper integration tests to

[piotr.szuberski] [BEAM-10171] Update the website with JdbcIO cross-language support

[je.ik] [BEAM-10533] Remove watermark hold from RequiresTimeSortedInput

[Damian Gadomski] Fix uploading wheels to GCS (on github actions)


------------------------------------------
[...truncated 57.57 KB...]
> Task :sdks:java:io:hadoop-common:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:io:mongodb:processResources NO-SOURCE
> Task :sdks:java:io:parquet:processResources NO-SOURCE
> Task :sdks:java:extensions:sql:zetasql:processResources NO-SOURCE
> Task :sdks:java:extensions:sql:expansion-service:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :model:job-management:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :runners:flink:1.10:job-server-container:copyLicenses
> Task :model:fn-execution:processResources
> Task :runners:flink:1.10:job-server-container:dockerClean UP-TO-DATE
> Task :runners:flink:1.10:copySourceOverrides
> Task :runners:flink:1.10:copyTestResourcesOverrides NO-SOURCE
> Task :runners:flink:1.10:processResources
> Task :sdks:java:build-tools:compileJava FROM-CACHE
> Task :sdks:java:core:processResources
> Task :sdks:java:build-tools:processResources
> Task :sdks:java:build-tools:classes
> Task :sdks:java:build-tools:jar
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :sdks:java:extensions:sql:copyFmppTemplatesFromCalciteCore
> Task :sdks:java:extensions:sql:copyFmppTemplatesFromSrc
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :sdks:java:extensions:sql:generateFmppSources
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes

> Task :sdks:java:extensions:sql:compileJavacc
Java Compiler Compiler Version 4.0 (Parser Generator)
(type "javacc" with no arguments for help)
Reading from file <https://ci-beam.apache.org/job/beam_PostCommit_Python_Chicago_Taxi_Flink/ws/src/sdks/java/extensions/sql/build/generated/fmpp/javacc/Parser.jj> . . .
Note: UNICODE_INPUT option is specified. Please make sure you create the parser/lexer using a Reader with the correct character encoding.
Warning: Lookahead adequacy checking not being performed since option LOOKAHEAD is more than 1. Set option FORCE_LA_CHECK to true to force checking.
File "TokenMgrError.java" does not exist.  Will create one.
File "ParseException.java" does not exist.  Will create one.
File "Token.java" does not exist.  Will create one.
File "SimpleCharStream.java" does not exist.  Will create one.
Parser generated with 0 errors and 1 warnings.

> Task :sdks:java:extensions:sql:processResources
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :sdks:java:io:hadoop-common:compileJava FROM-CACHE
> Task :sdks:java:io:hadoop-common:classes UP-TO-DATE
> Task :sdks:java:io:mongodb:compileJava FROM-CACHE
> Task :sdks:java:io:mongodb:classes UP-TO-DATE
> Task :sdks:java:io:hadoop-common:jar
> Task :sdks:java:extensions:join-library:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:join-library:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :sdks:java:extensions:join-library:jar
> Task :runners:local-java:jar
> Task :sdks:java:io:mongodb:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:io:parquet:compileJava FROM-CACHE
> Task :sdks:java:io:parquet:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:io:parquet:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:fn-execution:jar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :sdks:java:extensions:protobuf:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:core:jar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :runners:java-job-service:compileJava FROM-CACHE
> Task :runners:java-job-service:classes UP-TO-DATE
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :runners:java-job-service:jar
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :runners:flink:1.10:compileJava FROM-CACHE
> Task :runners:flink:1.10:classes
> Task :runners:flink:1.10:jar
> Task :runners:flink:1.10:job-server:compileJava NO-SOURCE
> Task :runners:flink:1.10:job-server:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:direct-java:shadowJar
> Task :sdks:java:extensions:sql:compileJava FROM-CACHE
> Task :sdks:java:extensions:sql:classes
> Task :sdks:java:extensions:sql:jar
> Task :sdks:java:extensions:sql:zetasql:compileJava FROM-CACHE
> Task :sdks:java:extensions:sql:zetasql:classes UP-TO-DATE
> Task :sdks:java:extensions:sql:zetasql:jar
> Task :sdks:java:extensions:sql:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:extensions:sql:expansion-service:classes UP-TO-DATE
> Task :sdks:java:extensions:sql:expansion-service:jar
> Task :sdks:java:extensions:sql:expansion-service:shadowJar
> Task :runners:flink:1.10:job-server:shadowJar
> Task :runners:flink:1.10:job-server-container:copyDockerfileDependencies
> Task :runners:flink:1.10:job-server-container:dockerPrepare
> Task :runners:flink:1.10:job-server-container:docker

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 2m 10s
88 actionable tasks: 59 executed, 28 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/a3ltdip6uvor2

[beam_PostCommit_Python_Chicago_Taxi_Flink] $ /bin/bash -xe /tmp/jenkins1235145712456238515.sh
+ echo 'Tagging image...'
Tagging image...
[beam_PostCommit_Python_Chicago_Taxi_Flink] $ /bin/bash -xe /tmp/jenkins8508913594170244790.sh
+ docker tag gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
[beam_PostCommit_Python_Chicago_Taxi_Flink] $ /bin/bash -xe /tmp/jenkins6419822655106591951.sh
+ echo 'Pushing image...'
Pushing image...
[beam_PostCommit_Python_Chicago_Taxi_Flink] $ /bin/bash -xe /tmp/jenkins4293351302552606575.sh
+ docker push gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
The push refers to repository [gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server]
a757d51e295a: Preparing
f3ffdff0af1e: Preparing
efa879193e44: Preparing
1208198de2bf: Preparing
836105ffc707: Preparing
dec5058e1e26: Preparing
68bb2d422178: Preparing
f5181c7ef902: Preparing
2e5b4ca91984: Preparing
527ade4639e0: Preparing
c2c789d2d3c5: Preparing
8803ef42039d: Preparing
68bb2d422178: Waiting
f5181c7ef902: Waiting
2e5b4ca91984: Waiting
527ade4639e0: Waiting
dec5058e1e26: Waiting
c2c789d2d3c5: Waiting
efa879193e44: Pushed
f3ffdff0af1e: Pushed
a757d51e295a: Pushed
dec5058e1e26: Layer already exists
68bb2d422178: Layer already exists
f5181c7ef902: Layer already exists
2e5b4ca91984: Layer already exists
527ade4639e0: Layer already exists
c2c789d2d3c5: Layer already exists
8803ef42039d: Layer already exists
836105ffc707: Pushed
1208198de2bf: Pushed
latest: digest: sha256:97e50311ea984f64c1b2b31606a5bb75ab9b9da571c447f136248e66c6ed2008 size: 2841
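
A quick sanity check that the :latest tag really points at the digest reported above can be done with a standard gcloud command; this is not part of the job itself, just a manual verification sketch (assumes read access to the apache-beam-testing project):

  gcloud container images list-tags \
      gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server \
      --filter='tags:latest' --format='get(digest)'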
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
CLUSTER_NAME=beam-postcommit-python-chicago-taxi-flink-182
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/beam_python2.7_sdk:latest
FLINK_NUM_WORKERS=5
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-postcommit-python-chicago-taxi-flink-182
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
[beam_PostCommit_Python_Chicago_Taxi_Flink] $ /bin/bash -xe /tmp/jenkins7250729996410507446.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_PostCommit_Python_Chicago_Taxi_Flink] $ /bin/bash -xe /tmp/jenkins2598013809780659027.sh
+ cd <https://ci-beam.apache.org/job/beam_PostCommit_Python_Chicago_Taxi_Flink/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-postcommit-python-chicago-taxi-flink-182-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
/ [0 files][    0.0 B/  2.3 KiB]                                                
/ [1 files][  2.3 KiB/  2.3 KiB]                                                
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
/ [1 files][  2.3 KiB/  6.0 KiB]                                                
/ [2 files][  6.0 KiB/  6.0 KiB]                                                
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
/ [2 files][  6.0 KiB/ 13.7 KiB]                                                
/ [3 files][ 13.7 KiB/ 13.7 KiB]                                                
Operation completed over 3 objects/13.7 KiB.                                    
 
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_python2.7_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_python2.7_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-postcommit-python-chicago-taxi-flink-182 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.10.1/flink-1.10.1-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-9.0/flink-shaded-hadoop-2-uber-2.8.3-9.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_python2.7_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.10_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/9757e2ff-3b02-3209-9ce3-a3e8063fdcc2].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
......................................................................................................................................................................................................................................................done.
ERROR: (gcloud.dataproc.clusters.create) Operation [projects/apache-beam-testing/regions/global/operations/9757e2ff-3b02-3209-9ce3-a3e8063fdcc2] failed: Initialization action failed. Failed action 'gs://beam-flink-cluster/init-actions/docker.sh', see output in: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/97e68762-3a1c-42bc-8a76-7778baedd75e/beam-postcommit-python-chicago-taxi-flink-182-w-1/dataproc-initialization-script-0_output.
Build step 'Execute shell' marked build as failure
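
For triage, the failing docker.sh init action's log can be read straight from the GCS path reported in the error above, and any partially created cluster can be removed with the standard Dataproc delete command. Both are manual follow-up steps (assuming read access to the Dataproc staging bucket and the apache-beam-testing project), not part of the job:

  # Inspect the docker.sh init-action output from the failed worker
  gsutil cat gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/google-cloud-dataproc-metainfo/97e68762-3a1c-42bc-8a76-7778baedd75e/beam-postcommit-python-chicago-taxi-flink-182-w-1/dataproc-initialization-script-0_output

  # Tear down the cluster if it was left behind after the failure
  gcloud dataproc clusters delete beam-postcommit-python-chicago-taxi-flink-182 --region=global --quiet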

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
