See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/242/display/redirect?page=changes>

Changes:

[kedin] [SQL] Replace planner.compilePipeline() with sqlEnv.parseQuery()

[kedin] [SQL] Add sqlEnv.executeDdl()

[kedin] [SQL] Make planner package-private

[kedin] [SQL] Add factory methods to BeamSqlEnv

[kedin] [SQL] Rename ReadOnlyTableProvider

[cademarkegard] [BEAM-4303] Enforce ErrorProne analysis in examples project

[kedin] [SQL] Wrap SQL parsing exceptions in ParseException

[tgroh] Add an abstraction for State and Timers

[tgroh] DirectRunner Cleanups

[tgroh] Link up the Portable DirectRunner

[tgroh] Reuse ID Generators across Environments

[aromanenko.dev] [BEAM-4421] Fix for issue with reading s3 files using ParquetIO

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e33da0da577b062d796dc33032214cd4846092b4 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e33da0da577b062d796dc33032214cd4846092b4
Commit message: "Merge pull request #5530: [SQL] BeamSqlEnv refactor"
 > git rev-list --no-walk 29066a4b8fc4e0dd4b14b1373e2dc35c28e2e8e0 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins8373307969922470032.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4552011880182804144.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-242>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4418684147303798074.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-242> create namespace filebasedioithdfs-242
Error from server (AlreadyExists): namespaces "filebasedioithdfs-242" already exists
Build step 'Execute shell' marked build as failure
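The build fails because `kubectl create namespace` is not idempotent: it exits non-zero when the namespace already exists, and the Jenkins step runs under `bash -xe`, so any non-zero exit aborts the step. A minimal sketch of an exists-tolerant variant is below. Since `kubectl` is not runnable here, the hypothetical `kubectl_stub_create` function stands in for the real `kubectl --kubeconfig=... create namespace` call and reproduces its AlreadyExists failure; in the actual job the same `|| true` pattern (or a `kubectl get namespace` check first) would apply to the real command.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Stand-in for `kubectl create namespace`: fails with AlreadyExists when the
# namespace is already present, mirroring the error in the log above.
existing_namespaces="filebasedioithdfs-242"

kubectl_stub_create() {
  local ns="$1"
  case " ${existing_namespaces} " in
    *" ${ns} "*)
      echo "Error from server (AlreadyExists): namespaces \"${ns}\" already exists" >&2
      return 1
      ;;
    *)
      existing_namespaces="${existing_namespaces} ${ns}"
      ;;
  esac
}

# Treat AlreadyExists as success so a `bash -xe` step does not mark the
# build failed; with the real CLI this would wrap the kubectl call instead.
ensure_namespace() {
  kubectl_stub_create "$1" || true
}

ensure_namespace "filebasedioithdfs-242"   # already exists: tolerated
ensure_namespace "filebasedioithdfs-243"   # new: recorded by the stub
echo "namespaces ready"
```

The trade-off of `|| true` is that it also swallows genuine failures (e.g. auth errors); a stricter variant would inspect the error output and only tolerate the AlreadyExists case.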
