See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/211/display/redirect?page=changes>

Changes:

[valentyn] Install SDK after tarball is generated to avoid a race in proto stubs

[ehudm] junitxml_report: Add failure tag support

[jeff] BEAM-8745 More fine-grained controls for the size of a BigQuery Load job

[suztomo] google_auth_version 0.19.0

[kcweaver] [BEAM-9070] tests use absolute paths for job server jars

[apilloud] [BEAM-9027] Unparse DOY/DOW/WEEK Enums properly for ZetaSQL

[kcweaver] [BEAM-8337] Hard-code Flink versions.

[echauchot] [BEAM-9019] Remove BeamCoderWrapper to avoid extra object allocation and

[lukecwik] [BEAM-8624] Implement Worker Status FnService in Dataflow runner

[github] [BEAM-5605] Add support for executing pair with restriction, split

[kcweaver] fix indentation

[kcweaver] Update release guide

[lostluck] [BEAM-9080] Support KVs in the Go SDK's Partition

[github] Rephrasing lull logging to avoid alarming users (#10446)

[robertwb] [BEAM-8575] Added counter tests for CombineFn (#10190)

[github] [BEAM-8490] Fix instance_to_type for empty containers (#9894)

[apilloud] [BEAM-9027] Backport BigQuerySqlDialect fixes

[robertwb] [BEAM-8575] Test hot-key fanout with accumulation modes. (#10159)

[github] [BEAM-9059] Use string constants in PTransformTranslation instead of


------------------------------------------
[...truncated 274.21 KB...]
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:job_endpoint:v1"
    value {
      string_value: "localhost:8099"
    }
  }
  fields {
    key: "beam:option:job_name:v1"
    value {
      string_value: "load_tests_Python_Flink_Batch_GBK_3_0111100245"
    }
  }
  fields {
    key: "beam:option:job_port:v1"
    value {
      string_value: "0"
    }
  }
  fields {
    key: "beam:option:job_server_timeout:v1"
    value {
      string_value: "60"
    }
  }
  fields {
    key: "beam:option:load_balance_bundles:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:no_auth:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:object_reuse:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:parallelism:v1"
    value {
      string_value: "5"
    }
  }
  fields {
    key: "beam:option:pipeline_type_check:v1"
    value {
      bool_value: true
    }
  }
  fields {
    key: "beam:option:profile_cpu:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_memory:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:profile_sample_rate:v1"
    value {
      number_value: 1.0
    }
  }
  fields {
    key: "beam:option:project:v1"
    value {
      string_value: "apache-beam-testing"
    }
  }
  fields {
    key: "beam:option:retain_docker_containers:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:runtime_type_check:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:save_main_session:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:sdk_location:v1"
    value {
      string_value: "container"
    }
  }
  fields {
    key: "beam:option:sdk_worker_parallelism:v1"
    value {
      string_value: "1"
    }
  }
  fields {
    key: "beam:option:shutdown_sources_on_final_watermark:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:spark_master_url:v1"
    value {
      string_value: "local[4]"
    }
  }
  fields {
    key: "beam:option:spark_submit_uber_jar:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:streaming:v1"
    value {
      bool_value: false
    }
  }
  fields {
    key: "beam:option:type_check_strictness:v1"
    value {
      string_value: "DEFAULT_TO_ANY"
    }
  }
  fields {
    key: "beam:option:update:v1"
    value {
      bool_value: false
    }
  }
}
job_name: "job"
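
(For context, the struct dumped above is the pipeline-options payload sent to the job service for this load test. Below is a minimal, illustrative sketch of how roughly equivalent options could be built with the Python SDK; the flag values simply mirror the dumped fields and the runner flag is only implied by the portable_runner log lines, so this is not the actual load-test harness code.)

from apache_beam.options.pipeline_options import PipelineOptions

# Illustrative reconstruction of the options shown in the struct above.
options = PipelineOptions([
    '--runner=PortableRunner',        # implied by the portable_runner log lines
    '--job_endpoint=localhost:8099',  # beam:option:job_endpoint:v1
    '--job_name=load_tests_Python_Flink_Batch_GBK_3_0111100245',
    '--parallelism=5',                # beam:option:parallelism:v1
    '--sdk_location=container',       # beam:option:sdk_location:v1
    '--project=apache-beam-testing',  # beam:option:project:v1
])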

apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: e87fbc02178af84fed0d295fca910f84)
        at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
        at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
        at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
        at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
        at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
        at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
        at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
        at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
        at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
        at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
        at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
        at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
        at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
        ... 16 more
Caused by: java.util.concurrent.ExecutionException: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
        at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
        at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
        at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
        at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:345)
        at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:204)
        at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:204)
        at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
        at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:504)
        at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
        at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
        at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
        ... 1 more
Caused by: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Status.asRuntimeException(Status.java:524)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onCancel(ServerCalls.java:273)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.PartialForwardingServerCallListener.onCancel(PartialForwardingServerCallListener.java:40)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:23)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ForwardingServerCallListener$SimpleForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:40)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Contexts$ContextualizedServerCallListener.onCancel(Contexts.java:96)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.closed(ServerCallImpl.java:337)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1Closed.runInContext(ServerImpl.java:793)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        ... 1 more

root: ERROR: org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 60.205s

FAILED (errors=1)

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 9s
5 actionable tasks: 3 executed, 2 up-to-date

Publishing build scan...
https://gradle.com/s/udykecg74ihnq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
