Thanks Chris. That's useful to know.

I cross-posted the question to SO [1] and was told that the job failed because I had forgotten to enable the Dataflow API for this project in the Cloud Console (PEBCAK).

1. https://stackoverflow.com/questions/41815519/cloud-dataflow-job-failed-without-reason/


On 23 Jan 2017, at 23:34, Chris Fei wrote:
I believe that documentation refers to the 1.X releases of the Google Dataflow SDK. BlockingDataflowPipelineRunner exists there, but it doesn't exist in Beam 0.4.0. Take a look at the breaking changes for the 2.X releases at https://cloud.google.com/dataflow/release-notes/release-notes-java-2, which outline how to migrate away from BlockingDataflowPipelineRunner.

You should just be able to specify DataflowRunner in your options, and then add the --blockOnRun option.
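For what it's worth, a minimal sketch of that approach might look like the following (class and option names assume the Beam 0.4.0-era Dataflow runner API, and the actual pipeline construction is elided):

// Sketch only: parse the options, force the Dataflow runner, and block
// on completion with waitUntilFinish() rather than relying on
// BlockingDataflowPipelineRunner.
import org.apache.beam.runners.dataflow.DataflowRunner;
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.PipelineResult;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class BlockingRunExample {
  public static void main(String[] args) {
    // Picks up --project, --tempLocation, --runner, etc. from the command line.
    DataflowPipelineOptions options = PipelineOptionsFactory.fromArgs(args)
        .withValidation()
        .as(DataflowPipelineOptions.class);
    options.setRunner(DataflowRunner.class);

    Pipeline pipeline = Pipeline.create(options);
    // ... apply transforms here ...

    // Run and wait for the job to finish (or fail) before returning.
    PipelineResult result = pipeline.run();
    result.waitUntilFinish();
  }
}

Passing --runner=DataflowRunner on the command line instead of calling setRunner() should work equally well, since PipelineOptionsFactory resolves the runner from the flags.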

Chris


On Mon, Jan 23, 2017, at 05:03 PM, Gareth Western wrote:

Thank you. How does one use the BlockingDataflowPipelineRunner with Beam?

Specifying it in the PipelineOptions results in an Exception with a message stating that it is not one of the supported pipeline runners:

"Exception in thread "main" java.lang.IllegalArgumentException: Class 'com.google.cloud.dataflow.sdk.runners.BlockingDataflowPipelineRunner' does not implement PipelineRunner. Supported pipeline runners [DataflowRunner, DirectRunner, FlinkRunner, TestFlinkRunner]"


On 23 Jan 2017, at 22:31, Lukasz Cwik wrote:
Please take a look at https://cloud.google.com/dataflow/pipelines/logging#monitoring-pipeline-logs

On Mon, Jan 23, 2017 at 12:21 PM, Gareth Western <[email protected]> wrote:

    I'm having trouble running my pipeline using the Dataflow
    runner. The job is submitted successfully:

    Dataflow SDK version: 0.4.0
    Submitted job: 2017-01-23_12_17_20-13351949104581541182

    but in the Dataflow console it simply says "Workflow failed". I
    can't seem to find any more details regarding the cause of the
    failure. Where should I be looking?


