[ https://issues.apache.org/jira/browse/BEAM-291?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15286608#comment-15286608 ]

Aljoscha Krettek commented on BEAM-291:
---------------------------------------

Also, I think {{PDone}} is never meant to be used as the output type of a {{DoFn}} or 
of any operation that produces data. It is just the {{OutputT}} of a 
{{PTransform}}, used to signify that the transform is a terminal operation.
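To illustrate, here is a minimal sketch of a hypothetical terminal sink: the {{DoFn}} inside it emits nothing, and {{PDone}} only appears as the {{OutputT}} of the enclosing {{PTransform}}. The sketch is written against a recent Beam Java SDK ({{expand}} and {{@ProcessElement}}; at the time of this report the override may still have been named {{apply}}), and {{LogSink}} is a made-up name:

{code:java}
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.PTransform;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.PDone;

// Hypothetical terminal sink: PDone is the OutputT of the PTransform,
// never an element type produced by the DoFn inside it.
public class LogSink extends PTransform<PCollection<String>, PDone> {
  @Override
  public PDone expand(PCollection<String> input) {
    // The DoFn writes elements somewhere and emits nothing downstream.
    input.apply("WriteToLog", ParDo.of(new DoFn<String, Void>() {
      @ProcessElement
      public void processElement(ProcessContext c) {
        System.out.println(c.element());
      }
    }));
    // Returning PDone marks the composite as a terminal operation.
    return PDone.in(input.getPipeline());
  }
}
{code}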

> PDone type translation fails
> ----------------------------
>
>                 Key: BEAM-291
>                 URL: https://issues.apache.org/jira/browse/BEAM-291
>             Project: Beam
>          Issue Type: Bug
>          Components: runner-flink
>            Reporter: Maximilian Michels
>            Assignee: Maximilian Michels
>
> The {{PDone}} output type is currently not supported by the Flink Runner 
> because it doesn't have an associated Coder. This could also get in the way 
> when translating native Beam sinks, which would likely return PDone.
> The simplest solution is to create a dummy PDone coder. Alternatively, we 
> could check for the PDone return type during translation and not retrieve the 
> coder at all (see the sketch after the stack trace below).
> {noformat}
> Exception in thread "main" java.lang.IllegalStateException: Unable to return a default Coder for AnonymousParDo.out [PCollection]. Correct one of the following root causes:
>   No Coder has been manually specified;  you may do so using .setCoder().
>   Inferring a Coder from the CoderRegistry failed: Unable to provide a default Coder for org.apache.beam.sdk.values.PDone. Correct one of the following root causes:
>   Building a Coder using a registered CoderFactory failed: Cannot provide coder based on value with class org.apache.beam.sdk.values.PDone: No CoderFactory has been registered for the class.
>   Building a Coder from the @DefaultCoder annotation failed: Class org.apache.beam.sdk.values.PDone does not have a @DefaultCoder annotation.
>   Building a Coder from the fallback CoderProvider failed: Cannot provide coder for type org.apache.beam.sdk.values.PDone: org.apache.beam.sdk.coders.protobuf.ProtoCoder$1@72ef8d15 could not provide a Coder for type org.apache.beam.sdk.values.PDone: Cannot provide ProtoCoder because org.apache.beam.sdk.values.PDone is not a subclass of com.google.protobuf.Message; org.apache.beam.sdk.coders.SerializableCoder$1@6aa8e115 could not provide a Coder for type org.apache.beam.sdk.values.PDone: Cannot provide SerializableCoder because org.apache.beam.sdk.values.PDone does not implement Serializable.
>   Using the default output Coder from the producing PTransform failed: Unable to provide a default Coder for org.apache.beam.sdk.values.PDone. Correct one of the following root causes:
>   Building a Coder using a registered CoderFactory failed: Cannot provide coder based on value with class org.apache.beam.sdk.values.PDone: No CoderFactory has been registered for the class.
>   Building a Coder from the @DefaultCoder annotation failed: Class org.apache.beam.sdk.values.PDone does not have a @DefaultCoder annotation.
>   Building a Coder from the fallback CoderProvider failed: Cannot provide coder for type org.apache.beam.sdk.values.PDone: org.apache.beam.sdk.coders.protobuf.ProtoCoder$1@72ef8d15 could not provide a Coder for type org.apache.beam.sdk.values.PDone: Cannot provide ProtoCoder because org.apache.beam.sdk.values.PDone is not a subclass of com.google.protobuf.Message; org.apache.beam.sdk.coders.SerializableCoder$1@6aa8e115 could not provide a Coder for type org.apache.beam.sdk.values.PDone: Cannot provide SerializableCoder because org.apache.beam.sdk.values.PDone does not implement Serializable.
>       at org.apache.beam.sdk.values.TypedPValue.inferCoderOrFail(TypedPValue.java:196)
>       at org.apache.beam.sdk.values.TypedPValue.getCoder(TypedPValue.java:49)
>       at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:138)
>       at org.apache.beam.runners.flink.translation.FlinkStreamingTransformTranslators$ParDoBoundStreamingTranslator.translateNode(FlinkStreamingTransformTranslators.java:315)
>       at org.apache.beam.runners.flink.translation.FlinkStreamingTransformTranslators$ParDoBoundStreamingTranslator.translateNode(FlinkStreamingTransformTranslators.java:305)
>       at org.apache.beam.runners.flink.translation.FlinkStreamingPipelineTranslator.applyStreamingTransform(FlinkStreamingPipelineTranslator.java:108)
>       at org.apache.beam.runners.flink.translation.FlinkStreamingPipelineTranslator.visitPrimitiveTransform(FlinkStreamingPipelineTranslator.java:89)
>       at org.apache.beam.sdk.runners.TransformTreeNode.visit(TransformTreeNode.java:225)
>       at org.apache.beam.sdk.runners.TransformTreeNode.visit(TransformTreeNode.java:220)
>       at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:104)
>       at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:292)
>       at org.apache.beam.runners.flink.translation.FlinkPipelineTranslator.translate(FlinkPipelineTranslator.java:34)
>       at org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.translate(FlinkPipelineExecutionEnvironment.java:132)
>       at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:108)
>       at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:49)
>       at org.apache.beam.sdk.Pipeline.run(Pipeline.java:182)
>       at org.apache.beam.runners.flink.examples.streaming.KafkaBeamExample.main(KafkaBeamExample.java:57)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:497)
>       at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
> {noformat}
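
As a sketch of the second option mentioned in the description (skip coder retrieval entirely when the element type is PDone), a translator could guard the coder lookup along these lines. The helper name {{outputCoderOrDummy}} and the {{VoidCoder}} fallback are illustrative assumptions, not actual Flink runner code:

{code:java}
import org.apache.beam.sdk.coders.Coder;
import org.apache.beam.sdk.coders.VoidCoder;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.PDone;
import org.apache.beam.sdk.values.TypeDescriptor;

final class CoderLookup {
  private CoderLookup() {}

  // Skip coder inference when the output's element type is PDone;
  // otherwise fall back to the normal getCoder() path that fails above.
  static Coder<?> outputCoderOrDummy(PCollection<?> output) {
    TypeDescriptor<?> type = output.getTypeDescriptor();
    if (type != null && PDone.class.equals(type.getRawType())) {
      // Dummy stand-in coder; nothing is ever encoded for a terminal output.
      return VoidCoder.of();
    }
    return output.getCoder();
  }
}
{code}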



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
