Hi Reuven, thanks for your reply.

OK, let me try it. Since this kind of transform generates a PCollection<Row>,
I'll probably need to adapt the later steps of the pipeline to this new
contract.
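
Roughly what I'm picturing (just a sketch, not tested yet; I'm assuming I
group the MyObject collection directly instead of the KV pairs, and that the
grouped output exposes the key fields under a "key" row and the grouped
elements under a "value" iterable):

PCollection<Row> grouped = myObjects
    .apply("GroupChildByParentKey",
        Group.byFieldNames("orgUnitParentId"));

grouped.apply("AdaptDownstream", ParDo.of(new DoFn<Row, Void>() {
  @ProcessElement
  public void process(@Element Row groupedRow) {
    // The grouping fields come back as a nested Row.
    Row key = groupedRow.getRow("key");
    // The grouped MyObject elements come back as an iterable of Rows.
    Iterable<Row> children = groupedRow.getIterable("value");
    // ... the rest of the pipeline would consume these Rows instead of KVs.
  }
}));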

I'm going to work on this and I'll let you know my results. Thanks again.

On Tue, Aug 31, 2021 at 12:43 AM Reuven Lax (<[email protected]>) wrote:

> Schemas work best when not using KV objects. Are you able to use the Group
> transform? e.g.
>
> input.apply(Group.byFieldNames("orgUnitParentId"));
>
> On Mon, Aug 30, 2021 at 9:51 PM Juan Pablo Feliciano Báez <
> [email protected]> wrote:
>
>> Hi everyone.
>>
>> I'm trying to understand how to resolve an issue I'm getting when I run a
>> pipeline locally, after adapting a Java POJO class to an Apache Beam
>> schema using the AutoValue approach. Below is the code of the
>> transformation where the exception is thrown:
>>
>> public class ApplyParentChildTransform extends
>>     PTransform<PCollection<KV<Integer, MyObject>>,
>>         PCollection<KV<Integer, Iterable<MyObject>>>> {
>>
>>   private static final long serialVersionUID = 1053787833909150353L;
>>
>>   @Override
>>   public PCollection<KV<Integer, Iterable<MyObject>>> expand(
>>       PCollection<KV<Integer, MyObject>> input) {
>>     // Re-key each element by its parent id.
>>     PCollection<KV<Integer, MyObject>> mapParentChildTransform = input
>>         .apply("MapParentChildTransform",
>>             MapElements.into(
>>                 TypeDescriptors.kvs(TypeDescriptors.integers(),
>>                     TypeDescriptor.of(MyObject.class)))
>>                 .via(myObject -> KV.of(
>>                     Objects.requireNonNull(myObject.getValue()).getOrgUnitParentId(),
>>                     myObject.getValue())));
>>     // Group all children under their parent id.
>>     return mapParentChildTransform.apply("GroupChildByParentKey",
>>         GroupByKey.create());
>>   }
>> }
>>
>> During pipeline execution, in particular once the transformation
>> containing the code above is called, I get the following exception:
>>
>> java.lang.IllegalStateException:
>> Unable to return a default Coder for
>> ApplyParentChildTransform/MapParentChildTransform/Map/ParMultiDo(Anonymous).output
>> [PCollection@759448233]. Correct one of the following root causes:
>>   No Coder has been manually specified;  you may do so using .setCoder().
>>   Inferring a Coder from the CoderRegistry failed: Cannot provide coder
>> for parameterized type org.apache.beam.sdk.values.KV<java.lang.Integer,
>> com.mybeamproject.model.MyObject>: Unable to provide a Coder for
>> com.mybeamproject.model.MyObject.
>>   Building a Coder using a registered CoderProvider failed.
>>   See suppressed exceptions for detailed failures.
>>   Using the default output Coder from the producing PTransform failed:
>> PTransform.getOutputCoder called.
>>
>> As I mentioned above, the MyObject class is using the AutoValueSchema
>> schema, and as I understand it, one of the benefits of using schemas is
>> automatic coder inference, so in this case I shouldn't have to explicitly
>> set a coder.
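>>
>> For reference, the annotation pattern I followed looks roughly like this
>> (a simplified sketch; I'm only showing the field that's relevant here):
>>
>> @DefaultSchema(AutoValueSchema.class)
>> @AutoValue
>> public abstract class MyObject {
>>   // Parent id used as the grouping key in the transform above.
>>   public abstract Integer getOrgUnitParentId();
>>   // ... remaining fields omitted ...
>> }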
>>
>> I'd appreciate your help moving forward with this issue. Thank you.
>> --
>> Juan Pablo Feliciano Báez.
>>
>

-- 
Juan Pablo Feliciano Báez.
