[ https://issues.apache.org/jira/browse/BEAM-3002?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16189944#comment-16189944 ]
Luke Cwik edited comment on BEAM-3002 at 10/3/17 4:43 PM:
----------------------------------------------------------

This is unlikely to be a bug in the SDK; it has to do with your maven-shade-plugin configuration being incorrect. This is evident since mvn exec:java and IntelliJ build and package your application correctly. You need to use maven-shade-plugin 3.0.0 or higher with an org.apache.maven.plugins.shade.resource.ServicesResourceTransformer configured. See https://maven.apache.org/plugins/maven-shade-plugin/examples/resource-transformers.html#ServicesResourceTransformer for more details. Re-open if you have further questions.

was (Author: lcwik):
This is unlikely to be a bug in the SDK; it has to do with your maven-shade-plugin configuration being incorrect. You need to use maven-shade-plugin 3.0.0 or higher with an org.apache.maven.plugins.shade.resource.ServicesResourceTransformer configured. See https://maven.apache.org/plugins/maven-shade-plugin/examples/resource-transformers.html#ServicesResourceTransformer for more details.

> Unable to provide a Coder for org.apache.hadoop.hbase.client.Mutation
> ---------------------------------------------------------------------
>
>                 Key: BEAM-3002
>                 URL: https://issues.apache.org/jira/browse/BEAM-3002
>             Project: Beam
>          Issue Type: Bug
>          Components: sdk-java-core
>    Affects Versions: 2.1.0
>         Environment: hadoop2.8.0, hbase1.2.6
>            Reporter: huangjianhuang
>            Assignee: Kenneth Knowles
>            Priority: Minor
>             Fix For: Not applicable
>
> I wrote a demo with HBaseIO that formats data into Mutation objects and writes them to HBase. The demo works fine in IDEA or when run via the mvn exec:java command, but it does not work after being shade-packaged as a jar (run with java -cp).
> The error message is:
> {code:java}
> Using the default output Coder from the producing PTransform failed: Unable
> to provide a Coder for org.apache.hadoop.hbase.client.Mutation.
> Building a Coder using a registered CoderProvider failed.
> See suppressed exceptions for detailed failures.
>         at org.apache.beam.sdk.repackaged.com.google.common.base.Preconditions.checkState(Preconditions.java:444)
>         at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:257)
>         at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:106)
>         at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifying(TransformHierarchy.java:222)
>         at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:208)
>         at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:440)
>         at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:552)
>         at org.apache.beam.sdk.Pipeline.run(Pipeline.java:296)
>         at org.apache.beam.sdk.Pipeline.run(Pipeline.java:283)
>         at com.joe.FlinkDemoFinal.main(FlinkDemoFinal.java:113)
> {code}
> I tried to print the default coder for Mutation: in IDEA it works fine and prints "HBaseMutationCoder", but nothing is printed when running from the jar.
> I then tried to register HBaseMutationCoder manually, but found that HBaseMutationCoder is a private class, so I don't know how to register a coder for Mutation.
> Part of my code:
> {code:java}
> .apply("Hbase data format",
>         ParDo.of(new DoFn<Long, Mutation>() {
>             @ProcessElement
>             public void processElement(ProcessContext context) {
>                 System.out.println(context.element());
>                 byte[] qual = Bytes.toBytes("count");
>                 byte[] cf = Bytes.toBytes("cf");
>                 byte[] row = Bytes.toBytes("kafka");
>                 byte[] val = Bytes.toBytes(context.element().toString());
>                 Mutation mutation = new Put(row).addColumn(cf, qual, val);
>                 context.output(mutation);
>             }
>         }));
> {code}

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
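For reference, a minimal maven-shade-plugin configuration along the lines the comment describes might look like the following sketch. The plugin version (3.0.0) and the ServicesResourceTransformer class come from the comment above; the surrounding execution wiring is standard shade-plugin boilerplate, not taken from the reporter's actual pom.xml:

{code:xml}
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.0.0</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- Merge META-INF/services files from all jars so that
               ServiceLoader-discovered classes (such as Beam's coder
               provider registrars) survive shading. -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
{code}

This would explain the observed symptom: Beam discovers coder providers via java.util.ServiceLoader from META-INF/services entries, which are present on the IDE/exec:java classpath but are clobbered when jars are merged without the transformer, so the shaded jar can no longer find a coder for Mutation.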