[
https://issues.apache.org/jira/browse/BEAM-6884?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Yohei Onishi updated BEAM-6884:
-------------------------------
Description:
I tried to upgrade the Apache Beam version from 2.9.0 to 2.10.0 and deploy the
pipeline to Dataflow, but I got the error below. It works with 2.9.0. Am I missing something?
{code:java}
Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.beam.model.pipeline.v1.RunnerApi$BeamConstants$Constants.getValueDescriptor()Lorg/apache/beam/vendor/grpc/v1p13p1/com/google/protobuf/Descriptors$EnumValueDescriptor;
at
org.apache.beam.sdk.transforms.windowing.BoundedWindow.extractTimestampFromProto(BoundedWindow.java:84)
at
org.apache.beam.sdk.transforms.windowing.BoundedWindow.<clinit>(BoundedWindow.java:49)
at
org.apache.beam.sdk.coders.CoderRegistry$CommonTypes.<init>(CoderRegistry.java:140)
at
org.apache.beam.sdk.coders.CoderRegistry$CommonTypes.<init>(CoderRegistry.java:97)
at org.apache.beam.sdk.coders.CoderRegistry.<clinit>(CoderRegistry.java:160)
at org.apache.beam.sdk.Pipeline.getCoderRegistry(Pipeline.java:326)
at org.apache.beam.sdk.io.kafka.KafkaIO$Read.expand(KafkaIO.java:707)
at org.apache.beam.sdk.io.kafka.KafkaIO$Read.expand(KafkaIO.java:309)
at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:537)
at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:488)
at org.apache.beam.sdk.values.PBegin.apply(PBegin.java:56)
at org.apache.beam.sdk.Pipeline.apply(Pipeline.java:182)
{code}
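For context, a NoSuchMethodError like this generally means the JVM resolved the class from a different jar than the one the code was compiled against, e.g. a stale Beam 2.9.0 artifact left on the 2.10.0 classpath. A minimal, Beam-agnostic sketch (not from the report; substitute the suspect class name, e.g. {{org.apache.beam.model.pipeline.v1.RunnerApi$BeamConstants$Constants}}) for checking which jar a class was actually loaded from:

```java
public class WhichJar {
    // Returns the jar/classes location a class was loaded from, or "bootstrap"
    // if it came from the JDK's bootstrap loader (which has no CodeSource).
    public static String locationOf(String className) throws ClassNotFoundException {
        Class<?> c = Class.forName(className);
        java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
        return src == null ? "bootstrap" : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // java.lang.String stands in here so the snippet runs standalone.
        System.out.println(locationOf("java.lang.String")); // prints "bootstrap"
    }
}
```

If the printed location is an unexpected (older) jar, removing or excluding it from the dependency tree is the usual fix.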
My code is in Scala, and it works well with Beam 2.9.0.
{code:scala}
val p = Pipeline.create(options)
p.apply(s"${bu.name}_ReadFromKafka", KafkaIO.read()
    .withBootstrapServers(options.getBootstreapServers)
    .updateConsumerProperties(config)
    .withTopics(util.Arrays.asList(topicName))
    .withKeyDeserializer(classOf[LongDeserializer])
    .withValueDeserializer(classOf[StringDeserializer])
    .withConsumerFactoryFn(
      new KafkaTLSConsumerFactory(
        projectId, options.getSourceBucket, options.getTrustStoreGCSKey,
        options.getKeyStoreGCSKey)))
  .apply(s"${bu.name}_Convert", ParDo.of(new ConvertJSONTextToEPCTransaction(bu)))
  .apply(s"${bu.name}_WriteToBQ", BigQueryIO.write()
    .to(bqDestTable)
    .withSchema(schema)
    .withFormatFunction(new ConvertMessageToTable())
    .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
    .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND))
p.run()
{code}
According to the stack trace, it fails at this point:
https://github.com/apache/beam/blob/v2.10.0/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/windowing/BoundedWindow.java#L81-L85
{code}
private static Instant extractTimestampFromProto(
    RunnerApi.BeamConstants.Constants constant) {
  return new Instant(
      Long.parseLong(
          constant.getValueDescriptor().getOptions().getExtension(RunnerApi.beamConstant)));
}
{code}
The constant passed in comes from this part:
https://github.com/apache/beam/blob/v2.10.0/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/windowing/BoundedWindow.java#L48-L49
{code}
public static final Instant TIMESTAMP_MIN_VALUE =
extractTimestampFromProto(RunnerApi.BeamConstants.Constants.MIN_TIMESTAMP_MILLIS);
{code}
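The stack trace points at {{CoderRegistry}} and {{KafkaIO.Read.expand}} rather than at {{BoundedWindow}} itself because a JVM class's static initializer runs on first use, and any error it throws surfaces in whichever code touched the class first. A minimal standalone sketch of that behavior (simulated linkage error, not Beam code):

```java
public class StaticInitDemo {
    // Stand-in for BoundedWindow: a static initializer that fails, simulating
    // the NoSuchMethodError raised while computing TIMESTAMP_MIN_VALUE.
    static class Window {
        static final long TIMESTAMP_MIN_VALUE = compute();
        private static long compute() {
            throw new NoSuchMethodError("getValueDescriptor()"); // simulated
        }
    }

    // The error only appears when some caller first touches the class, so the
    // stack trace blames the caller, not the class's own source file.
    public static String firstUse() {
        try {
            return "ok: " + Window.TIMESTAMP_MIN_VALUE;
        } catch (Throwable t) {
            return t.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        System.out.println(firstUse()); // prints "NoSuchMethodError"
    }
}
```

(The JVM rethrows an {{Error}} from a static initializer as-is, rather than wrapping it in {{ExceptionInInitializerError}}, which is why the raw {{NoSuchMethodError}} appears in the trace above.)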
was:
I tried to change Apache beam version from 2.9.0 to 2.10.0 and deploy it to
Dataflow but I got this error. It works with 2.9.0. Am I missing something?
{code:java}
Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.beam.model.pipeline.v1.RunnerApi$BeamConstants$Constants.getValueDescriptor()Lorg/apache/beam/vendor/grpc/v1p13p1/com/google/protobuf/Descriptors$EnumValueDescriptor;
at
org.apache.beam.sdk.transforms.windowing.BoundedWindow.extractTimestampFromProto(BoundedWindow.java:84)
at
org.apache.beam.sdk.transforms.windowing.BoundedWindow.<clinit>(BoundedWindow.java:49)
at
org.apache.beam.sdk.coders.CoderRegistry$CommonTypes.<init>(CoderRegistry.java:140)
at
org.apache.beam.sdk.coders.CoderRegistry$CommonTypes.<init>(CoderRegistry.java:97)
at org.apache.beam.sdk.coders.CoderRegistry.<clinit>(CoderRegistry.java:160)
at org.apache.beam.sdk.Pipeline.getCoderRegistry(Pipeline.java:326)
at org.apache.beam.sdk.io.kafka.KafkaIO$Read.expand(KafkaIO.java:707)
at org.apache.beam.sdk.io.kafka.KafkaIO$Read.expand(KafkaIO.java:309)
at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:537)
at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:488)
at org.apache.beam.sdk.values.PBegin.apply(PBegin.java:56)
at org.apache.beam.sdk.Pipeline.apply(Pipeline.java:182)
{code}
My code is in Scala but it should work well.
{code:java}
val p = Pipeline.create(options)
p.apply(s"${bu.name}_ReadFromKafka", KafkaIO.read()
.withBootstrapServers(options.getBootstreapServers)
.updateConsumerProperties(config)
.withTopics(util.Arrays.asList(topicName))
.withKeyDeserializer(classOf[LongDeserializer])
.withValueDeserializer(classOf[StringDeserializer])
.withConsumerFactoryFn(
new KafkaTLSConsumerFactory(
projectId, options.getSourceBucket, options.getTrustStoreGCSKey,
options.getKeyStoreGCSKey)))
.apply(s"${bu.name}_Convert", ParDo.of(new
ConvertJSONTextToEPCTransaction(bu)))
.apply(s"${bu.name}_WriteToBQ", BigQueryIO.write()
.to(bqDestTable)
.withSchema(schema)
.withFormatFunction(new ConvertMessageToTable())
.withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
.withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND))
}
p.run
{code}
> NoSuchMethodError: Descriptors$EnumValueDescriptor when deploying Beam 2.10.0
> to Dataflow
> -----------------------------------------------------------------------------------------
>
> Key: BEAM-6884
> URL: https://issues.apache.org/jira/browse/BEAM-6884
> Project: Beam
> Issue Type: Bug
> Components: beam-model
> Affects Versions: 2.10.0
> Reporter: Yohei Onishi
> Priority: Major
>
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)