shroffru opened a new issue, #25165:
URL: https://github.com/apache/beam/issues/25165

   I am attempting to run the Apache Beam Maven WordCount archetype on a Kubernetes cluster with the Spark operator. I have not changed the project's pom, but I keep running into the following exception when I use the `SparkRunner`:
   
   ```
   Exception in thread "main" java.lang.RuntimeException: java.lang.VerifyError: Bad type on operand stack
   Exception Details:
     Location:
       com/google/api/ClientProto.registerAllExtensions(Lcom/google/protobuf/ExtensionRegistryLite;)V @4: invokevirtual
     Reason:
       Type 'com/google/protobuf/GeneratedMessage$GeneratedExtension' (current frame, stack[1]) is not assignable to 'com/google/protobuf/ExtensionLite'
     Current Frame:
       bci: @4
       flags: { }
       locals: { 'com/google/protobuf/ExtensionRegistryLite' }
       stack: { 'com/google/protobuf/ExtensionRegistryLite', 'com/google/protobuf/GeneratedMessage$GeneratedExtension' }
     Bytecode:
       0000000: 2ab2 0002 b600 032a b200 04b6 0003 2ab2
       0000010: 0005 b600 03b1

        at org.apache.beam.runners.spark.SparkPipelineResult.runtimeExceptionFrom(SparkPipelineResult.java:60)
        at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom(SparkPipelineResult.java:77)
        at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:104)
        at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:92)
   ```
   
   This is what my deployment YAML looks like:
   
   ```
   apiVersion: "sparkoperator.k8s.io/v1beta2"
   kind: SparkApplication
   metadata:
     name: spark-beam
     namespace: spark-k8s
   spec:
     type: Java
     mode: cluster
     image: "gcr.io/spark-operator/spark:v3.1.1"
     imagePullPolicy: Always

     mainClass: com.sharechat.WordCount
     mainApplicationFile: "local:///home/artifacts/wordcount-bundled-0.1-rushabh.jar"
     arguments: ["--runner=SparkRunner", "--output=gs://spark-beam/output/"]

     sparkVersion: "3.1.1"
     restartPolicy:
       type: Never

     volumes: ...
   ```
   
   This is what my pom looks like: https://github.com/apache/beam/blob/master/sdks/java/maven-archetypes/examples/src/main/resources/archetype-resources/pom.xml
   
   So it seems that the Beam pipeline is picking up an incorrect version of protobuf-java at runtime, most likely a conflict between the protobuf-java pulled in by the pom and the one bundled with the Spark 3.1.1 image.
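
   For what it's worth, a quick way to confirm where the conflicting classes come from (a temporary diagnostic I would add to the pipeline's `main` method, not something in the archetype itself) could look like this:

   ```
   // Temporary diagnostic (hypothetical, not part of the WordCount archetype):
   // print which jars the two classes named in the VerifyError are loaded from,
   // to see whether the Spark image's protobuf is shadowing the bundled one.
   System.out.println("ExtensionRegistryLite loaded from: "
       + com.google.protobuf.ExtensionRegistryLite.class
           .getProtectionDomain().getCodeSource().getLocation());
   System.out.println("ClientProto loaded from: "
       + com.google.api.ClientProto.class
           .getProtectionDomain().getCodeSource().getLocation());
   ```

   If those point at jars from the Spark distribution rather than at the bundled application jar, relocating `com.google.protobuf` with the maven-shade-plugin would presumably be the workaround.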

