ruanhang1993 edited a comment on pull request #14376:
URL: https://github.com/apache/flink/pull/14376#issuecomment-791176342


   There is no problem when packaging the Flink project. The problem occurs when submitting the job via the Flink CLI.
   Without the `provided` scope, I get the exception below. The test job writes data from Kafka to Hive.
   ```java
   Caused by: java.lang.ClassCastException: com.google.protobuf.Descriptors$Descriptor cannot be cast to com.google.protobuf.Descriptors$Descriptor
           at org.apache.flink.formats.protobuf.PbFormatUtils.getDescriptor(PbFormatUtils.java:81) ~[?:?]
           ......
   ```
   The Flink `lib` directory contains the `flink-dist` jar (protobuf 3.11.1) and the `flink-sql-connector-hive-1.2.2_2.11` jar (protobuf 2.5.0, relocated by me). The `flink-protobuf` jar (protobuf 3.11.1) is bundled in my job jar. I submit the job with this command:
   ```bash
   flink run  -m  yarn-cluster  \
   -yd  ...... -yt  protobufMessage.jar  \
    -c  package.Main  myJob.jar  jobParams
   ```
   ---------------------------------------------------------------------------
   After a few tests, I think the problem is about class loading in Flink, not a conflict with other modules as I first thought.
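   The `ClassCastException` with identical class names on both sides is the classic symptom: the same class is loaded twice by different classloaders (here, Flink's parent classloader from `lib` and the child-first user-code classloader from the job jar), and the JVM treats the two copies as unrelated types. A minimal, self-contained sketch of this effect (all class and loader names are hypothetical, standing in for protobuf's `Descriptors` and Flink's user-code classloader):
   ```java
   import java.io.InputStream;

   public class LoaderDemo {
       public static class Message {} // stands in for com.google.protobuf.Descriptors

       // A loader that defines Message itself instead of delegating to its parent,
       // mimicking a child-first user-code classloader.
       static class ChildFirstLoader extends ClassLoader {
           ChildFirstLoader() { super(LoaderDemo.class.getClassLoader()); }

           @Override
           protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
               if (name.equals(Message.class.getName())) {
                   try (InputStream in = getParent()
                           .getResourceAsStream(name.replace('.', '/') + ".class")) {
                       byte[] bytes = in.readAllBytes();
                       return defineClass(name, bytes, 0, bytes.length);
                   } catch (Exception e) {
                       throw new ClassNotFoundException(name, e);
                   }
               }
               return super.loadClass(name, resolve);
           }
       }

       public static void main(String[] args) throws Exception {
           Class<?> a = new ChildFirstLoader().loadClass(Message.class.getName());
           Class<?> b = new ChildFirstLoader().loadClass(Message.class.getName());
           System.out.println(a.getName().equals(b.getName())); // same binary name
           System.out.println(a == b);                          // but different runtime classes
           Object obj = a.getDeclaredConstructor().newInstance();
           System.out.println(b.isInstance(obj));               // so a cast across loaders fails
       }
   }
   ```
   Running this prints `true`, `false`, `false`: the names match, but the runtime classes do not, which is exactly why the cast in `PbFormatUtils.getDescriptor` blows up when protobuf ends up on both the `lib` classpath and the job jar.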
   
   I need to place the `flink-protobuf` jar under the `lib` directory like the other formats, e.g. `flink-json`, and then every problem is gone. There is no need to change the version in `flink-protobuf` to `protoc.version` or to relocate it in `flink-sql-connector-hive`.
   
   It seems that I was using the jar in the wrong way. Thanks a lot for your answer.


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

