Hi Rohil,

This sounds a bit strange. If the GoogleHadoopFileSystem jar is on the
classpath and the implementation is specified in core-site.xml, then the
Hadoop FileSystem machinery should be able to load the GCS filesystem. I
just tried it out locally (without K8s, though) and it seemed to work.
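
In case it helps, a minimal check along these lines should tell you
whether the connector class is resolvable at all (just a sketch; the
property value is the standard one from the GCS connector documentation):

    import org.apache.hadoop.conf.Configuration;

    public class GcsLoadCheck {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // The same value core-site.xml should provide for fs.gs.impl:
            conf.set("fs.gs.impl",
                    "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem");
            // Throws ClassNotFoundException if the connector jar is not
            // on the effective classpath of this JVM.
            Class<?> impl = conf.getClassByName(conf.get("fs.gs.impl"));
            System.out.println("Resolved connector class: " + impl.getName());
        }
    }

If that fails inside the container but works locally, the jar is
probably not ending up on the classpath the JVM actually uses.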

Could you share a bit more information about your setup? Which Hadoop
version are you running? Could you also share the complete `JobManager`
log with us? Does the same problem arise if you use Flink 1.5?

Cheers,
Till

On Thu, Jun 28, 2018 at 1:41 PM Rohil Surana <rohilsuran...@gmail.com>
wrote:

> Hi
>
> I was trying to set up checkpointing on Google Cloud Storage with Flink on
> Kubernetes, but I am facing issues with the Google Cloud Storage connector
> classes not loading, even though the logs show the connector jar on the
> classpath.
>
> Logs showing classpath - https://pastebin.com/R1P7Eepz
> Logs showing the ClassNotFoundException for GoogleHadoopFileSystem -
> https://pastebin.com/LGMPzVbp
> Hadoop conf core-site.xml - https://pastebin.com/CfEmTk2t
>
> What I have done in addition -
> 1.) Created a new Flink image with the Google Cloud Storage connector jar
> <https://cloud.google.com/dataproc/docs/concepts/connectors/cloud-storage>
> in the /etc/flink/lib folder.
> 2.) Added the GCS service account credentials as a Kubernetes secret.
> 3.) Mounted the secret and the hadoop and flink ConfigMaps on the
> taskmanager and jobmanager deployments (see the sketch below).
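>
> To rule out the Hadoop conf not being picked up at all, a rough check
> like the following can be run with the same classpath as the JobManager
> (sketch only; the property name is from the connector docs):
>
>     import org.apache.hadoop.conf.Configuration;
>
>     public class ConfCheck {
>         public static void main(String[] args) {
>             // new Configuration() loads core-site.xml from the classpath;
>             // null here means the mounted Hadoop conf is not visible.
>             Configuration conf = new Configuration();
>             System.out.println("fs.gs.impl = " + conf.get("fs.gs.impl"));
>         }
>     }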
>
> Flink version - 1.4.2
> GCS Hadoop - gcs-connector-latest-hadoop2.jar
>
> Any help is appreciated.
> Thank you.
>
> Rohil
