[https://issues.apache.org/jira/browse/HUDI-4931?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17610422#comment-17610422]

Pramod Biligiri commented on HUDI-4931:
---------------------------------------

Some useful references regarding this:
- GCP docs on Cloud Storage connector: [https://cloud.google.com/dataproc/docs/concepts/connectors/cloud-storage]
- Hudi docs on GCS connectivity: [https://hudi.apache.org/docs/gcs_hoodie/]

> Explore fat jar option for gcs-connector lib used during GCS Ingestion
> ----------------------------------------------------------------------
>
>                 Key: HUDI-4931
>                 URL: https://issues.apache.org/jira/browse/HUDI-4931
>             Project: Apache Hudi
>          Issue Type: Task
>            Reporter: Pramod Biligiri
>            Priority: Major
>
> Currently, GCS ingestion (HUDI-4850) expects newer versions of jars such as 
> protobuf and Guava to be passed explicitly to spark-submit, overriding the 
> older versions shipped with Spark. These jars are required by the 
> gcs-connector, a Google library for connecting to GCS. For more details see 
> [https://docs.google.com/document/d/1VfvtdvhXw6oEHPgZ_4Be2rkPxIzE0kBCNUiVDsXnSAA/edit#]
> (section titled "Configure Spark to use newer versions of some Jars").
> See if it's possible to create a shaded, fat jar of gcs-connector for this 
> use case instead, avoiding the need to pass individual jars to spark-submit 
> on the command line.
> An alternate approach to consider for the long term is HUDI-4930 (slim 
> bundles).
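
The current workaround quoted above can be sketched roughly as follows. This is a minimal, hedged example, not the exact invocation from the Hudi docs: all jar file names, paths, and version numbers are illustrative placeholders.

```sh
# Sketch of the current workaround: supply newer protobuf/Guava jars (plus the
# gcs-connector) to spark-submit explicitly, and set Spark's userClassPathFirst
# flags so these jars take precedence over the older copies Spark ships with.
# Paths and version numbers below are placeholders, not verified values.
spark-submit \
  --jars /opt/libs/protobuf-java-3.21.0.jar,/opt/libs/guava-31.1-jre.jar,/opt/libs/gcs-connector-hadoop3-shaded.jar \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer \
  hudi-utilities-bundle.jar
```

A shaded fat jar of gcs-connector, as the issue proposes, would relocate the newer protobuf and Guava classes inside the connector jar itself, so neither the extra `--jars` entries nor the `userClassPathFirst` settings would be needed.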



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
