[ https://issues.apache.org/jira/browse/FLINK-38509?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18029423#comment-18029423 ]

Royston Tauro edited comment on FLINK-38509 at 10/13/25 8:18 AM:
-----------------------------------------------------------------

[~gaborgsomogyi] I had found something along these lines in a few articles and 
Stack Overflow answers:

from pyspark import SparkConf

# Placeholder path to the per-application service-account key file
credentials_location = "/path/to/service-account.json"

conf = SparkConf() \
    .setMaster('local[*]') \
    .setAppName('test') \
    .set("spark.jars", "./lib/gcs-connector-hadoop3-latest.jar") \
    .set("spark.hadoop.google.cloud.auth.service.account.enable", "true") \
    .set("spark.hadoop.google.cloud.auth.service.account.json.keyfile",
         credentials_location)
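
As I understand it, the spark.hadoop. prefix is what makes this per-job: Spark 
copies those entries into the application's own Hadoop Configuration at startup, 
so each submitted application can point at a different key file.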
 


> Dynamic Credentials in Flink for Google Cloud Storage
> -----------------------------------------------------
>
>                 Key: FLINK-38509
>                 URL: https://issues.apache.org/jira/browse/FLINK-38509
>             Project: Flink
>          Issue Type: Improvement
>          Components: Connectors / FileSystem
>    Affects Versions: 2.1.0
>            Reporter: Royston Tauro
>            Priority: Major
>
> Currently, in session cluster mode, the only way to provide credentials for 
> Google Cloud Storage is via the environment or the core-site.xml in the Hadoop 
> configuration, which is read only when the cluster is created.
> Are there plans to make this dynamic per job, i.e. to let each job supply its 
> own credentials via its Flink configuration, similar to how Spark allows it?
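
For context, a minimal core-site.xml sketch of the static approach the 
description refers to (the key names are the gcs-connector keys that the Spark 
snippet above forwards; the keyfile path is a placeholder):

<!-- core-site.xml, read once when the session cluster starts -->
<configuration>
  <property>
    <name>google.cloud.auth.service.account.enable</name>
    <value>true</value>
  </property>
  <property>
    <name>google.cloud.auth.service.account.json.keyfile</name>
    <value>/path/to/service-account.json</value>
  </property>
</configuration>

Because this file is loaded at cluster creation, every job in the session 
cluster shares the same credentials, which is the limitation raised here.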


