moelhoussein commented on PR #4984:
URL: https://github.com/apache/kyuubi/pull/4984#issuecomment-2299681166

   Thanks for spending time on this. To be clear, I am able to submit batch jobs, but, as you suggested, the Kubernetes client is defaulting to the local context. Here is my Kyuubi config:
   
   ```
   # Batch job to remote cluster isost10
   kyuubi.batchConf.spark.spark.master=k8s://isost10:443
   kyuubi.batchConf.spark.spark.kubernetes.authenticate.driver.serviceAccountName=spark
   kyuubi.batchConf.spark.spark.kubernetes.authenticate.submission.caCertFile=/opt/kyuubi/isost10-np-cacert
   kyuubi.batchConf.spark.spark.kubernetes.namespace=spark
   kyuubi.batchConf.spark.spark.kubernetes.serviceAccountName=spark
   # remote cluster's context
   kyuubi.kubernetes.trust.certificates=true
   kyuubi.kubernetes.context.allow.list=isost10
   kyuubi.kubernetes.isost10.master.address=k8s://isost10:443
   kyuubi.kubernetes.isost10.spark.authenticate.oauthTokenFile=/etc/isost10-np/token
   kyuubi.kubernetes.isost10.spark.kubernetes.serviceAccountName=spark
   ```
   I see in the logs:
   ```
   Caused by: io.fabric8.kubernetes.client.KubernetesClientException: Failure executing: GET at: https://192.168.0.1:443/api/v1/namespaces/spark/pods?labelSelector
   ```
   Am I doing anything wrong here?
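   One way to sanity-check which API server each kubeconfig context actually resolves to is to walk the context → cluster → server mapping. This is a standalone sketch, not Kyuubi or fabric8 code; the context and server names below are illustrative stand-ins, not taken from my cluster:
   
   ```python
   def server_for_context(kubeconfig: dict, context_name: str) -> str:
       """Return the API server URL that a kubeconfig context points at."""
       ctx = next(c for c in kubeconfig["contexts"] if c["name"] == context_name)
       cluster_name = ctx["context"]["cluster"]
       cluster = next(c for c in kubeconfig["clusters"] if c["name"] == cluster_name)
       return cluster["cluster"]["server"]
   
   # Minimal stand-in for a parsed kubeconfig (names are hypothetical)
   kubeconfig = {
       "contexts": [
           {"name": "local", "context": {"cluster": "local-cluster"}},
           {"name": "isost10", "context": {"cluster": "isost10-cluster"}},
       ],
       "clusters": [
           {"name": "local-cluster", "cluster": {"server": "https://192.168.0.1:443"}},
           {"name": "isost10-cluster", "cluster": {"server": "https://isost10:443"}},
       ],
   }
   
   print(server_for_context(kubeconfig, "isost10"))  # → https://isost10:443
   ```
   
   The GET in the stack trace hits `https://192.168.0.1:443`, i.e. the "local" server rather than the isost10 one, which matches the client falling back to the default context instead of the configured one.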


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
