AwasthiSomesh commented on issue #12235:
URL: https://github.com/apache/iceberg/issues/12235#issuecomment-2653907845

   @nastra my use case is data transfer from a source catalog to a target catalog.
   
   The source will read data from AWSDATACATALOG1 and the target will write into AWSDATACATALOG2.
   In that case, how can we use a single Spark session and pass two default catalog values?
   
   Is there any way to do this, or is it a limitation on the Spark side?
   
   Could you please confirm this point?
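
   For reference, this is roughly what I am trying, as a minimal sketch: two Iceberg catalogs registered by name in one SparkSession, each backed by a different Glue catalog. The catalog names (`source_cat`/`target_cat`), warehouse paths, and `glue.id` values below are placeholders, not my real settings.
   
   ```scala
   import org.apache.spark.sql.SparkSession
   
   // Sketch: two Iceberg catalogs in one SparkSession, each pointing at a
   // different Glue catalog. Names, warehouses, and glue IDs are placeholders.
   val spark = SparkSession.builder()
     .appName("cross-glue-catalog-copy")
     .config("spark.sql.extensions",
       "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
     // Read-side catalog (AWSDATACATALOG1)
     .config("spark.sql.catalog.source_cat", "org.apache.iceberg.spark.SparkCatalog")
     .config("spark.sql.catalog.source_cat.catalog-impl", "org.apache.iceberg.aws.glue.GlueCatalog")
     .config("spark.sql.catalog.source_cat.warehouse", "s3://bucket-one/warehouse")
     .config("spark.sql.catalog.source_cat.glue.id", "111111111111")
     // Write-side catalog (AWSDATACATALOG2)
     .config("spark.sql.catalog.target_cat", "org.apache.iceberg.spark.SparkCatalog")
     .config("spark.sql.catalog.target_cat.catalog-impl", "org.apache.iceberg.aws.glue.GlueCatalog")
     .config("spark.sql.catalog.target_cat.warehouse", "s3://bucket-two/warehouse")
     .config("spark.sql.catalog.target_cat.glue.id", "222222222222")
     // Only one catalog can be the session default; the other is addressed by name.
     .config("spark.sql.defaultCatalog", "source_cat")
     .getOrCreate()
   ```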
   
   **Note:** Spark allows the "spark.sql.extensions" key to take two different values, comma-separated, but this does not work for other keys.
      .config("spark.sql.extensions",
        "io.delta.sql.DeltaSparkSessionExtension,org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
   
   Thanks,
   Somesh


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@iceberg.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

