atinvento100 commented on issue #13358:
URL: https://github.com/apache/iceberg/issues/13358#issuecomment-3021905978

   I haven't removed any jars. What I'm running is plain PySpark, not a spark-submit job: I just installed pyspark into a venv and created a SparkSession in local mode. Here is the SparkSession that I use. I also do see the mentioned jar (libthrift-0.16.0.jar) under the pyspark lib directory installed in the venv.
   
   
   ```
   from pyspark.sql import SparkSession

   spark = (
       SparkSession.builder
       # Iceberg Spark runtime and hadoop-aws, resolved from the Apache snapshots repo
       .config("spark.jars.packages",
               "org.apache.iceberg:iceberg-spark-runtime-4.0_2.13:1.10.0-SNAPSHOT,"
               "org.apache.hadoop:hadoop-aws:3.4.1")
       .config("spark.jars.repositories", "https://repository.apache.org/content/repositories/snapshots")
       .config("spark.hadoop.fs.s3a.path.style.access", "true")
       .config("spark.sql.catalogImplementation", "hive")
       .config("spark.sql.extensions", "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
       .config("spark.sql.iceberg.vectorization.enabled", "false")
       # Iceberg catalog backed by the Hive metastore
       .config("spark.sql.catalog.{}".format(configs['CATALOG_NAME']), "org.apache.iceberg.spark.SparkCatalog")
       .config("spark.sql.catalog.{}.type".format(configs['CATALOG_NAME']), "hive")
       .config("spark.sql.catalog.{}.uri".format(configs['CATALOG_NAME']), configs['CATALOG_METASTORE_URI'])
       .config("spark.sql.catalog.{}.warehouse".format(configs['CATALOG_NAME']), "spark-warehouse/iceberg")
       # Hive metastore client: PLAIN auth over SSL with a JKS truststore
       .config("spark.hadoop.hive.metastore.uris", configs['CATALOG_METASTORE_URI'])
       .config("spark.hive.metastore.client.auth.mode", "PLAIN")
       .config("spark.hive.metastore.client.plain.username", "ibmlhapikey")
       .config("spark.hive.metastore.client.plain.password", configs['WXD_API_KEY'])
       .config("spark.hive.metastore.use.SSL", "true")
       .config("spark.hive.metastore.truststore.type", "JKS")
       .config("spark.hive.metastore.truststore.path", configs['CATALOG_METASTORE_TRUSTSTORE_PATH'])
       .config("spark.hive.metastore.truststore.password", configs['CATALOG_METASTORE_TRUSTSTORE_PASSWORD'])
       .config("spark.driver.extraClassPath", configs['DRIVER_PATH'])
       # Per-bucket S3A endpoint and credentials for the COS bucket
       .config("spark.hadoop.fs.s3a.bucket.{}.endpoint".format(configs['CATALOG_COS_BUCKET_NAME']),
               configs['CATALOG_COS_BUCKET_ENDPOINT'])
       .config("spark.hadoop.fs.s3a.bucket.{}.access.key".format(configs['CATALOG_COS_BUCKET_NAME']),
               configs['CATALOG_COS_BUCKET_ACCESS_KEY'])
       .config("spark.hadoop.fs.s3a.bucket.{}.secret.key".format(configs['CATALOG_COS_BUCKET_NAME']),
               configs['CATALOG_COS_BUCKET_SECRET_KEY'])
       .config("spark.sql.legacy.parquet.nanosAsLong", "true")
       .enableHiveSupport()
       .getOrCreate()
   )
   ```
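   
   For what it's worth, here is a quick way to double-check the bundled jar (a minimal sketch, assuming the standard pip-installed pyspark layout where jars live under `pyspark/jars`):
   
   ```
   import pathlib
   import pyspark

   # List thrift-related jars shipped with the pip-installed pyspark distribution
   jars_dir = pathlib.Path(pyspark.__file__).parent / "jars"
   print(sorted(p.name for p in jars_dir.glob("*thrift*")))
   # On this install the output includes libthrift-0.16.0.jar
   ```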
   
   Using Python 3.11.
   
   Library dependencies installed in the venv:
   ```
   python = "^3.11"
   pyspark = "4.0.0"
   ipykernel = "^6.29.5"
   python-dotenv = "^1.0.1"
   ibm-cos-sdk = "^2.13.6"
   ibm-secrets-manager-sdk = "^2.1.8"
   pyyaml = "6.0.1"
   ```

