RussellSpitzer commented on issue #2593:
URL: https://github.com/apache/iceberg/issues/2593#issuecomment-841291217


   This means one of two things: either the target table is not an Iceberg 
table, or the Spark extensions were not set when the session was created.
   
   In your case I believe it's the second. Since spark-shell already starts up a 
SparkContext, the code above just derives a new session from that existing 
context, so some of your config options are treated as runtime options rather 
than session-creation options. This should only be an issue for
   `spark.jars.packages` and `spark.sql.extensions`.
   These need to be set earlier in the initialization; for spark-shell that 
means either on the CLI or in a spark-defaults file.
   
   You already cover `spark.jars.packages` with your `--packages` flag, so the 
only thing to add is
   `--conf 
spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions`
   
   which should fix your issue.
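   Putting the two flags together, a minimal spark-shell launch might look like 
the following (the runtime package coordinate and version here are illustrative, 
not prescriptive — use whatever `--packages` value you already have):

   ```shell
   # Both spark.jars.packages (via --packages) and spark.sql.extensions
   # must be supplied at launch time, before the session is created.
   spark-shell \
     --packages org.apache.iceberg:iceberg-spark3-runtime:0.11.1 \
     --conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions
   ```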
   
   See the docs for an example: https://iceberg.apache.org/getting-started/#adding-catalogs
   
   The catalogs do not need to be configured before the session is initialized, 
so you can leave that code as is if you like, but I would recommend setting 
everything on the CLI or by using
   `spark.conf.set()`
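   For example, a catalog can be configured on the already-running session from 
inside spark-shell. This is only a sketch — the catalog name `my_catalog` and 
the warehouse path are placeholders, not values from your setup:

   ```scala
   // Catalogs, unlike extensions, can be set at runtime on an existing session.
   // "my_catalog" and the warehouse location below are illustrative.
   spark.conf.set("spark.sql.catalog.my_catalog", "org.apache.iceberg.spark.SparkCatalog")
   spark.conf.set("spark.sql.catalog.my_catalog.type", "hadoop")
   spark.conf.set("spark.sql.catalog.my_catalog.warehouse", "hdfs://nn:8020/warehouse/path")
   ```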
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


