robertnagy1 commented on issue #876:
URL: https://github.com/apache/sedona/issues/876#issuecomment-1612916277

   By the way, when I create a SparkSession and SparkContext and then check with 
spark.sparkContext.getConf().getAll(), I see that the buffer max value does 
not change: the result is ('spark.kryoserializer.buffer.max', '128m'). So the 
parameters set on this SparkSession builder do not take effect.
   from pyspark.sql import SparkSession
   from sedona.utils import KryoSerializer, SedonaKryoRegistrator
   from sedona.register.geo_registrator import SedonaRegistrator
   
   spark = SparkSession.\
       builder.\
       master("local[*]").\
       appName("Sedona App").\
       config("spark.serializer", KryoSerializer.getName).\
       config("spark.kryo.registrator", SedonaKryoRegistrator.getName).\
       config("spark.kryoserializer.buffer.max", "2g").\
       config("spark.executor.memory", "10g").\
       config("spark.driver.memory", "10g").\
       config("spark.jars.packages", 
   "org.apache.sedona:sedona-spark-shaded-3.0_2.12:1.4.0,org.datasyslab:geotools-wrapper:1.4.0-28.2").\
       getOrCreate()
   
   SedonaRegistrator.registerAll(spark)
   sc = spark.sparkContext
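   For what it's worth, SparkSession.builder.getOrCreate() reuses any
   already-active session (it only logs a warning that some core
   configurations may not take effect), and spark.kryoserializer.buffer.max
   is a static setting that cannot be changed on a running JVM, which would
   explain the value staying at 128m. One possible workaround, assuming
   nothing else depends on the currently running session, is to set the
   static values outside the application so they exist before any session
   starts, e.g. in $SPARK_HOME/conf/spark-defaults.conf (path assumed for a
   local install):
   
   ```
   # spark-defaults.conf -- read at JVM startup, before any SparkSession
   # exists, so these values cannot be silently ignored by getOrCreate()
   spark.kryoserializer.buffer.max  2g
   spark.executor.memory            10g
   spark.driver.memory              10g
   ```
   
   Alternatively, calling spark.stop() before rebuilding the session (or
   restarting the Python kernel) should let the builder's .config() calls
   apply to a fresh JVM.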
