Hi folks,

I have the following setup:

MinIO
Nessie 1.7.0
Spark 3.5.6

I'm trying to read something from MinIO, but I'm getting the error
posted at the bottom of this email.
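
For context, the read that triggers it is just a plain parquet load over
s3a from the pyspark shell; roughly the following, where the bucket/prefix
are placeholders rather than the real (redacted) path:

    # Hypothetical path; the real one is redacted. `spark` is the
    # session the pyspark shell creates automatically.
    df = spark.read.parquet("s3a://core/some/prefix/")
    df.show()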

What I fail to understand is this: I've imported the TLS certificates from
the k8s cluster into my keystore, and I can run this Spark workload on the
cluster itself; but in client mode it fails, and I cannot figure out why.
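
My understanding is that in client mode the driver JVM runs locally on my
laptop, so the cluster-side keystore shouldn't matter and the trust
material has to be supplied to the local JVM. A minimal sketch of how I'd
expect that to be wired up in spark-defaults.conf, where the truststore
path and password are placeholders and not my real values:

    # Placeholder path/password -- illustrative only.
    spark.driver.extraJavaOptions  -Djavax.net.ssl.trustStore=/path/to/truststore.jks -Djavax.net.ssl.trustStorePassword=changeit

That said, the debug output below doesn't look like a trust failure to me:
the ClientHello is produced at 13:00:48.480 and the socket read times out
at 13:00:48.584, i.e. after roughly 100 ms, which lines up suspiciously
with fs.s3a.connection.timeout=100 in my config, though I'm not sure
whether that's related.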

Any tips/pointers on what to try would be most helpful.

Thanks in advance,
Hamish

======

Using properties file:
/home/hamish/WorkDocs/cx/<customer-name-redacted>/cf-dev/spark-defaults.conf
25/09/01 12:59:54 WARN Utils: Your hostname, ukhozi resolves to a loopback
address: 127.0.1.1; using 192.168.0.219 instead (on interface wlp59s0)
25/09/01 12:59:54 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to
another address
Adding default property:
spark.sql.catalog.core=org.apache.iceberg.spark.SparkCatalog
Adding default property:
spark.serializer=org.apache.spark.serializer.KryoSerializer
Adding default property: spark.hadoop.fs.s3a.connection.ssl.enabled=false
Adding default property:
spark.executor.extraJavaOptions=-Daws.java.v1.disableDeprecationAnnouncement=true
-Daws.region=us-east-1
Adding default property: spark.io.compression.codec=zstd
Adding default property: spark.hadoop.fs.s3a.multipart.threshold=128m
Adding default property: spark.sql.catalog.core.authentication.type=NONE
Adding default property: spark.sql.catalog.core.ref=main
Adding default property: spark.ssl.enabled=false
Adding default property: spark.hadoop.fs.s3a.path.style.access=true
Adding default property: spark.hadoop.fs.s3a.multipart.size=128m
Adding default property:
spark.hadoop.fs.s3a.aws.credentials.provider=org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider
Adding default property: spark.hadoop.fs.s3a.connection.maximum=100
Adding default property: spark.hadoop.fs.s3a.connection.timeout=100
Adding default property:
spark.sql.catalog.core.io-impl=org.apache.iceberg.aws.s3.S3FileIO
Adding default property: spark.sql.catalog.core.s3.endpoint=https://datalake.dev.<customer-name-redacted>.com
Adding default property: spark.hadoop.fs.s3a.paging.maximum=1000
Adding default property: spark.hadoop.fs.s3a.attempts.maximum=10
Adding default property: spark.sql.ansi.enabled=true
Adding default property: spark.hadoop.fs.s3a.region=eu-west-1
Adding default property: spark.sql.catalog.core.uri=https://nessie.dev.<customer-name-redacted>.com/iceberg
Adding default property:
spark.driver.extraJavaOptions=-Daws.java.v1.disableDeprecationAnnouncement=true
-Daws.region=us-east-1
Adding default property: spark.hadoop.fs.s3a.analytics.enabled=false
Adding default property: spark.hadoop.fs.s3a.endpoint=https://datalake.dev.<customer-name-redacted>.com
Adding default property: spark.sql.catalog.core.type=rest
Adding default property: spark.sql.catalog.core.warehouse=s3://core/
Adding default property: spark.hadoop.fs.s3a.endpoint.region=eu-west-1
Adding default property:
spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem
Adding default property: spark.sql.shuffle.partitions=1
Warning: Ignoring non-Spark config property:
log4j.logger.org.apache.hadoop.fs.s3a
Parsed arguments:
  master                  local[*]
  remote                  null
  deployMode              client
  executorMemory          null
  executorCores           null
  totalExecutorCores      null
  propertiesFile
 /home/hamish/WorkDocs/cx/<customer-name-redacted>/cf-dev/spark-defaults.conf
  driverMemory            null
  driverCores             null
  driverExtraClassPath    null
  driverExtraLibraryPath  null
  driverExtraJavaOptions  -Djavax.net.debug=ssl,handshake:verbose
-Daws.java.v1.disableDeprecationAnnouncement=true -Daws.region=us-east-1
  supervise               false
  queue                   null
  numExecutors            null
  files                   null
  pyFiles
file:/home/hamish/WorkDocs/cx/<customer-name-redacted>/cf-dev/<customer-name-redacted>.zip,file:/home/hamish/WorkDocs/cx/<customer-name-redacted>/cf-dev/datacore.zip
  archives                null
  mainClass               null
  primaryResource         pyspark-shell
  name                    PySparkShell
  childArgs               []
  jars                    null
  packages                org.apache.hadoop:hadoop-aws:3.3.4,org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.9.0,org.apache.iceberg:iceberg-aws-bundle:1.9.0,org.projectnessie.nessie-integrations:nessie-spark-extensions-3.5_2.12:0.103.6,org.postgresql:postgresql:42.7.3,org.apache.iceberg:iceberg-spark3-extensions:0.13.2
  packagesExclusions      null
  repositories            null
  verbose                 true

Spark properties used, including those specified through
 --conf and those from the properties file
/home/hamish/WorkDocs/cx/<customer-name-redacted>/cf-dev/spark-defaults.conf:
  (spark.driver.extraJavaOptions,-Djavax.net.debug=ssl,handshake:verbose
-Daws.java.v1.disableDeprecationAnnouncement=true -Daws.region=us-east-1)
  (spark.executor.extraJavaOptions,-Djavax.net.debug=ssl,handshake:verbose
-Daws.java.v1.disableDeprecationAnnouncement=true -Daws.region=us-east-1)
  (spark.hadoop.fs.s3a.access.key,*********(redacted))
  (spark.hadoop.fs.s3a.analytics.enabled,false)
  (spark.hadoop.fs.s3a.attempts.maximum,10)

(spark.hadoop.fs.s3a.aws.credentials.provider,org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider)
  (spark.hadoop.fs.s3a.connection.maximum,100)
  (spark.hadoop.fs.s3a.connection.ssl.enabled,false)
  (spark.hadoop.fs.s3a.connection.timeout,100)
  (spark.hadoop.fs.s3a.endpoint,https://datalake.dev.<customer-name-redacted>.com)
  (spark.hadoop.fs.s3a.endpoint.region,us-east-1)
  (spark.hadoop.fs.s3a.impl,org.apache.hadoop.fs.s3a.S3AFileSystem)
  (spark.hadoop.fs.s3a.multipart.size,128m)
  (spark.hadoop.fs.s3a.multipart.threshold,128m)
  (spark.hadoop.fs.s3a.paging.maximum,1000)
  (spark.hadoop.fs.s3a.path.style.access,true)
  (spark.hadoop.fs.s3a.region,us-east-1)
  (spark.hadoop.fs.s3a.secret.key,*********(redacted))
  (spark.hadoop.fs.s3a.ssl.verify,true)
  (spark.io.compression.codec,zstd)
  (spark.serializer,org.apache.spark.serializer.KryoSerializer)
  (spark.sql.ansi.enabled,true)
  (spark.sql.catalog.core,org.apache.iceberg.spark.SparkCatalog)
  (spark.sql.catalog.core.authentication.type,NONE)
  (spark.sql.catalog.core.io-impl,org.apache.iceberg.aws.s3.S3FileIO)
  (spark.sql.catalog.core.ref,main)
  (spark.sql.catalog.core.s3.endpoint,https://datalake.dev.<customer-name-redacted>.com)
  (spark.sql.catalog.core.type,rest)
  (spark.sql.catalog.core.uri,https://nessie.dev.<customer-name-redacted>.com/iceberg)
  (spark.sql.catalog.core.warehouse,s3://core/)

(spark.sql.extensions,org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions,org.projectnessie.spark.extensions.NessieSparkSessionExtensions)
  (spark.sql.shuffle.partitions,1)
  (spark.ssl.enabled,false)


:: loading settings :: url =
jar:file:/home/hamish/WorkDocs/cx/<customer-name-redacted>/cf-dev/spark-3.5.6-bin-hadoop3/jars/ivy-2.5.1.jar!/org/apache/ivy/core/settings/ivysettings.xml
Ivy Default Cache set to: /home/hamish/.ivy2/cache
The jars for the packages stored in: /home/hamish/.ivy2/jars
org.apache.hadoop#hadoop-aws added as a dependency
org.apache.iceberg#iceberg-spark-runtime-3.5_2.12 added as a dependency
org.apache.iceberg#iceberg-aws-bundle added as a dependency
org.projectnessie.nessie-integrations#nessie-spark-extensions-3.5_2.12
added as a dependency
org.postgresql#postgresql added as a dependency
org.apache.iceberg#iceberg-spark3-extensions added as a dependency
:: resolving dependencies ::
org.apache.spark#spark-submit-parent-ac98772d-9b6a-4e6e-b3b7-46d0bbd1b694;1.0
confs: [default]
found org.apache.hadoop#hadoop-aws;3.3.4 in central
found com.amazonaws#aws-java-sdk-bundle;1.12.262 in central
found org.wildfly.openssl#wildfly-openssl;1.0.7.Final in central
found org.apache.iceberg#iceberg-spark-runtime-3.5_2.12;1.9.0 in central
found org.apache.iceberg#iceberg-aws-bundle;1.9.0 in central
found
org.projectnessie.nessie-integrations#nessie-spark-extensions-3.5_2.12;0.103.6
in central
found org.postgresql#postgresql;42.7.3 in central
found org.checkerframework#checker-qual;3.42.0 in central
found org.apache.iceberg#iceberg-spark3-extensions;0.13.2 in central
found org.antlr#antlr4;4.7.1 in central
found org.antlr#antlr4-runtime;4.7.1 in central
found org.antlr#antlr-runtime;3.5.2 in central
found org.antlr#ST4;4.0.8 in central
found org.abego.treelayout#org.abego.treelayout.core;1.0.3 in central
found org.glassfish#javax.json;1.0.4 in central
found com.ibm.icu#icu4j;58.2 in central
found org.slf4j#slf4j-api;1.7.25 in local-m2-cache
found com.github.stephenc.findbugs#findbugs-annotations;1.3.9-1 in central
:: resolution report :: resolve 360ms :: artifacts dl 16ms
:: modules in use:
com.amazonaws#aws-java-sdk-bundle;1.12.262 from central in [default]
com.github.stephenc.findbugs#findbugs-annotations;1.3.9-1 from central in
[default]
com.ibm.icu#icu4j;58.2 from central in [default]
org.abego.treelayout#org.abego.treelayout.core;1.0.3 from central in
[default]
org.antlr#ST4;4.0.8 from central in [default]
org.antlr#antlr-runtime;3.5.2 from central in [default]
org.antlr#antlr4;4.7.1 from central in [default]
org.antlr#antlr4-runtime;4.7.1 from central in [default]
org.apache.hadoop#hadoop-aws;3.3.4 from central in [default]
org.apache.iceberg#iceberg-aws-bundle;1.9.0 from central in [default]
org.apache.iceberg#iceberg-spark-runtime-3.5_2.12;1.9.0 from central in
[default]
org.apache.iceberg#iceberg-spark3-extensions;0.13.2 from central in
[default]
org.checkerframework#checker-qual;3.42.0 from central in [default]
org.glassfish#javax.json;1.0.4 from central in [default]
org.postgresql#postgresql;42.7.3 from central in [default]
org.projectnessie.nessie-integrations#nessie-spark-extensions-3.5_2.12;0.103.6
from central in [default]
org.slf4j#slf4j-api;1.7.25 from local-m2-cache in [default]
org.wildfly.openssl#wildfly-openssl;1.0.7.Final from central in [default]
---------------------------------------------------------------------
|                  |            modules            ||   artifacts   |
|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
|      default     |   18  |   0   |   0   |   0   ||   18  |   0   |
---------------------------------------------------------------------
:: retrieving ::
org.apache.spark#spark-submit-parent-ac98772d-9b6a-4e6e-b3b7-46d0bbd1b694
confs: [default]
0 artifacts copied, 18 already retrieved (0kB/13ms)
25/09/01 12:59:55 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
Main class:
org.apache.spark.api.python.PythonGatewayServer
Arguments:

Spark config:
(spark.app.name,PySparkShell)
(spark.app.submitTime,1756724395155)
(spark.driver.extraJavaOptions,-Djavax.net.debug=ssl,handshake:verbose
-Daws.java.v1.disableDeprecationAnnouncement=true -Daws.region=us-east-1)
(spark.executor.extraJavaOptions,-Djavax.net.debug=ssl,handshake:verbose
-Daws.java.v1.disableDeprecationAnnouncement=true -Daws.region=us-east-1)
(spark.files,file:///home/hamish/WorkDocs/cx/<customer-name-redacted>/cf-dev/<customer-name-redacted>.zip,file:///home/hamish/WorkDocs/cx/<customer-name-redacted>/cf-dev/datacore.zip,file:///home/hamish/.ivy2/jars/org.apache.hadoop_hadoop-aws-3.3.4.jar,file:///home/hamish/.ivy2/jars/org.apache.iceberg_iceberg-spark-runtime-3.5_2.12-1.9.0.jar,file:///home/hamish/.ivy2/jars/org.apache.iceberg_iceberg-aws-bundle-1.9.0.jar,file:///home/hamish/.ivy2/jars/org.projectnessie.nessie-integrations_nessie-spark-extensions-3.5_2.12-0.103.6.jar,file:///home/hamish/.ivy2/jars/org.postgresql_postgresql-42.7.3.jar,file:///home/hamish/.ivy2/jars/org.apache.iceberg_iceberg-spark3-extensions-0.13.2.jar,file:///home/hamish/.ivy2/jars/com.amazonaws_aws-java-sdk-bundle-1.12.262.jar,file:///home/hamish/.ivy2/jars/org.wildfly.openssl_wildfly-openssl-1.0.7.Final.jar,file:///home/hamish/.ivy2/jars/org.checkerframework_checker-qual-3.42.0.jar,file:///home/hamish/.ivy2/jars/org.antlr_antlr4-4.7.1.jar,file:///home/hamish/.ivy2/jars/org.antlr_antlr4-runtime-4.7.1.jar,file:///home/hamish/.ivy2/jars/org.antlr_antlr-runtime-3.5.2.jar,file:///home/hamish/.ivy2/jars/org.antlr_ST4-4.0.8.jar,file:///home/hamish/.ivy2/jars/org.abego.treelayout_org.abego.treelayout.core-1.0.3.jar,file:///home/hamish/.ivy2/jars/org.glassfish_javax.json-1.0.4.jar,file:///home/hamish/.ivy2/jars/com.ibm.icu_icu4j-58.2.jar,file:///home/hamish/.ivy2/jars/org.slf4j_slf4j-api-1.7.25.jar,file:///home/hamish/.ivy2/jars/com.github.stephenc.findbugs_findbugs-annotations-1.3.9-1.jar)
(spark.hadoop.fs.s3a.access.key,*********(redacted))
(spark.hadoop.fs.s3a.analytics.enabled,false)
(spark.hadoop.fs.s3a.attempts.maximum,10)
(spark.hadoop.fs.s3a.aws.credentials.provider,org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider)
(spark.hadoop.fs.s3a.connection.maximum,100)
(spark.hadoop.fs.s3a.connection.ssl.enabled,false)
(spark.hadoop.fs.s3a.connection.timeout,100)
(spark.hadoop.fs.s3a.endpoint,https://datalake.dev.<customer-name-redacted>.com)
(spark.hadoop.fs.s3a.endpoint.region,us-east-1)
(spark.hadoop.fs.s3a.impl,org.apache.hadoop.fs.s3a.S3AFileSystem)
(spark.hadoop.fs.s3a.multipart.size,128m)
(spark.hadoop.fs.s3a.multipart.threshold,128m)
(spark.hadoop.fs.s3a.paging.maximum,1000)
(spark.hadoop.fs.s3a.path.style.access,true)
(spark.hadoop.fs.s3a.region,us-east-1)
(spark.hadoop.fs.s3a.secret.key,*********(redacted))
(spark.hadoop.fs.s3a.ssl.verify,true)
(spark.io.compression.codec,zstd)
(spark.jars,file:///home/hamish/.ivy2/jars/org.apache.hadoop_hadoop-aws-3.3.4.jar,file:///home/hamish/.ivy2/jars/org.apache.iceberg_iceberg-spark-runtime-3.5_2.12-1.9.0.jar,file:///home/hamish/.ivy2/jars/org.apache.iceberg_iceberg-aws-bundle-1.9.0.jar,file:///home/hamish/.ivy2/jars/org.projectnessie.nessie-integrations_nessie-spark-extensions-3.5_2.12-0.103.6.jar,file:///home/hamish/.ivy2/jars/org.postgresql_postgresql-42.7.3.jar,file:///home/hamish/.ivy2/jars/org.apache.iceberg_iceberg-spark3-extensions-0.13.2.jar,file:///home/hamish/.ivy2/jars/com.amazonaws_aws-java-sdk-bundle-1.12.262.jar,file:///home/hamish/.ivy2/jars/org.wildfly.openssl_wildfly-openssl-1.0.7.Final.jar,file:///home/hamish/.ivy2/jars/org.checkerframework_checker-qual-3.42.0.jar,file:///home/hamish/.ivy2/jars/org.antlr_antlr4-4.7.1.jar,file:///home/hamish/.ivy2/jars/org.antlr_antlr4-runtime-4.7.1.jar,file:///home/hamish/.ivy2/jars/org.antlr_antlr-runtime-3.5.2.jar,file:///home/hamish/.ivy2/jars/org.antlr_ST4-4.0.8.jar,file:///home/hamish/.ivy2/jars/org.abego.treelayout_org.abego.treelayout.core-1.0.3.jar,file:///home/hamish/.ivy2/jars/org.glassfish_javax.json-1.0.4.jar,file:///home/hamish/.ivy2/jars/com.ibm.icu_icu4j-58.2.jar,file:///home/hamish/.ivy2/jars/org.slf4j_slf4j-api-1.7.25.jar,file:///home/hamish/.ivy2/jars/com.github.stephenc.findbugs_findbugs-annotations-1.3.9-1.jar)
(spark.master,local[*])
(spark.repl.local.jars,file:///home/hamish/.ivy2/jars/org.apache.hadoop_hadoop-aws-3.3.4.jar,file:///home/hamish/.ivy2/jars/org.apache.iceberg_iceberg-spark-runtime-3.5_2.12-1.9.0.jar,file:///home/hamish/.ivy2/jars/org.apache.iceberg_iceberg-aws-bundle-1.9.0.jar,file:///home/hamish/.ivy2/jars/org.projectnessie.nessie-integrations_nessie-spark-extensions-3.5_2.12-0.103.6.jar,file:///home/hamish/.ivy2/jars/org.postgresql_postgresql-42.7.3.jar,file:///home/hamish/.ivy2/jars/org.apache.iceberg_iceberg-spark3-extensions-0.13.2.jar,file:///home/hamish/.ivy2/jars/com.amazonaws_aws-java-sdk-bundle-1.12.262.jar,file:///home/hamish/.ivy2/jars/org.wildfly.openssl_wildfly-openssl-1.0.7.Final.jar,file:///home/hamish/.ivy2/jars/org.checkerframework_checker-qual-3.42.0.jar,file:///home/hamish/.ivy2/jars/org.antlr_antlr4-4.7.1.jar,file:///home/hamish/.ivy2/jars/org.antlr_antlr4-runtime-4.7.1.jar,file:///home/hamish/.ivy2/jars/org.antlr_antlr-runtime-3.5.2.jar,file:///home/hamish/.ivy2/jars/org.antlr_ST4-4.0.8.jar,file:///home/hamish/.ivy2/jars/org.abego.treelayout_org.abego.treelayout.core-1.0.3.jar,file:///home/hamish/.ivy2/jars/org.glassfish_javax.json-1.0.4.jar,file:///home/hamish/.ivy2/jars/com.ibm.icu_icu4j-58.2.jar,file:///home/hamish/.ivy2/jars/org.slf4j_slf4j-api-1.7.25.jar,file:///home/hamish/.ivy2/jars/com.github.stephenc.findbugs_findbugs-annotations-1.3.9-1.jar)
(spark.serializer,org.apache.spark.serializer.KryoSerializer)
(spark.sql.ansi.enabled,true)
(spark.sql.catalog.core,org.apache.iceberg.spark.SparkCatalog)
(spark.sql.catalog.core.authentication.type,NONE)
(spark.sql.catalog.core.io-impl,org.apache.iceberg.aws.s3.S3FileIO)
(spark.sql.catalog.core.ref,main)
(spark.sql.catalog.core.s3.endpoint,https://datalake.dev.<customer-name-redacted>.com)
(spark.sql.catalog.core.type,rest)
(spark.sql.catalog.core.uri,https://nessie.dev.<customer-name-redacted>.com/iceberg)
(spark.sql.catalog.core.warehouse,s3://core/)
(spark.sql.extensions,org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions,org.projectnessie.spark.extensions.NessieSparkSessionExtensions)
(spark.sql.shuffle.partitions,1)
(spark.ssl.enabled,false)
(spark.submit.deployMode,client)
(spark.submit.pyFiles,/home/hamish/WorkDocs/cx/<customer-name-redacted>/cf-dev/<customer-name-redacted>.zip,/home/hamish/WorkDocs/cx/<customer-name-redacted>/cf-dev/datacore.zip,/home/hamish/.ivy2/jars/org.apache.hadoop_hadoop-aws-3.3.4.jar,/home/hamish/.ivy2/jars/org.apache.iceberg_iceberg-spark-runtime-3.5_2.12-1.9.0.jar,/home/hamish/.ivy2/jars/org.apache.iceberg_iceberg-aws-bundle-1.9.0.jar,/home/hamish/.ivy2/jars/org.projectnessie.nessie-integrations_nessie-spark-extensions-3.5_2.12-0.103.6.jar,/home/hamish/.ivy2/jars/org.postgresql_postgresql-42.7.3.jar,/home/hamish/.ivy2/jars/org.apache.iceberg_iceberg-spark3-extensions-0.13.2.jar,/home/hamish/.ivy2/jars/com.amazonaws_aws-java-sdk-bundle-1.12.262.jar,/home/hamish/.ivy2/jars/org.wildfly.openssl_wildfly-openssl-1.0.7.Final.jar,/home/hamish/.ivy2/jars/org.checkerframework_checker-qual-3.42.0.jar,/home/hamish/.ivy2/jars/org.antlr_antlr4-4.7.1.jar,/home/hamish/.ivy2/jars/org.antlr_antlr4-runtime-4.7.1.jar,/home/hamish/.ivy2/jars/org.antlr_antlr-runtime-3.5.2.jar,/home/hamish/.ivy2/jars/org.antlr_ST4-4.0.8.jar,/home/hamish/.ivy2/jars/org.abego.treelayout_org.abego.treelayout.core-1.0.3.jar,/home/hamish/.ivy2/jars/org.glassfish_javax.json-1.0.4.jar,/home/hamish/.ivy2/jars/com.ibm.icu_icu4j-58.2.jar,/home/hamish/.ivy2/jars/org.slf4j_slf4j-api-1.7.25.jar,/home/hamish/.ivy2/jars/com.github.stephenc.findbugs_findbugs-annotations-1.3.9-1.jar)
(spark.ui.showConsoleProgress,true)
Classpath elements:
file:///home/hamish/.ivy2/jars/org.apache.hadoop_hadoop-aws-3.3.4.jar
file:///home/hamish/.ivy2/jars/org.apache.iceberg_iceberg-spark-runtime-3.5_2.12-1.9.0.jar
file:///home/hamish/.ivy2/jars/org.apache.iceberg_iceberg-aws-bundle-1.9.0.jar
file:///home/hamish/.ivy2/jars/org.projectnessie.nessie-integrations_nessie-spark-extensions-3.5_2.12-0.103.6.jar
file:///home/hamish/.ivy2/jars/org.postgresql_postgresql-42.7.3.jar
file:///home/hamish/.ivy2/jars/org.apache.iceberg_iceberg-spark3-extensions-0.13.2.jar
file:///home/hamish/.ivy2/jars/com.amazonaws_aws-java-sdk-bundle-1.12.262.jar
file:///home/hamish/.ivy2/jars/org.wildfly.openssl_wildfly-openssl-1.0.7.Final.jar
file:///home/hamish/.ivy2/jars/org.checkerframework_checker-qual-3.42.0.jar
file:///home/hamish/.ivy2/jars/org.antlr_antlr4-4.7.1.jar
file:///home/hamish/.ivy2/jars/org.antlr_antlr4-runtime-4.7.1.jar
file:///home/hamish/.ivy2/jars/org.antlr_antlr-runtime-3.5.2.jar
file:///home/hamish/.ivy2/jars/org.antlr_ST4-4.0.8.jar
file:///home/hamish/.ivy2/jars/org.abego.treelayout_org.abego.treelayout.core-1.0.3.jar
file:///home/hamish/.ivy2/jars/org.glassfish_javax.json-1.0.4.jar
file:///home/hamish/.ivy2/jars/com.ibm.icu_icu4j-58.2.jar
file:///home/hamish/.ivy2/jars/org.slf4j_slf4j-api-1.7.25.jar
file:///home/hamish/.ivy2/jars/com.github.stephenc.findbugs_findbugs-annotations-1.3.9-1.jar


Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use
setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 3.5.6
      /_/

Using Python version 3.10.12 (main, Aug 15 2025 14:32:43)
Spark context Web UI available at http://192.168.0.219:4040
Spark context available as 'sc' (master = local[*], app id =
local-1756724396269).
SparkSession available as 'spark'.
Python 3.10.12 (main, Aug 15 2025, 14:32:43) [GCC 11.4.0]
Type 'copyright', 'credits' or 'license' for more information
IPython 8.4.0 -- An enhanced Interactive Python. Type '?' for help.

In [1]: 25/09/01 13:00:47 WARN MetricsConfig: Cannot locate configuration:
tried hadoop-metrics2-s3a-file-system.properties,hadoop-metrics2.properties
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.134
SAST|SSLCipher.java:466|jdk.tls.keyLimits:  entry = AES/GCM/NoPadding
KeyUpdate 2^37. AES/GCM/NOPADDING:KEYUPDATE = 137438953472
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.135
SAST|SSLCipher.java:466|jdk.tls.keyLimits:  entry =  ChaCha20-Poly1305
KeyUpdate 2^37. CHACHA20-POLY1305:KEYUPDATE = 137438953472
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.447
SAST|HandshakeContext.java:298|Ignore unsupported cipher suite:
TLS_AES_256_GCM_SHA384 for TLSv1.2
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.447
SAST|HandshakeContext.java:298|Ignore unsupported cipher suite:
TLS_AES_128_GCM_SHA256 for TLSv1.2
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.447
SAST|HandshakeContext.java:298|Ignore unsupported cipher suite:
TLS_CHACHA20_POLY1305_SHA256 for TLSv1.2
javax.net.ssl|INFO|41|Thread-4|2025-09-01 13:00:48.456
SAST|AlpnExtension.java:182|No available application protocols
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.456
SAST|SSLExtensions.java:272|Ignore, context unavailable extension:
application_layer_protocol_negotiation
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.457
SAST|SessionTicketExtension.java:353|Stateless resumption supported
javax.net.ssl|ALL|41|Thread-4|2025-09-01 13:00:48.457
SAST|SignatureScheme.java:412|Ignore disabled signature scheme: rsa_md5
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.457
SAST|SSLExtensions.java:272|Ignore, context unavailable extension: cookie
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.475
SAST|SSLExtensions.java:272|Ignore, context unavailable extension:
renegotiation_info
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.475
SAST|PreSharedKeyExtension.java:661|No session to resume.
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.475
SAST|SSLExtensions.java:272|Ignore, context unavailable extension:
pre_shared_key
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.480
SAST|ClientHello.java:639|Produced ClientHello handshake message (
"ClientHello": {
  "client version"      : "TLSv1.2",
  "random"              :
"A92B9585BFA4832C939BE725DDCF9B16853F612C71A22B1494198A544781CC3E",
  "session id"          :
"D6659DF9B0E8C7B323272528337F1C90CE6F0495F7AF238F43C1267D34936B9D",
  "cipher suites"       : "[TLS_AES_256_GCM_SHA384(0x1302),
TLS_AES_128_GCM_SHA256(0x1301), TLS_CHACHA20_POLY1305_SHA256(0x1303),
TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384(0xC02C),
TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256(0xC02B),
TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256(0xCCA9),
TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384(0xC030),
TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256(0xCCA8),
TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256(0xC02F),
TLS_DHE_RSA_WITH_AES_256_GCM_SHA384(0x009F),
TLS_DHE_RSA_WITH_CHACHA20_POLY1305_SHA256(0xCCAA),
TLS_DHE_DSS_WITH_AES_256_GCM_SHA384(0x00A3),
TLS_DHE_RSA_WITH_AES_128_GCM_SHA256(0x009E),
TLS_DHE_DSS_WITH_AES_128_GCM_SHA256(0x00A2),
TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384(0xC024),
TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384(0xC028),
TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256(0xC023),
TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256(0xC027),
TLS_DHE_RSA_WITH_AES_256_CBC_SHA256(0x006B),
TLS_DHE_DSS_WITH_AES_256_CBC_SHA256(0x006A),
TLS_DHE_RSA_WITH_AES_128_CBC_SHA256(0x0067),
TLS_DHE_DSS_WITH_AES_128_CBC_SHA256(0x0040),
TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA(0xC00A),
TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA(0xC014),
TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA(0xC009),
TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA(0xC013),
TLS_DHE_RSA_WITH_AES_256_CBC_SHA(0x0039),
TLS_DHE_DSS_WITH_AES_256_CBC_SHA(0x0038),
TLS_DHE_RSA_WITH_AES_128_CBC_SHA(0x0033),
TLS_DHE_DSS_WITH_AES_128_CBC_SHA(0x0032),
TLS_RSA_WITH_AES_256_GCM_SHA384(0x009D),
TLS_RSA_WITH_AES_128_GCM_SHA256(0x009C),
TLS_RSA_WITH_AES_256_CBC_SHA256(0x003D),
TLS_RSA_WITH_AES_128_CBC_SHA256(0x003C),
TLS_RSA_WITH_AES_256_CBC_SHA(0x0035), TLS_RSA_WITH_AES_128_CBC_SHA(0x002F),
TLS_EMPTY_RENEGOTIATION_INFO_SCSV(0x00FF)]",
  "compression methods" : "00",
  "extensions"          : [
    "server_name (0)": {
      type=host_name (0), value=datalake.dev.<customer-name-redacted>.com
    },
    "status_request (5)": {
      "certificate status type": ocsp
      "OCSP status request": {
        "responder_id": <empty>
        "request extensions": {
          <empty>
        }
      }
    },
    "supported_groups (10)": {
      "versions": [x25519, secp256r1, secp384r1, secp521r1, x448,
ffdhe2048, ffdhe3072, ffdhe4096, ffdhe6144, ffdhe8192]
    },
    "ec_point_formats (11)": {
      "formats": [uncompressed]
    },
    "status_request_v2 (17)": {
      "cert status request": {
        "certificate status type": ocsp_multi
        "OCSP status request": {
          "responder_id": <empty>
          "request extensions": {
            <empty>
          }
        }
      }
    },
    "extended_master_secret (23)": {
      <empty>
    },
    "session_ticket (35)": {
      <empty>
    },
    "signature_algorithms (13)": {
      "signature schemes": [ecdsa_secp256r1_sha256, ecdsa_secp384r1_sha384,
ecdsa_secp521r1_sha512, ed25519, ed448, rsa_pss_rsae_sha256,
rsa_pss_rsae_sha384, rsa_pss_rsae_sha512, rsa_pss_pss_sha256,
rsa_pss_pss_sha384, rsa_pss_pss_sha512, rsa_pkcs1_sha256, rsa_pkcs1_sha384,
rsa_pkcs1_sha512, dsa_sha256, ecdsa_sha224, rsa_sha224, dsa_sha224,
ecdsa_sha1, rsa_pkcs1_sha1, dsa_sha1]
    },
    "supported_versions (43)": {
      "versions": [TLSv1.3, TLSv1.2]
    },
    "psk_key_exchange_modes (45)": {
      "ke_modes": [psk_dhe_ke]
    },
    "signature_algorithms_cert (50)": {
      "signature schemes": [ecdsa_secp256r1_sha256, ecdsa_secp384r1_sha384,
ecdsa_secp521r1_sha512, ed25519, ed448, rsa_pss_rsae_sha256,
rsa_pss_rsae_sha384, rsa_pss_rsae_sha512, rsa_pss_pss_sha256,
rsa_pss_pss_sha384, rsa_pss_pss_sha512, rsa_pkcs1_sha256, rsa_pkcs1_sha384,
rsa_pkcs1_sha512, dsa_sha256, ecdsa_sha224, rsa_sha224, dsa_sha224,
ecdsa_sha1, rsa_pkcs1_sha1, dsa_sha1]
    },
    "key_share (51)": {
      "client_shares": [
        {
          "named group": x25519
          "key_exchange": {
            0000: 29 D9 58 6A D7 6A 22 4E   54 D2 15 0D 84 48 3D 88
 ).Xj.j"NT....H=.
            0010: F0 E9 58 30 BD DF DB 59   14 00 B0 C9 12 C9 F5 42
 ..X0...Y.......B
          }
        },
        {
          "named group": secp256r1
          "key_exchange": {
            0000: 04 24 90 37 E0 55 14 7E   76 AD 6D E8 A2 70 12 2D
 .$.7.U..v.m..p.-
            0010: 70 8D 74 77 8E 4B 39 05   BA 3E FB A5 3C B9 D3 3D
 p.tw.K9..>..<..=
            0020: EC 49 C6 B9 30 07 D1 94   DE D6 53 F5 4F AE EC B7
 .I..0.....S.O...
            0030: F4 49 46 87 DE 8B 86 E2   89 AD 9C AE FA AB 05 81
 .IF.............
            0040: E9
          }
        },
      ]
    }
  ]
}
)
javax.net.ssl|WARNING|41|Thread-4|2025-09-01 13:00:48.584
SAST|SSLSocketImpl.java:1676|handling exception (
"throwable" : {
  java.net.SocketTimeoutException: Read timed out
  at java.base/sun.nio.ch.NioSocketImpl.timedRead(NioSocketImpl.java:288)
  at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:314)
  at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:355)
  at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:808)
  at java.base/java.net.Socket$SocketInputStream.read(Socket.java:966)
  at
java.base/sun.security.ssl.SSLSocketInputRecord.read(SSLSocketInputRecord.java:484)
  at
java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(SSLSocketInputRecord.java:478)
  at
java.base/sun.security.ssl.SSLSocketInputRecord.decode(SSLSocketInputRecord.java:160)
  at java.base/sun.security.ssl.SSLTransport.decode(SSLTransport.java:111)
  at
java.base/sun.security.ssl.SSLSocketImpl.decode(SSLSocketImpl.java:1510)
  at
java.base/sun.security.ssl.SSLSocketImpl.readHandshakeRecord(SSLSocketImpl.java:1425)
  at
java.base/sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:455)
  at
java.base/sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:426)
  at
com.amazonaws.thirdparty.apache.http.conn.ssl.SSLConnectionSocketFactory.createLayeredSocket(SSLConnectionSocketFactory.java:436)
  at
com.amazonaws.thirdparty.apache.http.conn.ssl.SSLConnectionSocketFactory.connectSocket(SSLConnectionSocketFactory.java:384)
  at
com.amazonaws.http.conn.ssl.SdkTLSSocketFactory.connectSocket(SdkTLSSocketFactory.java:142)
  at
com.amazonaws.thirdparty.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:142)
  at
com.amazonaws.thirdparty.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:376)
  at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native
Method)
  at
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
  at
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.base/java.lang.reflect.Method.invoke(Method.java:569)
  at
com.amazonaws.http.conn.ClientConnectionManagerFactory$Handler.invoke(ClientConnectionManagerFactory.java:76)
  at com.amazonaws.http.conn.$Proxy33.connect(Unknown Source)
  at
com.amazonaws.thirdparty.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:393)
  at
com.amazonaws.thirdparty.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
  at
com.amazonaws.thirdparty.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
  at
com.amazonaws.thirdparty.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
  at
com.amazonaws.thirdparty.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
  at
com.amazonaws.thirdparty.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56)
  at
com.amazonaws.http.apache.client.impl.SdkHttpClient.execute(SdkHttpClient.java:72)
  at
com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1346)
  at
com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1157)
  at
com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:814)
  at
com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:781)
  at
com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:755)
  at
com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:715)
  at
com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:697)
  at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:561)
  at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:541)
  at
com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5456)
  at
com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5403)
  at
com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5397)
  at
com.amazonaws.services.s3.AmazonS3Client.listObjectsV2(AmazonS3Client.java:971)
  at
org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$listObjects$11(S3AFileSystem.java:2595)
  at
org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.lambda$trackDurationOfOperation$5(IOStatisticsBinding.java:499)
  at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:414)
  at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:377)
  at
org.apache.hadoop.fs.s3a.S3AFileSystem.listObjects(S3AFileSystem.java:2586)
  at
org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:3832)
  at
org.apache.hadoop.fs.s3a.S3AFileSystem.innerGetFileStatus(S3AFileSystem.java:3688)
  at
org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$isDirectory$35(S3AFileSystem.java:4724)
  at
org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.lambda$trackDurationOfOperation$5(IOStatisticsBinding.java:499)
  at
org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.trackDuration(IOStatisticsBinding.java:444)
  at
org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2337)
  at
org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2356)
  at
org.apache.hadoop.fs.s3a.S3AFileSystem.isDirectory(S3AFileSystem.java:4722)
  at
org.apache.spark.sql.execution.streaming.FileStreamSink$.hasMetadata(FileStreamSink.scala:54)
  at
org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:366)
  at
org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:229)
  at
org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:211)
  at scala.Option.getOrElse(Option.scala:189)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)
  at org.apache.spark.sql.DataFrameReader.parquet(DataFrameReader.scala:563)
  at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native
Method)
  at
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
  at
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.base/java.lang.reflect.Method.invoke(Method.java:569)
  at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
  at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:374)
  at py4j.Gateway.invoke(Gateway.java:282)
  at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
  at py4j.commands.CallCommand.execute(CallCommand.java:79)
  at
py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
  at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
  at java.base/java.lang.Thread.run(Thread.java:840)}

)
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.683
SAST|HandshakeContext.java:298|Ignore unsupported cipher suite:
TLS_AES_256_GCM_SHA384 for TLSv1.2
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.684
SAST|HandshakeContext.java:298|Ignore unsupported cipher suite:
TLS_AES_128_GCM_SHA256 for TLSv1.2
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.684
SAST|HandshakeContext.java:298|Ignore unsupported cipher suite:
TLS_CHACHA20_POLY1305_SHA256 for TLSv1.2
javax.net.ssl|INFO|41|Thread-4|2025-09-01 13:00:48.687
SAST|AlpnExtension.java:182|No available application protocols
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.687
SAST|SSLExtensions.java:272|Ignore, context unavailable extension:
application_layer_protocol_negotiation
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.688
SAST|SessionTicketExtension.java:353|Stateless resumption supported
javax.net.ssl|ALL|41|Thread-4|2025-09-01 13:00:48.689
SAST|SignatureScheme.java:412|Ignore disabled signature scheme: rsa_md5
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.689
SAST|SSLExtensions.java:272|Ignore, context unavailable extension: cookie
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.708
SAST|SSLExtensions.java:272|Ignore, context unavailable extension:
renegotiation_info
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.708
SAST|PreSharedKeyExtension.java:661|No session to resume.
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.709
SAST|SSLExtensions.java:272|Ignore, context unavailable extension:
pre_shared_key
javax.net.ssl|DEBUG|41|Thread-4|2025-09-01 13:00:48.714
SAST|ClientHello.java:639|Produced ClientHello handshake message (
"ClientHello": {
  "client version"      : "TLSv1.2",
  "random"              :
"F9790D79790B66A2854BFA5462FB5BAADEB9B6FD6C8C41ABAF45612F6845C35B",
  "session id"          :
"27B3062A1575936D382CFCE5B0873FABFC7BC2A0EFE564F21A6BB71EBA0D40F4",
  "cipher suites"       : "[TLS_AES_256_GCM_SHA384(0x1302),
TLS_AES_128_GCM_SHA256(0x1301), TLS_CHACHA20_POLY1305_SHA256(0x1303),
TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384(0xC02C),
TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256(0xC02B),
TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256(0xCCA9),
TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384(0xC030),
TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256(0xCCA8),
TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256(0xC02F),
TLS_DHE_RSA_WITH_AES_256_GCM_SHA384(0x009F),
TLS_DHE_RSA_WITH_CHACHA20_POLY1305_SHA256(0xCCAA),
TLS_DHE_DSS_WITH_AES_256_GCM_SHA384(0x00A3),
TLS_DHE_RSA_WITH_AES_128_GCM_SHA256(0x009E),
TLS_DHE_DSS_WITH_AES_128_GCM_SHA256(0x00A2),
TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384(0xC024),
TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384(0xC028),
TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256(0xC023),
TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256(0xC027),
TLS_DHE_RSA_WITH_AES_256_CBC_SHA256(0x006B),
TLS_DHE_DSS_WITH_AES_256_CBC_SHA256(0x006A),
TLS_DHE_RSA_WITH_AES_128_CBC_SHA256(0x0067),
TLS_DHE_DSS_WITH_AES_128_CBC_SHA256(0x0040),
TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA(0xC00A),
TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA(0xC014),
TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA(0xC009),
TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA(0xC013),
TLS_DHE_RSA_WITH_AES_256_CBC_SHA(0x0039),
TLS_DHE_DSS_WITH_AES_256_CBC_SHA(0x0038),
TLS_DHE_RSA_WITH_AES_128_CBC_SHA(0x0033),
TLS_DHE_DSS_WITH_AES_128_CBC_SHA(0x0032),
TLS_RSA_WITH_AES_256_GCM_SHA384(0x009D),
TLS_RSA_WITH_AES_128_GCM_SHA256(0x009C),
TLS_RSA_WITH_AES_256_CBC_SHA256(0x003D),
TLS_RSA_WITH_AES_128_CBC_SHA256(0x003C),
TLS_RSA_WITH_AES_256_CBC_SHA(0x0035), TLS_RSA_WITH_AES_128_CBC_SHA(0x002F),
TLS_EMPTY_RENEGOTIATION_INFO_SCSV(0x00FF)]",
  "compression methods" : "00",
  "extensions"          : [
    "server_name (0)": {
      type=host_name (0), value=datalake.dev.<customer-name-redacted>.com
    },
    "status_request (5)": {
      "certificate status type": ocsp
      "OCSP status request": {
        "responder_id": <empty>
        "request extensions": {
          <empty>
        }
      }
    },
    "supported_groups (10)": {
      "versions": [x25519, secp256r1, secp384r1, secp521r1, x448,
ffdhe2048, ffdhe3072, ffdhe4096, ffdhe6144, ffdhe8192]
    },
    "ec_point_formats (11)": {
      "formats": [uncompressed]
    },
    "status_request_v2 (17)": {
      "cert status request": {
        "certificate status type": ocsp_multi
        "OCSP status request": {
          "responder_id": <empty>
          "request extensions": {
            <empty>
          }
        }
      }
    },
    "extended_master_secret (23)": {
      <empty>
    },
    "session_ticket (35)": {
      <empty>
    },
    "signature_algorithms (13)": {
      "signature schemes": [ecdsa_secp256r1_sha256, ecdsa_secp384r1_sha384,
ecdsa_secp521r1_sha512, ed25519, ed448, rsa_pss_rsae_sha256,
rsa_pss_rsae_sha384, rsa_pss_rsae_sha512, rsa_pss_pss_sha256,
rsa_pss_pss_sha384, rsa_pss_pss_sha512, rsa_pkcs1_sha256, rsa_pkcs1_sha384,
rsa_pkcs1_sha512, dsa_sha256, ecdsa_sha224, rsa_sha224, dsa_sha224,
ecdsa_sha1, rsa_pkcs1_sha1, dsa_sha1]
    },
    "supported_versions (43)": {
      "versions": [TLSv1.3, TLSv1.2]
    },
    "psk_key_exchange_modes (45)": {
      "ke_modes": [psk_dhe_ke]
    },
    "signature_algorithms_cert (50)": {
      "signature schemes": [ecdsa_secp256r1_sha256, ecdsa_secp384r1_sha384,
ecdsa_secp521r1_sha512, ed25519, ed448, rsa_pss_rsae_sha256,
rsa_pss_rsae_sha384, rsa_pss_rsae_sha512, rsa_pss_pss_sha256,
rsa_pss_pss_sha384, rsa_pss_pss_sha512, rsa_pkcs1_sha256, rsa_pkcs1_sha384,
rsa_pkcs1_sha512, dsa_sha256, ecdsa_sha224, rsa_sha224, dsa_sha224,
ecdsa_sha1, rsa_pkcs1_sha1, dsa_sha1]
    },
    "key_share (51)": {
      "client_shares": [
        {
          "named group": x25519
          "key_exchange": {
            0000: B0 65 29 F7 FD 73 51 C2   2C 3C 78 A9 5C E8 BF 2D
 .e)..sQ.,<x.\..-
            0010: 57 3E EF 05 94 AC 98 8D   20 E0 BA A6 1D 53 6A 7E
 W>...... ....Sj.
          }
        },
        {
          "named group": secp256r1
          "key_exchange": {
            0000: 04 E6 D9 22 A3 6D 23 FE   DD F4 F9 15 32 FD A0 E9
 ...".m#.....2...
            0010: 95 E2 79 A8 A2 7A 1D 11   1D 0F 50 84 5C 27 5D 05
 ..y..z....P.\'].
            0020: F2 D1 AD 32 E8 9C A9 B6   85 49 6A 9C E9 95 63 7E
 ...2.....Ij...c.
            0030: F3 54 EE 6B 3C A3 D9 FC   2E 59 86 C5 74 9A 4B 47
 .T.k<....Y..t.KG
            0040: E1
          }
        },
      ]
    }
  ]
}
)
javax.net.ssl|WARNING|41|Thread-4|2025-09-01 13:00:48.818
SAST|SSLSocketImpl.java:1676|handling exception (
"throwable" : {
  java.net.SocketTimeoutException: Read timed out
  at java.base/sun.nio.ch.NioSocketImpl.timedRead(NioSocketImpl.java:288)
