manjum-a11y commented on issue #14942:
URL: https://github.com/apache/iceberg/issues/14942#issuecomment-3713980538

   Thank you for getting back!
   Here are the full logs:
   ```
   Traceback (most recent call last):
     File "<stdin>", line 1, in <module>
     File "/opt/spark/python/pyspark/sql/dataframe.py", line 1240, in count
       return int(self._jdf.count())
     File "/opt/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1322, in __call__
     File "/opt/spark/python/pyspark/errors/exceptions/captured.py", line 179, in deco
       return f(*a, **kw)
     File "/opt/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/protocol.py", line 326, in get_return_value
   py4j.protocol.Py4JJavaError: An error occurred while calling o315.count.
   : org.apache.spark.SparkException: Job aborted due to stage failure: Task 38448 in stage 1.0 failed 4 times, most recent failure: Lost task 38448.3 in stage 1.0 (TID 38579) (10.30.15.179 executor 51): java.lang.NullPointerException: Encountered a null value when resolving configuration attributes. This is commonly caused by concurrent modifications to non-thread-safe types. Ensure you're synchronizing access to all non-thread-safe types.
           at software.amazon.awssdk.utils.Validate.notNull(Validate.java:119)
           at software.amazon.awssdk.utils.AttributeMap$Builder.resolveValue(AttributeMap.java:396)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at software.amazon.awssdk.utils.AttributeMap$Builder.build(AttributeMap.java:362)
           at software.amazon.awssdk.core.client.builder.SdkDefaultClientBuilder.asyncClientConfiguration(SdkDefaultClientBuilder.java:222)
           at software.amazon.awssdk.services.s3control.DefaultS3ControlAsyncClientBuilder.buildClient(DefaultS3ControlAsyncClientBuilder.java:38)
           at software.amazon.awssdk.services.s3control.DefaultS3ControlAsyncClientBuilder.buildClient(DefaultS3ControlAsyncClientBuilder.java:25)
           at software.amazon.awssdk.core.client.builder.SdkDefaultClientBuilder.build(SdkDefaultClientBuilder.java:169)
           at software.amazon.awssdk.s3accessgrants.plugin.S3AccessGrantsIdentityProvider.resolveIdentity(S3AccessGrantsIdentityProvider.java:151)
           at software.amazon.awssdk.services.s3.auth.scheme.internal.S3AuthSchemeInterceptor.lambda$trySelectAuthScheme$6(S3AuthSchemeInterceptor.java:169)
           at software.amazon.awssdk.core.internal.util.MetricUtils.reportDuration(MetricUtils.java:80)
           at software.amazon.awssdk.services.s3.auth.scheme.internal.S3AuthSchemeInterceptor.trySelectAuthScheme(S3AuthSchemeInterceptor.java:169)
           at software.amazon.awssdk.services.s3.auth.scheme.internal.S3AuthSchemeInterceptor.selectAuthScheme(S3AuthSchemeInterceptor.java:87)
           at software.amazon.awssdk.services.s3.auth.scheme.internal.S3AuthSchemeInterceptor.beforeExecution(S3AuthSchemeInterceptor.java:67)
           at software.amazon.awssdk.core.interceptor.ExecutionInterceptorChain.lambda$beforeExecution$1(ExecutionInterceptorChain.java:59)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at software.amazon.awssdk.core.interceptor.ExecutionInterceptorChain.beforeExecution(ExecutionInterceptorChain.java:59)
           at software.amazon.awssdk.awscore.internal.AwsExecutionContextBuilder.runInitialInterceptors(AwsExecutionContextBuilder.java:254)
           at software.amazon.awssdk.awscore.internal.AwsExecutionContextBuilder.invokeInterceptorsAndCreateExecutionContext(AwsExecutionContextBuilder.java:144)
           at software.amazon.awssdk.awscore.client.handler.AwsSyncClientHandler.invokeInterceptorsAndCreateExecutionContext(AwsSyncClientHandler.java:67)
           at software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.lambda$execute$0(BaseSyncClientHandler.java:62)
           at software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.measureApiCallSuccess(BaseSyncClientHandler.java:182)
           at software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.execute(BaseSyncClientHandler.java:60)
           at software.amazon.awssdk.core.client.handler.SdkSyncClientHandler.execute(SdkSyncClientHandler.java:52)
           at software.amazon.awssdk.awscore.client.handler.AwsSyncClientHandler.execute(AwsSyncClientHandler.java:60)
           at software.amazon.awssdk.services.s3.DefaultS3Client.getObject(DefaultS3Client.java:5863)
           at org.apache.iceberg.aws.s3.S3InputStream.openStream(S3InputStream.java:240)
           at org.apache.iceberg.aws.s3.S3InputStream.openStream(S3InputStream.java:225)
           at org.apache.iceberg.aws.s3.S3InputStream.positionStream(S3InputStream.java:221)
           at org.apache.iceberg.aws.s3.S3InputStream.read(S3InputStream.java:122)
           at org.apache.iceberg.shaded.org.apache.parquet.io.DelegatingSeekableInputStream.read(DelegatingSeekableInputStream.java:61)
           at org.apache.iceberg.shaded.org.apache.parquet.bytes.BytesUtils.readIntLittleEndian(BytesUtils.java:83)
           at org.apache.iceberg.shaded.org.apache.parquet.hadoop.ParquetFileReader.readFooter(ParquetFileReader.java:556)
           at org.apache.iceberg.shaded.org.apache.parquet.hadoop.ParquetFileReader.<init>(ParquetFileReader.java:799)
           at org.apache.iceberg.shaded.org.apache.parquet.hadoop.ParquetFileReader.open(ParquetFileReader.java:666)
           at org.apache.iceberg.parquet.ReadConf.newReader(ReadConf.java:238)
           at org.apache.iceberg.parquet.ReadConf.<init>(ReadConf.java:81)
           at org.apache.iceberg.parquet.VectorizedParquetReader.init(VectorizedParquetReader.java:90)
           at org.apache.iceberg.parquet.VectorizedParquetReader.iterator(VectorizedParquetReader.java:99)
           at org.apache.iceberg.spark.source.BatchDataReader.open(BatchDataReader.java:109)
           at org.apache.iceberg.spark.source.BatchDataReader.open(BatchDataReader.java:41)
           at org.apache.iceberg.spark.source.BaseReader.next(BaseReader.java:143)
           at org.apache.spark.sql.execution.datasources.v2.PartitionIterator.hasNext(DataSourceRDD.scala:120)
           at org.apache.spark.sql.execution.datasources.v2.MetricsIterator.hasNext(DataSourceRDD.scala:158)
           at org.apache.spark.sql.execution.datasources.v2.DataSourceRDD$$anon$1.$anonfun$hasNext$1(DataSourceRDD.scala:63)
           at org.apache.spark.sql.execution.datasources.v2.DataSourceRDD$$anon$1.$anonfun$hasNext$1$adapted(DataSourceRDD.scala:63)
           at scala.Option.exists(Option.scala:406)
           at org.apache.spark.sql.execution.datasources.v2.DataSourceRDD$$anon$1.hasNext(DataSourceRDD.scala:63)
           at org.apache.spark.sql.execution.datasources.v2.DataSourceRDD$$anon$1.advanceToNextIter(DataSourceRDD.scala:97)
           at org.apache.spark.sql.execution.datasources.v2.DataSourceRDD$$anon$1.hasNext(DataSourceRDD.scala:63)
           at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
           at scala.collection.Iterator$$anon$9.hasNext(Iterator.scala:576)
           at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.columnartorow_nextBatch_0$(Unknown Source)
           at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
           at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
           at org.apache.spark.sql.execution.WholeStageCodegenEvaluatorFactory$WholeStageCodegenPartitionEvaluator$$anon$1.hasNext(WholeStageCodegenEvaluatorFactory.scala:43)
           at scala.collection.Iterator$$anon$9.hasNext(Iterator.scala:576)
           at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:140)
           at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
           at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:104)
           at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:54)
           at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)
           at org.apache.spark.scheduler.Task.run(Task.scala:141)
           at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:621)
           at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
           at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
           at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94)
           at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:624)
           at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
           at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
           at java.base/java.lang.Thread.run(Thread.java:840)

   Driver stacktrace:
           at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2898)
           at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2834)
           at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2833)
           at scala.collection.immutable.List.foreach(List.scala:333)
           at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2833)
           at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1253)
           at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1253)
           at scala.Option.foreach(Option.scala:437)
           at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1253)
           at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:3102)
           at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:3036)
           at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:3025)
           at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
   Caused by: java.lang.NullPointerException: Encountered a null value when resolving configuration attributes. This is commonly caused by concurrent modifications to non-thread-safe types. Ensure you're synchronizing access to all non-thread-safe types.
           at software.amazon.awssdk.utils.Validate.notNull(Validate.java:119)
           at software.amazon.awssdk.utils.AttributeMap$Builder.resolveValue(AttributeMap.java:396)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at software.amazon.awssdk.utils.AttributeMap$Builder.build(AttributeMap.java:362)
           at software.amazon.awssdk.core.client.builder.SdkDefaultClientBuilder.asyncClientConfiguration(SdkDefaultClientBuilder.java:222)
           at software.amazon.awssdk.services.s3control.DefaultS3ControlAsyncClientBuilder.buildClient(DefaultS3ControlAsyncClientBuilder.java:38)
           at software.amazon.awssdk.services.s3control.DefaultS3ControlAsyncClientBuilder.buildClient(DefaultS3ControlAsyncClientBuilder.java:25)
           at software.amazon.awssdk.core.client.builder.SdkDefaultClientBuilder.build(SdkDefaultClientBuilder.java:169)
           at software.amazon.awssdk.s3accessgrants.plugin.S3AccessGrantsIdentityProvider.resolveIdentity(S3AccessGrantsIdentityProvider.java:151)
           at software.amazon.awssdk.services.s3.auth.scheme.internal.S3AuthSchemeInterceptor.lambda$trySelectAuthScheme$6(S3AuthSchemeInterceptor.java:169)
           at software.amazon.awssdk.core.internal.util.MetricUtils.reportDuration(MetricUtils.java:80)
           at software.amazon.awssdk.services.s3.auth.scheme.internal.S3AuthSchemeInterceptor.trySelectAuthScheme(S3AuthSchemeInterceptor.java:169)
           at software.amazon.awssdk.services.s3.auth.scheme.internal.S3AuthSchemeInterceptor.selectAuthScheme(S3AuthSchemeInterceptor.java:87)
           at software.amazon.awssdk.services.s3.auth.scheme.internal.S3AuthSchemeInterceptor.beforeExecution(S3AuthSchemeInterceptor.java:67)
           at software.amazon.awssdk.core.interceptor.ExecutionInterceptorChain.lambda$beforeExecution$1(ExecutionInterceptorChain.java:59)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at software.amazon.awssdk.core.interceptor.ExecutionInterceptorChain.beforeExecution(ExecutionInterceptorChain.java:59)
           at software.amazon.awssdk.awscore.internal.AwsExecutionContextBuilder.runInitialInterceptors(AwsExecutionContextBuilder.java:254)
           at software.amazon.awssdk.awscore.internal.AwsExecutionContextBuilder.invokeInterceptorsAndCreateExecutionContext(AwsExecutionContextBuilder.java:144)
           at software.amazon.awssdk.awscore.client.handler.AwsSyncClientHandler.invokeInterceptorsAndCreateExecutionContext(AwsSyncClientHandler.java:67)
           at software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.lambda$execute$0(BaseSyncClientHandler.java:62)
           at software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.measureApiCallSuccess(BaseSyncClientHandler.java:182)
           at software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.execute(BaseSyncClientHandler.java:60)
           at software.amazon.awssdk.core.client.handler.SdkSyncClientHandler.execute(SdkSyncClientHandler.java:52)
           at software.amazon.awssdk.awscore.client.handler.AwsSyncClientHandler.execute(AwsSyncClientHandler.java:60)
           at software.amazon.awssdk.services.s3.DefaultS3Client.getObject(DefaultS3Client.java:5863)
           at org.apache.iceberg.aws.s3.S3InputStream.openStream(S3InputStream.java:240)
           at org.apache.iceberg.aws.s3.S3InputStream.openStream(S3InputStream.java:225)
           at org.apache.iceberg.aws.s3.S3InputStream.positionStream(S3InputStream.java:221)
           at org.apache.iceberg.aws.s3.S3InputStream.read(S3InputStream.java:122)
           at org.apache.iceberg.shaded.org.apache.parquet.io.DelegatingSeekableInputStream.read(DelegatingSeekableInputStream.java:61)
           at org.apache.iceberg.shaded.org.apache.parquet.bytes.BytesUtils.readIntLittleEndian(BytesUtils.java:83)
           at org.apache.iceberg.shaded.org.apache.parquet.hadoop.ParquetFileReader.readFooter(ParquetFileReader.java:556)
           at org.apache.iceberg.shaded.org.apache.parquet.hadoop.ParquetFileReader.<init>(ParquetFileReader.java:799)
           at org.apache.iceberg.shaded.org.apache.parquet.hadoop.ParquetFileReader.open(ParquetFileReader.java:666)
           at org.apache.iceberg.parquet.ReadConf.newReader(ReadConf.java:238)
           at org.apache.iceberg.parquet.ReadConf.<init>(ReadConf.java:81)
           at org.apache.iceberg.parquet.VectorizedParquetReader.init(VectorizedParquetReader.java:90)
           at org.apache.iceberg.parquet.VectorizedParquetReader.iterator(VectorizedParquetReader.java:99)
           at org.apache.iceberg.spark.source.BatchDataReader.open(BatchDataReader.java:109)
           at org.apache.iceberg.spark.source.BatchDataReader.open(BatchDataReader.java:41)
           at org.apache.iceberg.spark.source.BaseReader.next(BaseReader.java:143)
           at org.apache.spark.sql.execution.datasources.v2.PartitionIterator.hasNext(DataSourceRDD.scala:120)
           at org.apache.spark.sql.execution.datasources.v2.MetricsIterator.hasNext(DataSourceRDD.scala:158)
           at org.apache.spark.sql.execution.datasources.v2.DataSourceRDD$$anon$1.$anonfun$hasNext$1(DataSourceRDD.scala:63)
           at org.apache.spark.sql.execution.datasources.v2.DataSourceRDD$$anon$1.$anonfun$hasNext$1$adapted(DataSourceRDD.scala:63)
           at scala.Option.exists(Option.scala:406)
           at org.apache.spark.sql.execution.datasources.v2.DataSourceRDD$$anon$1.hasNext(DataSourceRDD.scala:63)
           at org.apache.spark.sql.execution.datasources.v2.DataSourceRDD$$anon$1.advanceToNextIter(DataSourceRDD.scala:97)
           at org.apache.spark.sql.execution.datasources.v2.DataSourceRDD$$anon$1.hasNext(DataSourceRDD.scala:63)
           at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
           at scala.collection.Iterator$$anon$9.hasNext(Iterator.scala:576)
           at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.columnartorow_nextBatch_0$(Unknown Source)
           at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
           at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
           at org.apache.spark.sql.execution.WholeStageCodegenEvaluatorFactory$WholeStageCodegenPartitionEvaluator$$anon$1.hasNext(WholeStageCodegenEvaluatorFactory.scala:43)
           at scala.collection.Iterator$$anon$9.hasNext(Iterator.scala:576)
           at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:140)
           at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
           at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:104)
           at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:54)
           at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)
           at org.apache.spark.scheduler.Task.run(Task.scala:141)
           at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:621)
           at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
           at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
           at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94)
           at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:624)
           at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
           at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
           at java.base/java.lang.Thread.run(Thread.java:840)
   ```
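
   For context, the failure is triggered by a plain `count()` over the table. A minimal sketch of the driver-side call (the catalog and table names below are placeholders, not the real ones):

   ```python
   # Minimal repro sketch; "my_catalog.db.my_table" is a placeholder for the
   # actual Iceberg table stored on S3. The count() forces a full table scan,
   # so many executor tasks open S3InputStreams concurrently; per the frames
   # above, the NPE surfaces while the AWS SDK's S3 Access Grants identity
   # provider (S3AccessGrantsIdentityProvider.resolveIdentity) builds an
   # S3Control async client under that concurrency.
   df = spark.table("my_catalog.db.my_table")
   df.count()  # raises the Py4JJavaError wrapping the SparkException above
   ```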

