[ https://issues.apache.org/jira/browse/NIFI-11003?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17651588#comment-17651588 ]

Mark Bathori commented on NIFI-11003:
-------------------------------------

Unfortunately, you need to make a custom build of the _*nifi-iceberg-bundle*_ 
module, using either of the cloud profiles mentioned above, to be able to use 
the processor with S3. The _*include-hadoop-aws*_ profile contains only the 
S3-specific dependencies, while the _*include-hadoop-cloud-storage*_ profile 
additionally includes the Azure- and GCP-related dependencies.
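
For reference, a minimal build sketch with the _*include-hadoop-aws*_ 
profile (the module path nifi-nar-bundles/nifi-iceberg-bundle is an 
assumption based on the 1.19.x source layout; adjust it to your checkout):

    # run from the root of the NiFi source tree;
    # -pl selects the iceberg bundle, -am builds its upstream modules,
    # -P activates the S3 dependency profile named in the comment above
    mvn clean install -pl nifi-nar-bundles/nifi-iceberg-bundle -am -P include-hadoop-aws

The rebuilt NAR files can then replace the stock nifi-iceberg NARs in 
your NiFi lib directory before restarting NiFi.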

> PutIceberg Fails to write to S3
> -------------------------------
>
>                 Key: NIFI-11003
>                 URL: https://issues.apache.org/jira/browse/NIFI-11003
>             Project: Apache NiFi
>          Issue Type: Bug
>          Components: Core Framework
>    Affects Versions: 1.19.0, 1.19.1
>         Environment: Java11/17
> containerized deployment on RHEL 7.9
>            Reporter: Adam Turley
>            Priority: Major
>         Attachments: iceberg.png
>
>
> PutIceberg fails with the following error when writing to S3. There are 
> no issues when writing to an Iceberg table on HDFS:
> "PutIceberg Failed to load table from catalog: java.lang.RuntimeException: 
> java.lang.ClassNotFoundException: Class 
> org.apache.hadoop.fs.s3a.S3AFileSystem not found
> - Caused by: java.lang.ClassNotFoundException: Class 
> org.apache.hadoop.fs.s3a.S3AFileSystem not found"



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
