[jira] [Commented] (NIFI-2562) PutHDFS writes corrupted data in the transparent disk encryption zone

2020-06-20 Thread Dennis Jaheruddin (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-2562?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17141233#comment-17141233
 ] 

Dennis Jaheruddin commented on NIFI-2562:
-

Recommending closing this issue, as recent versions of NiFi no longer include 
these old Hadoop libraries.

> PutHDFS writes corrupted data in the transparent disk encryption zone
> -
>
> Key: NIFI-2562
> URL: https://issues.apache.org/jira/browse/NIFI-2562
> Project: Apache NiFi
> Issue Type: Bug
> Components: Core Framework
> Affects Versions: 0.6.0
> Reporter: Vik
> Priority: Critical
> Labels: encryption, security
> Attachments: HdfsCorrupted.jpg, NiFi-PutHDFS.jpg
>
>
> Problem 1: UnknownHostException
> When NiFi tried to ingest files into an HDFS encryption zone, it threw an 
> UnknownHostException.
> Reason: In the Hadoop configuration files, such as core-site.xml and 
> hdfs-site.xml, the KMS hosts were listed in the following format: 
> "h...@xxx1.int..com; xxx2.int..com:16000". 
> Because NiFi was using old Hadoop libraries (2.6.2), it could not resolve the 
> two hosts; instead it treated them as a single host and threw an 
> UnknownHostException.
> We tried a couple of different fixes for this. 
> Fix 1: Changing the configuration property from:
> <property>
>   <name>hadoop.security.key.provider.path</name>
>   <value>kms://h...@.int..com;.int..com:16000/kms</value>
> </property>
> to:
> <property>
>   <name>hadoop.security.key.provider.path</name>
>   <value>kms://h...@.int..com:16000/kms</value>
> </property>
> Fix 2: Building the NiFi nar files with the Hadoop version installed on our 
> system (2.6.0-cdh5.7.0).
> Steps followed:
> a) Changed the Hadoop version in the NiFi pom file from 2.6.2 to 2.6.0-cdh5.7.0.
> b) Ran mvn clean package -DskipTests
> c) Copied the following nar files to /opt/nifi-dev/lib:
> ./nifi-nar-bundles/nifi-hadoop-bundle/nifi-hadoop-nar/target/nifi-hadoop-nar-1.0.0-SNAPSHOT.nar
> ./nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/target/nifi-hadoop-libraries-nar-1.0.0-SNAPSHOT.nar
> ./nifi-nar-bundles/nifi-hbase-bundle/nifi-hbase-nar/target/nifi-hbase-nar-1.0.0-SNAPSHOT.nar
> ./nifi-nar-bundles/nifi-standard-services/nifi-http-context-map-bundle/nifi-http-context-map-nar/target/nifi-http-context-map-nar-1.0.0-SNAPSHOT.nar
> d) Restarted NiFi with bin/nifi.sh restart
> These fixes resolved the UnknownHostException for us, but we ran into Problem 
> 2, described below.
> Problem 2: Ingesting corrupted data into the HDFS encryption zone
> After resolving the UnknownHostException, NiFi was able to ingest files into 
> the encryption zone, but the content of the files was corrupted. 
> Approaches:
> Tried to reproduce the error with a sample Java program that uses similar 
> logic and the same library, but it ingested files into the encryption zone 
> without any problem. (A sketch of such a test appears after this description.)
> Checked the NiFi log files for the cause; found that NiFi makes HTTP requests 
> to the KMS to decrypt keys, but could not investigate further as no error is 
> logged.
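A minimal sketch of the kind of standalone write test described under 
Approaches above. This is not the reporter's actual program; the configuration 
paths, the target path, and the sample text are placeholders:

{code:java}
import java.nio.charset.StandardCharsets;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class EncryptionZoneWriteTest {
    public static void main(String[] args) throws Exception {
        // Load the same client configuration NiFi would use (placeholder paths).
        Configuration conf = new Configuration();
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
        conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));

        FileSystem fs = FileSystem.get(conf);
        // Placeholder path inside an encryption zone.
        Path target = new Path("/enc-zone/sample.txt");
        byte[] payload = "play framework is built in scala"
                .getBytes(StandardCharsets.UTF_8);

        // Single create-write-close cycle, mirroring a one-file ingest.
        try (FSDataOutputStream out = fs.create(target, true)) {
            out.write(payload);
            out.hflush();
        }
        System.out.println("wrote " + payload.length + " bytes to " + target);
    }
}
{code}

If a program like this succeeds while PutHDFS corrupts the same payload, the 
difference would presumably lie in how each obtains and uses the client 
configuration, which is what the thread below goes on to probe.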





[jira] [Commented] (NIFI-2562) PutHDFS writes corrupted data in the transparent disk encryption zone

2016-08-22 Thread Vik (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2562?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15431052#comment-15431052
 ] 

Vik commented on NIFI-2562:
---

Checking in to see if you can suggest any other workarounds for this issue at 
this point :)



[jira] [Commented] (NIFI-2562) PutHDFS writes corrupted data in the transparent disk encryption zone

2016-08-19 Thread Vik (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2562?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15428313#comment-15428313
 ] 

Vik commented on NIFI-2562:
---

I tried 2.6.0-cdh5.8.0. It threw the same error with exactly the same 
corrupted content. 



[jira] [Commented] (NIFI-2562) PutHDFS writes corrupted data in the transparent disk encryption zone

2016-08-19 Thread Joseph Witt (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2562?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15428242#comment-15428242
 ] 

Joseph Witt commented on NIFI-2562:
---

Could you try 2.6.0-cdh5.8.0? 



[jira] [Commented] (NIFI-2562) PutHDFS writes corrupted data in the transparent disk encryption zone

2016-08-19 Thread Vik (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2562?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15428229#comment-15428229
 ] 

Vik commented on NIFI-2562:
---

Yes, we are observing it at the end of the message. We don't have any other 
HDFS client versions. In our case it works fine for non-TDE scenarios and 
fails only for TDE scenarios. 



[jira] [Commented] (NIFI-2562) PutHDFS writes corrupted data in the transparent disk encryption zone

2016-08-19 Thread Joseph Witt (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2562?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15428221#comment-15428221
 ] 

Joseph Witt commented on NIFI-2562:
---

OK, thanks for the screenshots. Is the corruption you're observing 
consistently at the end of the message?

Are there other HDFS client versions for you to try?

The NiFi portion of interacting with the client is quite straightforward, and 
of course we've not seen this happen in non-TDE scenarios, so it seems 
unlikely, at least for now, that it is a NiFi client implementation issue.



[jira] [Commented] (NIFI-2562) PutHDFS writes corrupted data in the transparent disk encryption zone

2016-08-19 Thread Vik (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2562?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15428209#comment-15428209
 ] 

Vik commented on NIFI-2562:
---

Here are the screenshots you asked for. Browsing through the data-provenance 
tab, we confirmed that within NiFi, up to and including PutHDFS, the data can 
be viewed without any corruption (the first image is proof of that). The 
second image shows the data in HDFS, which is corrupted. So we can infer that 
something goes wrong after the data flows through PutHDFS and before it is 
ingested into HDFS, and we can't figure out what it is. At least, not yet.


!NiFi-PutHDFS.jpg|thumbnail, width=800px!

!HdfsCorrupted.jpg|thumbnail, width=800px!



[jira] [Commented] (NIFI-2562) PutHDFS writes corrupted data in the transparent disk encryption zone

2016-08-18 Thread Joseph Witt (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2562?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15427137#comment-15427137
 ] 

Joseph Witt commented on NIFI-2562:
---

I am a bit skeptical of corruption doing that. I am wondering if there is an 
issue elsewhere. Can you use NiFi's click-to-content feature from the 
provenance data to show the actual content leaving PutHDFS in NiFi's content 
viewer in the UI? You can take a screenshot and share it. There is a hex view, 
and seeing its output could be helpful.

Thanks



[jira] [Commented] (NIFI-2562) PutHDFS writes corrupted data in the transparent disk encryption zone

2016-08-18 Thread Rajeswari (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2562?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15427073#comment-15427073
 ] 

Rajeswari commented on NIFI-2562:
-

The Java code I wrote uses the exact same libraries as NiFi. If you refer to 
my Fix 1 for Problem 1, I didn't build the nifi-hadoop-nar; I just changed my 
configuration to resolve the UnknownHostException. After resolving the 
exception, NiFi wrote files fine, but the file content was corrupted.

Example: actual file content in source = "play framework is built in scala"
content in HDFS = "play framework is built in splay"
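
A small read-back check of the kind one could run to confirm where the stored 
bytes first diverge from the source (a sketch only; the target path is a 
placeholder):

{code:java}
import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReadBackCheck {
    public static void main(String[] args) throws Exception {
        // Uses whatever core-site.xml/hdfs-site.xml are on the classpath.
        FileSystem fs = FileSystem.get(new Configuration());
        byte[] expected = "play framework is built in scala"
                .getBytes(StandardCharsets.UTF_8);

        // Positioned read of exactly the expected number of bytes.
        byte[] actual = new byte[expected.length];
        try (FSDataInputStream in = fs.open(new Path("/enc-zone/sample.txt"))) {
            in.readFully(0, actual);
        }

        if (Arrays.equals(expected, actual)) {
            System.out.println("contents match");
            return;
        }
        for (int i = 0; i < expected.length; i++) {
            if (expected[i] != actual[i]) {
                System.out.println("first mismatch at byte offset " + i);
                break;
            }
        }
    }
}
{code}

In the reported case ("scala" stored as "splay"), such a check would flag the 
tail of the message, matching what the screenshots show.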



[jira] [Commented] (NIFI-2562) PutHDFS writes corrupted data in the transparent disk encryption zone

2016-08-18 Thread Joseph Witt (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2562?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15427062#comment-15427062
 ] 

Joseph Witt commented on NIFI-2562:
---

So my initial thought is that if NiFi, using the Hadoop client, believes the 
data was properly written, then there may be very little we can do. Is there a 
newer Hadoop client you can use or have tried?

You mentioned trying a custom writer. Can you compare its use of the client 
libraries to NiFi's?



[jira] [Commented] (NIFI-2562) PutHDFS writes corrupted data in the transparent disk encryption zone

2016-08-18 Thread Rajeswari (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2562?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15427061#comment-15427061
 ] 

Rajeswari commented on NIFI-2562:
-

Joseph, we understand there is only so much you can do being open source, and 
we really appreciate your time on this. We had been using NiFi successfully 
for over six months until we ran into this issue. We have NiFi 0.6.0 and 
Hadoop 2.6.0-cdh5.7.0, but NiFi 0.6.0 uses Hadoop 2.6.2, so our Hadoop 
configuration was not working, as described in Problem 1 above. To resolve 
that, we changed the NiFi 0.6.0 pom.xml to use Hadoop 2.6.0-cdh5.7.0 instead 
of 2.6.2, built the NiFi bundle, and moved only the nifi-hadoop-nar bundles to 
the respective directory. The nifi-hadoop-nar-1.0.0-SNAPSHOT.nar version 
string comes from that pom.xml, but to sum up, our NiFi is 0.6.0.



[jira] [Commented] (NIFI-2562) PutHDFS writes corrupted data in the transparent disk encryption zone

2016-08-18 Thread Joseph Witt (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2562?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15427015#comment-15427015
 ] 

Joseph Witt commented on NIFI-2562:
---

Hello [~arajeswari]. I don't believe you can expect critical-blocker 
production support from an open source community. However, we're of course a 
helpful bunch and would like to see people succeed.

You list the affected version as 0.6.0, but your build results above show 
you're building the Hadoop nars and such on the NiFi 1.x line. Can you 
confirm? 



[jira] [Commented] (NIFI-2562) PutHDFS writes corrupted data in the transparent disk encryption zone

2016-08-18 Thread Rajeswari (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2562?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15427002#comment-15427002
 ] 

Rajeswari commented on NIFI-2562:
-

Can someone provide an update on this issue? It is a critical blocker for us 
and impedes progress in our production environment.
