[jira] [Commented] (FLINK-9029) Getting write permission from HDFS after updating flink-1.4.0 to flink-1.4.2

2018-04-25 Thread Tzu-Li (Gordon) Tai (JIRA)

[ https://issues.apache.org/jira/browse/FLINK-9029?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16451818#comment-16451818 ]

Tzu-Li (Gordon) Tai commented on FLINK-9029:


[~abareghi] do you have any updates on this?

> Getting write permission from HDFS after updating flink-1.4.0 to flink-1.4.2
> ---
>
> Key: FLINK-9029
> URL: https://issues.apache.org/jira/browse/FLINK-9029
> Project: Flink
>  Issue Type: Bug
>Affects Versions: 1.4.1, 1.4.2
> Environment: * Flink-1.4.2 (Flink-1.4.1)
>  * Hadoop 2.6.0-cdh5.13.0 with 4 nodes in service and {{Security is off.}}
>  * Ubuntu 16.04.3 LTS
>  * Java 8
>Reporter: Mohammad Abareghi
>Priority: Major
>
> *Environment*
>  * Flink-1.4.2
>  * Hadoop 2.6.0-cdh5.13.0 with 4 nodes in service and {{Security is off.}}
>  * Ubuntu 16.04.3 LTS
>  * Java 8
>  
> *Description*
> I have a Java job in flink-1.4.0 which writes to a specific path on HDFS. 
> After updating to flink-1.4.2 I'm getting the following error from Hadoop, 
> complaining that the user doesn't have write permission to the given path:
> {code:java}
> WARN org.apache.hadoop.security.UserGroupInformation: 
> PriviledgedActionException as:xng (auth:SIMPLE) 
> cause:org.apache.hadoop.security.AccessControlException: Permission denied: 
> user=user1, access=WRITE, inode="/user":hdfs:hadoop:drwxr-xr-x
> {code}
> *NOTE*:
>  * If I run the same job on flink-1.4.0, the error disappears regardless of 
> which version of Flink (1.4.0 or 1.4.2) the job's dependencies use.
>  * Also, if I run the job's main method from my IDE and pass the same 
> parameters, I don't get the above error.
> *NOTE*:
> The problem seems to be in 
> {{flink-1.4.2/lib/flink-shaded-hadoop2-uber-1.4.2.jar}}. If I replace it 
> with {{flink-1.4.0/lib/flink-shaded-hadoop2-uber-1.4.0.jar}}, restart the 
> cluster, and run my job (Flink topology), the error doesn't appear.
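[Editor's illustration, not part of the original report] The mode string in the log already explains the denial: {{/user}} is owned by {{hdfs:hadoop}} with {{drwxr-xr-x}}, so only the owner has the write bit, and {{user1}} falls into "other". A minimal sketch of how such ls-style permission bits are evaluated (simplified: no ACLs, no HDFS superuser; the function name is hypothetical):

```python
def can_write(mode: str, user: str, groups: list, owner: str, group: str) -> bool:
    """Decode an ls-style mode string such as 'drwxr-xr-x'.

    Chars 1-3 are the owner bits, 4-6 the group bits, 7-9 the
    "other" bits. This mirrors (in simplified form) how HDFS checks
    a single path component.
    """
    if user == owner:
        return mode[2] == "w"   # owner write bit
    if group in groups:
        return mode[5] == "w"   # group write bit
    return mode[8] == "w"       # other write bit

# /user is hdfs:hadoop drwxr-xr-x, so user1 cannot write there:
print(can_write("drwxr-xr-x", "user1", [], "hdfs", "hadoop"))  # False
print(can_write("drwxr-xr-x", "hdfs", [], "hdfs", "hadoop"))   # True
```

The question in the thread is therefore not what the error means, but which effective user the 1.4.2 shaded jar makes the job run as.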



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-9029) Getting write permission from HDFS after updating flink-1.4.0 to flink-1.4.2

2018-03-21 Thread Mohammad Abareghi (JIRA)

[ https://issues.apache.org/jira/browse/FLINK-9029?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16407582#comment-16407582 ]

Mohammad Abareghi commented on FLINK-9029:
--

[~StephanEwen] Yes. Security is OFF. 

I'll try to remove the Hadoop uber jar ASAP (hopefully later today).

Will drop a comment here. 



[jira] [Commented] (FLINK-9029) Getting write permission from HDFS after updating flink-1.4.0 to flink-1.4.2

2018-03-20 Thread Stephan Ewen (JIRA)

[ https://issues.apache.org/jira/browse/FLINK-9029?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16406901#comment-16406901 ]

Stephan Ewen commented on FLINK-9029:
-

Hmm, interesting. And security is OFF as you mentioned?

Can you check whether it works if you completely remove the Hadoop uber jar from 
the lib folder and instead export the Hadoop classpath, as described in 
https://ci.apache.org/projects/flink/flink-docs-master/ops/deployment/hadoop.html
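[Editor's note] For reference, the classpath-based setup suggested above amounts to a couple of commands before starting the cluster. A hedged sketch, assuming a CDH-style install where the {{hadoop}} binary is on the PATH and Flink is started from its install directory:

```shell
# Remove the shaded Hadoop jar so Flink uses the cluster's own Hadoop classes
# (jar name as shipped with the 1.4.2 binary distribution):
rm lib/flink-shaded-hadoop2-uber-1.4.2.jar

# Put the cluster's Hadoop jars on Flink's classpath instead:
export HADOOP_CLASSPATH=$(hadoop classpath)

# Restart the cluster so the new classpath takes effect:
./bin/start-cluster.sh
```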

There was one fix to the Kerberos functionality between 1.4.0 and 1.4.2: 
https://github.com/apache/flink/commit/20146652baff31c3824f7ae31506b8c481cdf56c

The puzzling thing is that this particular commit does not modify the Hadoop 
Uber Jar.

[~tzulitai] Do you have any thoughts why that could happen?
