[jira] [Commented] (FLINK-17386) Exception in HadoopSecurityContextFactory.createContext while no shaded-hadoop-lib provided.

2020-05-17 Thread Rong Rong (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-17386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17109703#comment-17109703
 ] 

Rong Rong commented on FLINK-17386:
---

closed via: 1892bedeea9fa118b6e3bcb572f63c2e7f6d83e3

> Exception in HadoopSecurityContextFactory.createContext while no 
> shaded-hadoop-lib provided.
> 
>
> Key: FLINK-17386
> URL: https://issues.apache.org/jira/browse/FLINK-17386
> Project: Flink
> Issue Type: Bug
> Reporter: Wenlong Lyu
> Assignee: Rong Rong
> Priority: Major
> Labels: pull-request-available
>
> java.io.IOException: Process execution failed due error. Error output:
> java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
> 	at org.apache.flink.runtime.security.contexts.HadoopSecurityContextFactory.createContext(HadoopSecurityContextFactory.java:59)
> 	at org.apache.flink.runtime.security.SecurityUtils.installContext(SecurityUtils.java:92)
> 	at org.apache.flink.runtime.security.SecurityUtils.install(SecurityUtils.java:60)
> 	at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:964)
>
> 	at com.alibaba.flink.vvr.util.AutoClosableProcess$AutoClosableProcessBuilder.runBlocking(AutoClosableProcess.java:144)
> 	at com.alibaba.flink.vvr.util.AutoClosableProcess$AutoClosableProcessBuilder.runBlocking(AutoClosableProcess.java:126)
> 	at com.alibaba.flink.vvr.VVRCompileTest.runSingleJobCompileCheck(VVRCompileTest.java:173)
> 	at com.alibaba.flink.vvr.VVRCompileTest.lambda$runJobsCompileCheck$0(VVRCompileTest.java:101)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1147)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:622)
> 	at java.lang.Thread.run(Thread.java:834)
> I think this is because an exception is thrown in the static initializer block of
> UserGroupInformation; should we catch Throwable instead of Exception in
> HadoopSecurityContextFactory#createContext?
> [~rongr] what do you think?
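
For context on the catch-Throwable suggestion above: {{NoClassDefFoundError}} is an {{Error}}, not an {{Exception}}, so a plain {{catch (Exception e)}} around the context creation lets it escape. A minimal, self-contained sketch of that behavior (illustrative class names only, not the actual Flink code):

{code:java}
// Illustrative only, not Flink code: an Error raised while initializing a class
// is not an Exception, so `catch (Exception e)` does not stop it.
public final class StaticInitFailureDemo {

    static final class NeedsMissingDependency {
        // Simulates UserGroupInformation's static initializer failing because a
        // transitive Hadoop class is absent from the classpath.
        static {
            if (true) {
                throw new NoClassDefFoundError("some/missing/HadoopClass");
            }
        }
    }

    public static void main(String[] args) {
        try {
            new NeedsMissingDependency(); // triggers static initialization
        } catch (Exception e) {
            System.out.println("not reached for this failure: " + e);
        } catch (Throwable t) {
            // NoClassDefFoundError lands here, which is what the description
            // suggests catching in HadoopSecurityContextFactory#createContext.
            System.out.println("caught: " + t);
        }
    }
}
{code}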





[jira] [Commented] (FLINK-17386) Exception in HadoopSecurityContextFactory.createContext while no shaded-hadoop-lib provided.

2020-04-30 Thread Wenlong Lyu (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-17386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17096232#comment-17096232
 ] 

Wenlong Lyu commented on FLINK-17386:
-

[~rongr] I have verified the patch; it works.



[jira] [Commented] (FLINK-17386) Exception in HadoopSecurityContextFactory.createContext while no shaded-hadoop-lib provided.

2020-04-28 Thread Rong Rong (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-17386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17094595#comment-17094595
 ] 

Rong Rong commented on FLINK-17386:
---

Hmm, that's not what I expected. I have 2 theories:

1. In the 1.10 implementation the SecurityContext was overridden using the same logic. However,
the difference is that the classloader used in 1.10 could've been the runtime
classloader ({{SecurityUtils.class.getClassLoader()}}), while in 1.11 the classloader
is actually the service provider classloader
({{HadoopSecurityContextFactory.class.getClassLoader()}}).
  - This may be an issue, but I am not exactly sure.

2. In the 1.10 implementation the context installation catches {{LinkageError}} as well as
{{ClassNotFoundException}}, which is a much broader catch, and {{NoClassDefFoundError}}
actually extends {{LinkageError}}.

I am more convinced that #2 is the root cause; I will try to create a quick
fix for this and attach a PR. If you could help test the patch later that
would be super great!
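
A minimal sketch of the kind of broader catch that theory #2 points at, assuming a probe-style availability check (the class and method names below are illustrative, not the actual 1.10/1.11 code):

{code:java}
// Sketch only: probing for Hadoop's UserGroupInformation the way a security
// context factory might. NoClassDefFoundError extends LinkageError, so catching
// LinkageError alongside ClassNotFoundException keeps a broken or partial
// Hadoop classpath from escaping the check.
public final class HadoopAvailabilityProbe {

    public static boolean hadoopUgiUsable(ClassLoader cl) {
        try {
            // initialize=true runs UserGroupInformation's static initializer,
            // which is where the reported NoClassDefFoundError originates.
            Class.forName("org.apache.hadoop.security.UserGroupInformation", true, cl);
            return true;
        } catch (ClassNotFoundException | LinkageError e) {
            // A plain `catch (Exception e)` would miss the LinkageError case.
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("Hadoop UGI usable: "
                + hadoopUgiUsable(HadoopAvailabilityProbe.class.getClassLoader()));
    }
}
{code}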




[jira] [Commented] (FLINK-17386) Exception in HadoopSecurityContextFactory.createContext while no shaded-hadoop-lib provided.

2020-04-28 Thread Wenlong Lyu (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-17386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17094235#comment-17094235
 ] 

Wenlong Lyu commented on FLINK-17386:
-

[~rongr] I ran a quick test by copying the state backend into the lib directory of 1.10 and
running a Flink command; no error happened.



[jira] [Commented] (FLINK-17386) Exception in HadoopSecurityContextFactory.createContext while no shaded-hadoop-lib provided.

2020-04-28 Thread Wenlong Lyu (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-17386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17094218#comment-17094218
 ] 

Wenlong Lyu commented on FLINK-17386:
-

Yes, you are right about the use case. We are doing some testing based on the
master branch; I can run a test on 1.10 and report back later.



[jira] [Commented] (FLINK-17386) Exception in HadoopSecurityContextFactory.createContext while no shaded-hadoop-lib provided.

2020-04-27 Thread Rong Rong (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-17386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17094152#comment-17094152
 ] 

Rong Rong commented on FLINK-17386:
---

I might have some idea of why this is happening, but just to be sure: is this
issue occurring only on the latest unreleased Flink, or does it also affect
previous versions? (I bet it does, but if you can verify that would be great.)





[jira] [Commented] (FLINK-17386) Exception in HadoopSecurityContextFactory.createContext while no shaded-hadoop-lib provided.

2020-04-27 Thread Rong Rong (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-17386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17094149#comment-17094149
 ] 

Rong Rong commented on FLINK-17386:
---

Hmm, interesting.
Just to understand more clearly: you are actually using some sort of Hadoop
functionality but not the {{flink-shaded-hadoop-*}} module, correct? That causes
Flink to falsely think it should use HadoopSecurityContext when it shouldn't.

However, the exception says:

{code:java}
java.io.IOException: Process execution failed due error. Error output:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
	at org.apache.flink.runtime.security.contexts.HadoopSecurityContextFactory.createContext(HadoopSecurityContextFactory.java:59)
...
{code}

So this means you are somehow including Hadoop but not hadoop-common?



[jira] [Commented] (FLINK-17386) Exception in HadoopSecurityContextFactory.createContext while no shaded-hadoop-lib provided.

2020-04-27 Thread Wenlong Lyu (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-17386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17094097#comment-17094097
 ] 

Wenlong Lyu commented on FLINK-17386:
-

Hi [~rongr], I don't want shaded-hadoop-lib in the lib directory, because I don't want
to submit jobs to YARN. I did more investigation on this issue and found that it is
because there is a customized state backend in the lib directory which is designed to
support writing to HDFS and contains some of the Hadoop libs, but not all. The root
cause is that the security loader only checks whether Configuration and
UserGroupInformation are on the classpath, which can cause the exception
above when not enough libs are in the lib dir.
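
To illustrate the gap described above, a small sketch (assuming only a partial set of Hadoop classes on the classpath; this is not the Flink loader itself): a presence check can pass while initialization still fails.

{code:java}
// Sketch only: with a partial Hadoop jar, the .class file for
// UserGroupInformation resolves, but initializing it still fails because its
// own dependencies are missing.
public final class PartialClasspathCheck {

    public static void main(String[] args) throws Exception {
        ClassLoader cl = PartialClasspathCheck.class.getClassLoader();
        String ugi = "org.apache.hadoop.security.UserGroupInformation";

        // Presence check only: initialize=false skips the static initializer,
        // so this succeeds as long as the class file is on the classpath.
        Class.forName(ugi, false, cl);
        System.out.println("class file found");

        // Forcing initialization is what actually surfaces the problem.
        try {
            Class.forName(ugi, true, cl);
            System.out.println("class initialized fine");
        } catch (Throwable t) {
            // With missing transitive Hadoop classes this is typically a
            // NoClassDefFoundError or ExceptionInInitializerError, i.e. an
            // Error rather than an Exception.
            System.out.println("initialization failed: " + t);
        }
    }
}
{code}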



[jira] [Commented] (FLINK-17386) Exception in HadoopSecurityContextFactory.createContext while no shaded-hadoop-lib provided.

2020-04-26 Thread Rong Rong (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-17386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17092732#comment-17092732
 ] 

Rong Rong commented on FLINK-17386:
---

Could you provide more information on how to reproduce this exception?

From what I understand, you do want to use shaded-hadoop-lib but somehow forgot
to include it in the classpath, and your intention is to allow a graceful shutdown
instead of a hard process exit. Correct?




--
This message was sent by Atlassian Jira
(v8.3.4#803005)