[ 
https://issues.apache.org/jira/browse/FLINK-17386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17094149#comment-17094149
 ] 

Rong Rong commented on FLINK-17386:
-----------------------------------

hmm. interesting. 
Just so I understand correctly: you are actually using some Hadoop 
functionality, but not the {{flink-shaded-hadoop-*}} module, correct? That causes 
Flink to falsely think it should use HadoopSecurityContext when it shouldn't. 

However, the exception says:

{code:java}
java.io.IOException: Process execution failed due error. Error output:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
	at org.apache.flink.runtime.security.contexts.HadoopSecurityContextFactory.createContext(HadoopSecurityContextFactory.java:59)
...
{code}

So this means you are somehow including Hadoop classes on the classpath, but not hadoop-common?
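The fix proposed in the issue (catching {{Throwable}} instead of {{Exception}}) matters because {{NoClassDefFoundError}} and {{ExceptionInInitializerError}} extend {{Error}}, not {{Exception}}, so a {{catch (Exception)}} clause never sees them. A minimal sketch, using a hypothetical {{BrokenInit}} class to simulate {{UserGroupInformation}} failing in its static initializer (this is not the actual Flink/Hadoop code):

{code:java}
public class StaticInitFailureDemo {

    // Hypothetical stand-in for a class whose static initializer throws,
    // as UserGroupInformation can when hadoop-common is missing or broken.
    static class BrokenInit {
        static final int VALUE = init();

        private static int init() {
            throw new RuntimeException("simulated static-init failure");
        }
    }

    static String tryCatching() {
        try {
            // First access triggers class initialization, which fails.
            return "value=" + BrokenInit.VALUE;
        } catch (Exception e) {
            // Never reached: the failure surfaces as an Error subtype.
            return "caught Exception: " + e.getClass().getSimpleName();
        } catch (Throwable t) {
            // ExceptionInInitializerError (and NoClassDefFoundError on
            // later accesses) land here, because they extend Error.
            return "caught Throwable: " + t.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        System.out.println(tryCatching());
    }
}
{code}

The first access fails inside the static initializer and surfaces as {{ExceptionInInitializerError}}, which only the {{Throwable}} branch can catch; that is why a factory method guarding optional Hadoop support needs the wider catch.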


> Exception in HadoopSecurityContextFactory.createContext while no 
> shaded-hadoop-lib provided.
> --------------------------------------------------------------------------------------------
>
>                 Key: FLINK-17386
>                 URL: https://issues.apache.org/jira/browse/FLINK-17386
>             Project: Flink
>          Issue Type: Bug
>            Reporter: Wenlong Lyu
>            Priority: Major
>
> {code:java}
> java.io.IOException: Process execution failed due error. Error output:
> java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
> 	at org.apache.flink.runtime.security.contexts.HadoopSecurityContextFactory.createContext(HadoopSecurityContextFactory.java:59)
> 	at org.apache.flink.runtime.security.SecurityUtils.installContext(SecurityUtils.java:92)
> 	at org.apache.flink.runtime.security.SecurityUtils.install(SecurityUtils.java:60)
> 	at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:964)
>
> 	at com.alibaba.flink.vvr.util.AutoClosableProcess$AutoClosableProcessBuilder.runBlocking(AutoClosableProcess.java:144)
> 	at com.alibaba.flink.vvr.util.AutoClosableProcess$AutoClosableProcessBuilder.runBlocking(AutoClosableProcess.java:126)
> 	at com.alibaba.flink.vvr.VVRCompileTest.runSingleJobCompileCheck(VVRCompileTest.java:173)
> 	at com.alibaba.flink.vvr.VVRCompileTest.lambda$runJobsCompileCheck$0(VVRCompileTest.java:101)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1147)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:622)
> 	at java.lang.Thread.run(Thread.java:834)
> {code}
> I think it is because an exception is thrown in the static initializer block of 
> UserGroupInformation; should we catch Throwable instead of Exception in 
> HadoopSecurityContextFactory#createContext?
> [~rongr] what do you think?



--
This message was sent by Atlassian Jira
(v8.3.4#803005)