green241 opened a new issue, #2252:
URL: https://github.com/apache/incubator-streampark/issues/2252

   ### Search before asking
   
   - [X] I had searched in the 
[issues](https://github.com/apache/incubator-streampark/issues?q=is%3Aissue+label%3A%22bug%22)
 and found no similar issues.
   
   
   ### What happened
   
   In a pure Kubernetes environment without Hadoop installed, deleting the default Flink application job example fails.
   
   The backend logs are as follows:
   
   ```log
   2023-01-11 17:54:05 | ERROR | XNIO-1 task-6 | org.apache.streampark.console.core.service.impl.ApplicationServiceImpl:402] requirement failed: [StreamPark] FileUtils.getPathFromEnv: HADOOP_HOME is not set on system env
   java.lang.IllegalArgumentException: requirement failed: [StreamPark] FileUtils.getPathFromEnv: HADOOP_HOME is not set on system env
        at scala.Predef$.require(Predef.scala:281)
        at org.apache.streampark.common.util.FileUtils$.getPathFromEnv(FileUtils.scala:155)
        at org.apache.streampark.common.util.HadoopUtils$.hadoopConfDir$lzycompute(HadoopUtils.scala:95)
        at org.apache.streampark.common.util.HadoopUtils$.hadoopConfDir(HadoopUtils.scala:94)
        at org.apache.streampark.common.util.HadoopUtils$.$anonfun$hadoopConf$1(HadoopUtils.scala:166)
        at scala.Option.getOrElse(Option.scala:138)
        at org.apache.streampark.common.util.HadoopUtils$.hadoopConf(HadoopUtils.scala:165)
        at org.apache.streampark.common.util.HdfsUtils$.getDefaultFS(HdfsUtils.scala:32)
        at org.apache.streampark.common.conf.Workspace.WORKSPACE$lzycompute(Workspace.scala:92)
        at org.apache.streampark.common.conf.Workspace.WORKSPACE(Workspace.scala:80)
        at org.apache.streampark.common.conf.Workspace.APP_WORKSPACE$lzycompute(Workspace.scala:118)
        at org.apache.streampark.common.conf.Workspace.APP_WORKSPACE(Workspace.scala:118)
        at org.apache.streampark.console.core.service.impl.ApplicationServiceImpl.removeApp(ApplicationServiceImpl.java:461)
        at org.apache.streampark.console.core.service.impl.ApplicationServiceImpl.delete(ApplicationServiceImpl.java:393)
        at org.apache.streampark.console.core.service.impl.ApplicationServiceImpl$$FastClassBySpringCGLIB$$56060f90.invoke(<generated>)
   ```
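
   As the trace shows, `ApplicationServiceImpl.delete` -> `removeApp` touches `Workspace.APP_WORKSPACE`, which is lazily initialized from the HDFS default FS and therefore needs `HADOOP_HOME`, even though the request is only a delete. The chain boils down to something like the following (a simplified Java sketch for illustration only; the real code lives in `Workspace.scala` and `HadoopUtils.scala`, and the names and values below are placeholders, not the actual API):

   ```java
   // Simplified, illustrative sketch of the lazy workspace resolution seen in the trace.
   // Names and values here are placeholders, not the actual StreamPark implementation.
   import java.util.function.Supplier;

   public class WorkspaceSketch {

       // Accessing the app workspace for the first time triggers the whole chain below,
       // even if the request is only "delete this application".
       static final Supplier<String> APP_WORKSPACE = WorkspaceSketch::resolveAppWorkspace;

       static String resolveAppWorkspace() {
           // The workspace root is derived from the HDFS default FS...
           return getDefaultFs() + "/streampark/workspace";
       }

       static String getDefaultFs() {
           // ...and resolving HDFS needs the Hadoop conf dir, read from HADOOP_HOME.
           // On a pure Kubernetes node this variable is absent, so the delete fails here.
           String hadoopHome = System.getenv("HADOOP_HOME");
           if (hadoopHome == null || hadoopHome.isEmpty()) {
               throw new IllegalArgumentException(
                   "requirement failed: HADOOP_HOME is not set on system env");
           }
           return "hdfs://placeholder-namenode:8020";
       }

       public static void main(String[] args) {
           // Mirrors delete(...) -> removeApp(...) -> APP_WORKSPACE access.
           System.out.println(APP_WORKSPACE.get());
       }
   }
   ```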
   
   
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
   Probable cause: the default Flink application job example is configured with YARN mode, which may check for the Hadoop environment.
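
   A possible direction for a fix (a rough sketch only; the execution-mode check and helper methods below are assumptions for illustration, not the real StreamPark API) is to resolve the HDFS workspace during deletion only when the application actually runs on YARN:

   ```java
   // Hypothetical guard (not the actual StreamPark code): only resolve the HDFS-backed
   // workspace while deleting an application when the job really runs on YARN.
   public class RemoveAppGuardSketch {

       enum ExecutionMode { YARN_APPLICATION, KUBERNETES_APPLICATION, REMOTE }

       static void removeApp(long appId, ExecutionMode mode) {
           if (mode == ExecutionMode.YARN_APPLICATION) {
               // Only YARN jobs have an HDFS workspace; this is the branch that needs HADOOP_HOME.
               String appHome = resolveHdfsWorkspace() + "/" + appId;
               System.out.println("would delete HDFS path: " + appHome);
           }
           // Kubernetes (and the default example) jobs only need local/K8s resources cleaned up.
           System.out.println("cleaning local resources for application " + appId);
       }

       static String resolveHdfsWorkspace() {
           String hadoopHome = System.getenv("HADOOP_HOME");
           if (hadoopHome == null || hadoopHome.isEmpty()) {
               throw new IllegalArgumentException("HADOOP_HOME is not set on system env");
           }
           return "hdfs:///streampark/workspace";
       }

       public static void main(String[] args) {
           // Deleting a Kubernetes-mode job never touches HADOOP_HOME with this guard in place.
           removeApp(100000L, ExecutionMode.KUBERNETES_APPLICATION);
       }
   }
   ```

   With such a guard, deleting a Kubernetes-mode (or the default example) job on a Hadoop-free node would skip the `HADOOP_HOME` lookup entirely.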
   
   ### StreamPark Version
   
   dev 2.0.0
   
   ### Java Version
   
   1.8
   
   ### Flink Version
   
   1.14.3
   
   ### Scala Version of Flink
   
   2.12
   
   ### Error Exception
   
   ```log
   Refer to the "What happened" section above.
   ```
   
   
   ### Screenshots
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://www.apache.org/foundation/policies/conduct)
   

