laoniu603 opened a new issue, #4260:
URL: https://github.com/apache/streampark/issues/4260

   ### Search before asking
   
   - [x] I have searched in the [issues](https://github.com/apache/streampark/issues?q=is%3Aissue+label%3A%22bug%22) and found no similar issues.
   
   
   ### Java Version
   
   1.8
   
   ### Scala Version
   
   2.12.x
   
   ### StreamPark Version
   
   2.1.5
   
   ### Flink Version
   
   1.16
   
   ### Deploy mode
   
   kubernetes-application
   
   ### What happened
   
   When running a job with Flink on Kubernetes, the JobManager/TaskManager pods fail to start.
   
   ### Error Exception
   
   ```log
   $ kubectl -n streampark logs demok8s-7576558ccf-r78rz
   sed: couldn't open temporary file /opt/flink/conf/sedf76Cwj: Read-only file system
   sed: couldn't open temporary file /opt/flink/conf/sedOdIhP0: Read-only file system
   /docker-entrypoint.sh: line 73: /opt/flink/conf/flink-conf.yaml: Read-only file system
   /docker-entrypoint.sh: line 89: /opt/flink/conf/flink-conf.yaml.tmp: Read-only file system
   tee: /var/log/flink/taskmanager.out: No such file or directory
   tee: /var/log/flink/taskmanager.err: No such file or directory
   Failed to auto configure default logger context
   Reported exception:
   org.apache.streampark.shaded.ch.qos.logback.core.joran.spi.JoranException: Parser configuration error occurred
        at org.apache.streampark.shaded.ch.qos.logback.core.joran.event.SaxEventRecorder.buildSaxParser(SaxEventRecorder.java:89)
        at org.apache.streampark.shaded.ch.qos.logback.core.joran.event.SaxEventRecorder.recordEvents(SaxEventRecorder.java:57)
        at org.apache.streampark.shaded.ch.qos.logback.core.joran.GenericConfigurator.doConfigure(GenericConfigurator.java:151)
        at org.apache.streampark.shaded.ch.qos.logback.core.joran.GenericConfigurator.doConfigure(GenericConfigurator.java:104)
        at org.apache.streampark.common.util.LoggerFactory$ContextInitializer.configureByResource(Logger.scala:149)
        at org.apache.streampark.shaded.ch.qos.logback.classic.util.ContextInitializer.autoConfig(ContextInitializer.java:134)
        at org.apache.streampark.common.util.LoggerFactory$.$anonfun$contextSelectorBinder$1(Logger.scala:101)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
        at scala.util.Try$.apply(Try.scala:209)
        at org.apache.streampark.common.util.LoggerFactory$.contextSelectorBinder$lzycompute(Logger.scala:101)
        at org.apache.streampark.common.util.LoggerFactory$.contextSelectorBinder(Logger.scala:98)
        at org.apache.streampark.common.util.LoggerFactory$.getLoggerFactory(Logger.scala:122)
        at org.apache.streampark.common.util.Logger.logger(Logger.scala:45)
        at org.apache.streampark.common.util.Logger.logger$(Logger.scala:43)
        at org.apache.streampark.flink.core.FlinkStreamingInitializer.logger(FlinkStreamingInitializer.scala:52)
        at org.apache.streampark.common.util.Logger.logWarn(Logger.scala:75)
        at org.apache.streampark.common.util.Logger.logWarn$(Logger.scala:74)
        at org.apache.streampark.flink.core.FlinkStreamingInitializer.logWarn(FlinkStreamingInitializer.scala:52)
        at org.apache.streampark.flink.core.FlinkStreamingInitializer.parseConfig(FlinkStreamingInitializer.scala:105)
        at org.apache.streampark.flink.core.FlinkTableInitializer.initParameter(FlinkTableInitializer.scala:177)
        at org.apache.streampark.flink.core.FlinkStreamingInitializer.configuration$lzycompute(FlinkStreamingInitializer.scala:80)
        at org.apache.streampark.flink.core.FlinkStreamingInitializer.configuration(FlinkStreamingInitializer.scala:80)
        at org.apache.streampark.flink.core.FlinkStreamingInitializer.<init>(FlinkStreamingInitializer.scala:63)
        at org.apache.streampark.flink.core.FlinkTableInitializer.<init>(FlinkTableInitializer.scala:80)
        at org.apache.streampark.flink.core.FlinkTableInitializer$.initialize(FlinkTableInitializer.scala:57)
        at org.apache.streampark.flink.core.scala.FlinkStreamTable.init(FlinkStreamTable.scala:45)
        at org.apache.streampark.flink.core.scala.FlinkStreamTable.main(FlinkStreamTable.scala:49)
        at org.apache.streampark.flink.core.scala.FlinkStreamTable.main$(FlinkStreamTable.scala:48)
        at org.apache.streampark.flink.cli.SqlClient$StreamSqlApp$.main(SqlClient.scala:89)
        at org.apache.streampark.flink.cli.SqlClient$.delayedEndpoint$org$apache$streampark$flink$cli$SqlClient$1(SqlClient.scala:78)
        at org.apache.streampark.flink.cli.SqlClient$delayedInit$body.apply(SqlClient.scala:34)
        at scala.Function0.apply$mcV$sp(Function0.scala:34)
        at scala.Function0.apply$mcV$sp$(Function0.scala:34)
        at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
        at scala.App.$anonfun$main$1$adapted(App.scala:76)
        at scala.collection.immutable.List.foreach(List.scala:388)
        at scala.App.main(App.scala:76)
        at scala.App.main$(App.scala:74)
        at org.apache.streampark.flink.cli.SqlClient$.main(SqlClient.scala:34)
        at org.apache.streampark.flink.cli.SqlClient.main(SqlClient.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355)
        at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222)
        at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:98)
        at org.apache.flink.client.deployment.application.ApplicationDispatcherBootstrap.runApplicationEntryPoint(ApplicationDispatcherBootstrap.java:301)
        at org.apache.flink.client.deployment.application.ApplicationDispatcherBootstrap.lambda$runApplicationAsync$2(ApplicationDispatcherBootstrap.java:254)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at org.apache.flink.runtime.concurrent.akka.ActorSystemScheduledExecutorAdapter$ScheduledFutureTask.run(ActorSystemScheduledExecutorAdapter.java:171)
        at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
        at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.lambda$withContextClassLoader$0(ClassLoadingUtils.java:41)
        at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:49)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:48)
        at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
        at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
        at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
        at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
   Caused by: javax.xml.parsers.ParserConfigurationException: SAX feature 'http://xml.org/sax/features/external-general-entities' not supported.
        at oracle.xml.jaxp.JXSAXParserFactory.setFeature(JXSAXParserFactory.java:272)
        at org.apache.streampark.shaded.ch.qos.logback.core.joran.event.SaxEventRecorder.buildSaxParser(SaxEventRecorder.java:82)
        ... 59 more
   ```
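   Editor's note, not part of the original report: the `Read-only file system` errors indicate that `/opt/flink/conf` is backed by a read-only mount (ConfigMap volumes are always mounted read-only in Kubernetes), so the image's `docker-entrypoint.sh` cannot rewrite `flink-conf.yaml` in place. A possible workaround, sketched here under the assumption that the job uses a Flink pod template (container and volume names other than `flink-main-container` and `flink-config-volume` are illustrative), is to copy the config into a writable `emptyDir` from an init container and mount that over the conf directory:

   ```yaml
   # Hedged sketch of a workaround pod template, not a verified fix.
   # Assumes Flink mounts its ConfigMap as "flink-config-volume" and runs the
   # main process in "flink-main-container"; adjust names to your deployment.
   apiVersion: v1
   kind: Pod
   metadata:
     name: pod-template
   spec:
     initContainers:
       - name: copy-flink-conf          # illustrative name
         image: busybox:1.36
         # Copy the read-only ConfigMap contents into a writable emptyDir.
         command: ["sh", "-c", "cp /flink-conf-ro/* /flink-conf-rw/"]
         volumeMounts:
           - name: flink-config-volume
             mountPath: /flink-conf-ro
           - name: writable-conf
             mountPath: /flink-conf-rw
     containers:
       - name: flink-main-container
         volumeMounts:
           - name: writable-conf
             mountPath: /opt/flink/conf  # shadows the read-only ConfigMap mount
     volumes:
       - name: writable-conf            # illustrative name
         emptyDir: {}
   ```

   The same approach (a writable `emptyDir` mounted at `/var/log/flink`) may also address the `tee: ... No such file or directory` errors in the log above.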
   
   ### Screenshots
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes, I am willing to submit a PR! (Are you willing to contribute this PR?)
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
