Since we are not running a 32-bit JVM, the reason must be that the OS's virtual memory (or its limit on native threads) has been exhausted.
How can I check for this, and remedy it?

On Friday, July 5, 2019 at 03:17:38 UTC+2, Jan Monterrubio wrote:
> Correct me if I'm wrong, but I don't think increasing the heap size will
> actually affect your ability to create more native threads.
>
> See this for a possible explanation:
> https://plumbr.io/outofmemoryerror/unable-to-create-new-native-thread
>
> On Thu, Jul 4, 2019 at 16:03 Baptiste Mathus <[email protected]> wrote:
>
>> Did you enable GC logging to get a better understanding of the profile
>> of your memory consumption? If not, I would recommend you do that first
>> and analyze the logs.
>> https://jenkins.io/blog/2016/11/21/gc-tuning/ explains this part (and
>> much more) quite well.
>>
>> Then, once you understand better when it crashes, you will probably want
>> to analyze a heap dump to see what is causing the problem.
>>
>> Cheers
>>
>> On Tue, Jul 2, 2019 at 15:30, Sverre Moe <[email protected]> wrote:
>>
>>> Today has been chaotic.
>>> Several build agents disconnected:
>>>
>>> Unexpected termination of the channel
>>>
>>> Many builds failed because of the memory error.
>>>
>>> I have tried restarting Jenkins several times today.
>>>
>>> Does anyone have any suggestions?
>>>
>>> On Tuesday, July 2, 2019 at 14:34:25 UTC+2, Sverre Moe wrote:
>>>>
>>>> We have assigned 8 GB of memory to our Jenkins instance:
>>>> JAVA_OPTIONS=-Xmx8g
>>>>
>>>> Still, we experience memory issues after running for a while:
>>>> java.lang.OutOfMemoryError: unable to create new native thread
>>>>
>>>> We have:
>>>> approx. 40 connected build agents
>>>> approx. 400 pipeline jobs
>>>>
>>>> We have a test Jenkins instance running the same jobs; it connects to
>>>> the same build agents (though with a different home directory).
>>>>
>>>> Lately we have been getting disconnected build agents that we cannot
>>>> bring up again without restarting Jenkins.
>>>>
>>>> Can we assign more memory to a build agent? Would it have any effect
>>>> on this issue?
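For the question at the top — "How can I check for this?" — one way is to compare the JVM's current thread count against the limits the OS actually gave it. A rough Linux-only sketch follows; `JENKINS_PID` is a placeholder (point it at the real Jenkins master process, e.g. via `pgrep`), and here it falls back to the current shell's own PID just so the commands run as-is:

```shell
# Sketch: check native-thread headroom on Linux.
# JENKINS_PID is a placeholder -- substitute the real Jenkins PID;
# the fallback to $$ (this shell) only keeps the sketch runnable.
JENKINS_PID="${JENKINS_PID:-$$}"

# System-wide ceiling on threads:
cat /proc/sys/kernel/threads-max

# Per-process thread/process limit the JVM actually got
# (the same number `ulimit -u` reports for an interactive shell):
grep 'Max processes' "/proc/${JENKINS_PID}/limits"

# Threads the process is using right now:
ls "/proc/${JENKINS_PID}/task" | wc -l

# Address-space (virtual memory) limit, if any:
grep 'Max address space' "/proc/${JENKINS_PID}/limits"
```

If the live thread count is anywhere near "Max processes" or `threads-max`, raising the limit (for a systemd-managed Jenkins, `LimitNPROC=` in the unit file) or shrinking the per-thread stack with `-Xss` is a more plausible remedy than a bigger heap; if anything, `-Xmx8g` leaves *less* address space for thread stacks.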
>>>>
>>>> This we got from one of our latest Pipeline builds, which failed on a
>>>> sh("find **** -exec ***") step. It failed on the build agent that is
>>>> now disconnected.
>>>>
>>>> java.lang.OutOfMemoryError: unable to create new native thread
>>>>     at java.lang.Thread.start0(Native Method)
>>>>     at java.lang.Thread.start(Thread.java:714)
>>>>     at java.util.concurrent.ThreadPoolExecutor.addWorker(ThreadPoolExecutor.java:950)
>>>>     at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1368)
>>>>     at java.lang.UNIXProcess.initStreams(UNIXProcess.java:288)
>>>>     at java.lang.UNIXProcess.lambda$new$2(UNIXProcess.java:258)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at java.lang.UNIXProcess.<init>(UNIXProcess.java:257)
>>>>     at java.lang.ProcessImpl.start(ProcessImpl.java:134)
>>>>     at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
>>>>     at hudson.Proc$LocalProc.<init>(Proc.java:249)
>>>> Also: java.io.IOException: error=11, Resource temporarily unavailable
>>>>
>>>> SEVERE: Unexpected error when retrieving changeset
>>>> hudson.plugins.git.GitException: Error: git whatchanged --no-abbrev -M
>>>> "--format=commit %H%ntree %T%nparent %P%nauthor %aN <%aE> %ai%ncommitter
>>>> %cN <%cE> %ci%n%n%w(76,4,4)%s%n%n%b" -n 1
>>>> b2c871830a03ea5f2fd2b21245afb09d51d69686 in
>>>> /home/build/jenkins/workspace/project_user_work
>>>>     at org.jenkinsci.plugins.gitclient.CliGitAPIImpl$6.execute(CliGitAPIImpl.java:1012)
>>>>     at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler$1.call(RemoteGitImpl.java:153)
>>>>     at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler$1.call(RemoteGitImpl.java:146)
>>>>     at hudson.remoting.UserRequest.perform(UserRequest.java:212)
>>>>     at hudson.remoting.UserRequest.perform(UserRequest.java:54)
>>>>     at hudson.remoting.Request$2.run(Request.java:369)
>>>>     at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
>>>>     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>>>>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>>>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>>>>     at java.lang.Thread.run(Thread.java:748)
>>>>     Suppressed: hudson.remoting.Channel$CallSiteStackTrace: Remote call to master-sles12.3-x86_64_3
>>>>         at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1741)
>>>>         at hudson.remoting.UserRequest$ExceptionResponse.retrieve(UserRequest.java:357)
>>>>         at hudson.remoting.Channel.call(Channel.java:955)
>>>>         at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler.execute(RemoteGitImpl.java:146)
>>>>         at sun.reflect.GeneratedMethodAccessor678.invoke(Unknown Source)
>>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>         at java.lang.reflect.Method.invoke(Method.java:498)
>>>>         at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler.invoke(RemoteGitImpl.java:132)
>>>>         at com.sun.proxy.$Proxy104.execute(Unknown Source)
>>>>         at io.jenkins.blueocean.autofavorite.FavoritingScmListener.getChangeSet(FavoritingScmListener.java:159)
>>>>         at io.jenkins.blueocean.autofavorite.FavoritingScmListener.onCheckout(FavoritingScmListener.java:84)
>>>>         at org.jenkinsci.plugins.workflow.steps.scm.SCMStep.checkout(SCMStep.java:140)
>>>>         at org.jenkinsci.plugins.workflow.steps.scm.SCMStep$StepExecutionImpl.run(SCMStep.java:93)
>>>>         at org.jenkinsci.plugins.workflow.steps.scm.SCMStep$StepExecutionImpl.run(SCMStep.java:80)
>>>>         at org.jenkinsci.plugins.workflow.steps.SynchronousNonBlockingStepExecution.lambda$start$0(SynchronousNonBlockingStepExecution.java:47)
>>>>
>>>> Jul 01, 2019 11:51:12 AM hudson.init.impl.InstallUncaughtExceptionHandler$DefaultUncaughtExceptionHandler uncaughtException
>>>> SEVERE: A thread (Timer-9692/111139) died unexpectedly due to an uncaught
>>>> exception; this may leave your Jenkins in a bad way and is usually
>>>> indicative of a bug in the code.
>>>> java.lang.OutOfMemoryError: unable to create new native thread
>>>>     at java.lang.Thread.start0(Native Method)
>>>>     at java.lang.Thread.start(Thread.java:714)
>>>>     at java.util.Timer.<init>(Timer.java:160)
>>>>     at java.util.Timer.<init>(Timer.java:132)
>>>>     at org.jenkinsci.plugins.ssegateway.sse.EventDispatcher.scheduleRetryQueueProcessing(EventDispatcher.java:296)
>>>>     at org.jenkinsci.plugins.ssegateway.sse.EventDispatcher.processRetries(EventDispatcher.java:437)
>>>>     at org.jenkinsci.plugins.ssegateway.sse.EventDispatcher$1.run(EventDispatcher.java:299)
>>>>     at java.util.TimerThread.mainLoop(Timer.java:555)
>>>>     at java.util.TimerThread.run(Timer.java:505)
>>>>
>>>> INFO: Ping failed. Terminating the channel master-sles12.3-x86_64_3.
>>>> java.util.concurrent.TimeoutException: Ping started at 1561982408948
>>>> hasn't completed by 1561982648948
>>>>     at hudson.remoting.PingThread.ping(PingThread.java:134)
>>>>     at hudson.remoting.PingThread.run(PingThread.java:90)
>>>>
>>>> Jul 01, 2019 2:04:11 PM hudson.remoting.SynchronousCommandTransport$ReaderThread run
>>>> INFO: I/O error in channel master-sles12.3-x86_64_3
>>>> java.io.IOException: Unexpected termination of the channel
>>>> WARNING: Failed to monitor master-sles12.3-x86_64_3 for Free Temp Space
>>>>
>>>> Jul 01, 2019 2:04:11 PM hudson.node_monitors.AbstractAsyncNodeMonitorDescriptor monitorDetailed
>>>> WARNING: Failed to monitor master-sles12.3-x86_64_3 for Free Swap Space
>>>>
>>>> The latest problem we got. It did not take down the build node.
>>>> On all occasions of this problem, it happens while the Pipeline is
>>>> doing some I/O on the Jenkins master. Here we manually restart the
>>>> build, and it builds fine.
>>>>
>>>> Running on Jenkins <https://build-ci.spacetec.no:8443/computer/(master)/>
>>>> in /var/lib/jenkins/workspace/project_master
>>>> [Pipeline] {
>>>> [Pipeline] parallel
>>>> [Pipeline] { (Branch: Setup)
>>>> [Pipeline] End of Pipeline
>>>> java.lang.OutOfMemoryError: unable to create new native thread
>>>>     at java.lang.Thread.start0(Native Method)
>>>>     at java.lang.Thread.start(Thread.java:714)
>>>>     at java.util.concurrent.ThreadPoolExecutor.addWorker(ThreadPoolExecutor.java:950)
>>>>     at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1366)
>>>>     at com.google.common.eventbus.AsyncEventBus.dispatch(AsyncEventBus.java:90)
>>>>     at com.google.common.eventbus.AsyncEventBus.dispatchQueuedEvents(AsyncEventBus.java:81)
>>>>     at com.google.common.eventbus.EventBus.post(EventBus.java:264)
>>>>     at org.jenkinsci.plugins.pubsub.GuavaPubsubBus$1.publish(GuavaPubsubBus.java:70)
>>>>     at org.jenkinsci.plugins.pubsub.PubsubBus.publish(PubsubBus.java:141)
>>>>     at io.jenkins.blueocean.events.PipelineEventListener.publishEvent(PipelineEventListener.java:196)
>>>>     at io.jenkins.blueocean.events.PipelineEventListener.onNewHead(PipelineEventListener.java:85)
>>>>     at org.jenkinsci.plugins.workflow.cps.CpsFlowExecution.notifyListeners(CpsFlowExecution.java:1463)
>>>>     at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$3.run(CpsThreadGroup.java:458)
>>>>     at org.jenkinsci.plugins.workflow.cps.CpsVmExecutorService$1.run(CpsVmExecutorService.java:35)
>>>>     at hudson.remoting.SingleLaneExecutorService$1.run(SingleLaneExecutorService.java:131)
>>>>     at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
>>>>     at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:59)
>>>>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>>>>     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>>>>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>>>     at java.lang.Thread.run(Thread.java:745)
>>>> Finished: FAILURE
>>>
>>> --
>>> You received this message because you are subscribed to the Google
>>> Groups "Jenkins Users" group.
>>> To unsubscribe from this group and stop receiving emails from it, send
>>> an email to [email protected].
>>> To view this discussion on the web visit
>>> https://groups.google.com/d/msgid/jenkinsci-users/6b1f3729-e456-41a9-a464-c63d061e2912%40googlegroups.com.
>>> For more options, visit https://groups.google.com/d/optout.
--
You received this message because you are subscribed to the Google Groups "Jenkins Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to [email protected].
To view this discussion on the web visit https://groups.google.com/d/msgid/jenkinsci-users/09896d2e-2d44-44ba-86e6-8e10c8f4617d%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
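Regarding the GC-logging suggestion quoted in the thread: for a Java 8-era Jenkins (the version implied by the `sun.reflect`/`UNIXProcess` frames above), flags along these lines would produce logs a GC analyzer can consume. This is only a sketch; the log path and rotation sizes are placeholders, not the poster's actual settings:

```shell
# Example only -- Java 8 style GC-logging flags appended to the existing
# -Xmx8g; the log path, file count, and sizes are placeholders to adapt.
JAVA_OPTIONS="-Xmx8g \
  -Xloggc:/var/log/jenkins/gc-%t.log \
  -XX:+PrintGCDetails \
  -XX:+PrintGCDateStamps \
  -XX:+PrintGCCause \
  -XX:+UseGCLogFileRotation \
  -XX:NumberOfGCLogFiles=5 \
  -XX:GCLogFileSize=20m"
```

Note, though, that GC logs mainly explain heap pressure, and as pointed out in the thread, "unable to create new native thread" is usually a thread-limit or virtual-memory problem rather than a heap problem; capturing a thread dump when it happens (e.g. `jstack <pid>` or `kill -3 <pid>`) and, per the heap-dump suggestion, a dump via `jmap -dump:live,format=b,file=heap.hprof <pid>`, is likely to be more informative.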
