This looks like a deadlock: could you indicate the exact revision of OpenSIPS you are using? Also, while opensips is running at 100%, could you run `opensipsctl trap` and post the resulting gdb file?

Best regards,
Răzvan

On 10/3/19 7:50 PM, Igor Olhovskiy wrote:
Hi!

Is there any way to find out what is eating 100% CPU on OpenSIPS 2.4.6?
The problem is that it appears only occasionally, I can't reproduce it, and it disappears on a restart of the process.
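As a first step in narrowing this down (not from the thread itself, just a sketch): on Linux you can tell which of the OpenSIPS worker processes is spinning by sampling `utime+stime` from `/proc/<pid>/stat` twice and comparing. The helper names and the PIDs in the example are mine, taken from the `fifo ps` output below.

```python
# Sketch: find which OpenSIPS worker is burning CPU by sampling
# utime+stime (in clock ticks) from /proc/<pid>/stat. Linux-only;
# function names are hypothetical, not part of any OpenSIPS tooling.
import time

def cpu_ticks(stat_line: str) -> int:
    """Sum utime+stime (fields 14 and 15) from a /proc/<pid>/stat line.
    Field 2 (the comm name) is parenthesised and may contain spaces,
    so split after the last closing parenthesis first."""
    rest = stat_line.rsplit(')', 1)[1].split()
    # rest[0] is field 3 (state); utime/stime are overall fields 14/15,
    # i.e. indices 11 and 12 of `rest`.
    return int(rest[11]) + int(rest[12])

def busy_pids(pids, interval=1.0):
    """Return (pid, delta_ticks) pairs sorted by CPU burned over `interval`."""
    def snapshot():
        out = {}
        for pid in pids:
            try:
                with open(f"/proc/{pid}/stat") as f:
                    out[pid] = cpu_ticks(f.read())
            except (FileNotFoundError, ProcessLookupError):
                pass  # process exited between listing and sampling
        return out
    before = snapshot()
    time.sleep(interval)
    after = snapshot()
    deltas = {p: after[p] - before[p] for p in after if p in before}
    return sorted(deltas.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    # Feed it the PIDs reported by "opensipsctl fifo ps", e.g.:
    print(busy_pids([24494, 24495, 24470]))
```

The pegged PIDs this reports can then be cross-referenced against the `fifo ps` listing to see which listener they serve.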

#opensipsctl fifo ps
Process::  ID=0 PID=24470 Type=attendant
Process::  ID=1 PID=24471 Type=MI FIFO
Process::  ID=2 PID=24472 Type=time_keeper
Process::  ID=3 PID=24473 Type=timer
Process::  ID=4 PID=24474 Type=SIP receiver udp:172.31.37.148:5060
Process::  ID=5 PID=24475 Type=SIP receiver udp:172.31.37.148:5060
Process::  ID=6 PID=24476 Type=SIP receiver udp:172.31.37.148:5060
Process::  ID=7 PID=24477 Type=SIP receiver udp:172.31.37.148:5060
Process::  ID=8 PID=24478 Type=SIP receiver udp:172.31.37.148:5060
Process::  ID=9 PID=24479 Type=SIP receiver udp:172.31.37.148:5060
Process::  ID=10 PID=24480 Type=SIP receiver udp:172.31.37.148:5060
Process::  ID=11 PID=24481 Type=SIP receiver udp:172.31.37.148:5060
Process::  ID=12 PID=24482 Type=SIP receiver udp:172.31.37.148:5060
Process::  ID=13 PID=24483 Type=SIP receiver udp:172.31.37.148:5060
Process::  ID=14 PID=24484 Type=SIP receiver udp:172.31.37.148:9094
Process::  ID=15 PID=24485 Type=SIP receiver udp:172.31.37.148:9094
Process::  ID=16 PID=24486 Type=SIP receiver udp:172.31.37.148:9094
Process::  ID=17 PID=24487 Type=SIP receiver udp:172.31.37.148:9094
Process::  ID=18 PID=24488 Type=SIP receiver udp:172.31.37.148:9094
Process::  ID=19 PID=24489 Type=SIP receiver udp:172.31.37.148:9094
Process::  ID=20 PID=24490 Type=SIP receiver udp:172.31.37.148:9094
Process::  ID=21 PID=24491 Type=SIP receiver udp:172.31.37.148:9094
Process::  ID=22 PID=24492 Type=SIP receiver udp:172.31.37.148:9094
Process::  ID=23 PID=24493 Type=SIP receiver udp:172.31.37.148:9094
Process::  ID=24 PID=24494 Type=SIP receiver udp:172.31.38.207:5060
Process::  ID=25 PID=24495 Type=SIP receiver udp:172.31.38.207:5060
Process::  ID=26 PID=24496 Type=SIP receiver udp:172.31.38.207:5060
Process::  ID=27 PID=24497 Type=SIP receiver udp:172.31.38.207:5060
Process::  ID=28 PID=24498 Type=SIP receiver udp:172.31.38.207:5060
Process::  ID=29 PID=24499 Type=SIP receiver udp:172.31.38.207:5060
Process::  ID=30 PID=24500 Type=SIP receiver udp:172.31.38.207:5060
Process::  ID=31 PID=24501 Type=SIP receiver udp:172.31.38.207:5060
Process::  ID=32 PID=24502 Type=SIP receiver udp:172.31.38.207:5060
Process::  ID=33 PID=24503 Type=SIP receiver udp:172.31.38.207:5060
Process::  ID=34 PID=24504 Type=TCP receiver
Process::  ID=35 PID=24505 Type=TCP receiver
Process::  ID=36 PID=24506 Type=TCP receiver
Process::  ID=37 PID=24507 Type=TCP receiver
Process::  ID=38 PID=24508 Type=TCP receiver
Process::  ID=39 PID=24509 Type=TCP receiver
Process::  ID=40 PID=24510 Type=TCP receiver
Process::  ID=41 PID=24511 Type=TCP receiver
Process::  ID=42 PID=24512 Type=Timer handler
Process::  ID=43 PID=24513 Type=TCP main

In the logs:

Oct  3 16:22:37 ip-172-31-37-148 /usr/local/sbin/opensips[24473]: WARNING:core:timer_ticker: timer task <nh-timer> already scheduled for 29546470 ms (now 33307710 ms), it may overlap..

#opensipsctl fifo get_statistics load: core: shmem:
load:load:: 30
load:load1m:: 30
load:load10m:: 30
load:load-all:: 29
load:load1m-all:: 29
load:load10m-all:: 29
load:load-proc-1:: 0
load:load1m-proc-1:: 0
load:load10m-proc-1:: 0
load:load-proc-2:: 0
load:load1m-proc-2:: 0
load:load10m-proc-2:: 0
load:load-proc-3:: 0
load:load1m-proc-3:: 0
load:load10m-proc-3:: 0
load:load-proc-4:: 0
load:load1m-proc-4:: 0
load:load10m-proc-4:: 0
load:load-proc-5:: 0
load:load1m-proc-5:: 0
load:load10m-proc-5:: 0
load:load-proc-6:: 0
load:load1m-proc-6:: 0
load:load10m-proc-6:: 0
load:load-proc-7:: 0
load:load1m-proc-7:: 0
load:load10m-proc-7:: 0
load:load-proc-8:: 0
load:load1m-proc-8:: 0
load:load10m-proc-8:: 0
load:load-proc-9:: 0
load:load1m-proc-9:: 0
load:load10m-proc-9:: 0
load:load-proc-10:: 0
load:load1m-proc-10:: 0
load:load10m-proc-10:: 0
load:load-proc-11:: 0
load:load1m-proc-11:: 0
load:load10m-proc-11:: 0
load:load-proc-12:: 0
load:load1m-proc-12:: 0
load:load10m-proc-12:: 0
load:load-proc-13:: 0
load:load1m-proc-13:: 0
load:load10m-proc-13:: 0
load:load-proc-14:: 0
load:load1m-proc-14:: 0
load:load10m-proc-14:: 0
load:load-proc-15:: 0
load:load1m-proc-15:: 0
load:load10m-proc-15:: 0
load:load-proc-16:: 0
load:load1m-proc-16:: 0
load:load10m-proc-16:: 0
load:load-proc-17:: 0
load:load1m-proc-17:: 0
load:load10m-proc-17:: 0
load:load-proc-18:: 0
load:load1m-proc-18:: 0
load:load10m-proc-18:: 0
load:load-proc-19:: 0
load:load1m-proc-19:: 0
load:load10m-proc-19:: 0
load:load-proc-20:: 0
load:load1m-proc-20:: 0
load:load10m-proc-20:: 0
load:load-proc-21:: 0
load:load1m-proc-21:: 0
load:load10m-proc-21:: 0
load:load-proc-22:: 0
load:load1m-proc-22:: 0
load:load10m-proc-22:: 0
load:load-proc-23:: 0
load:load1m-proc-23:: 0
load:load10m-proc-23:: 0
load:load-proc-24:: 100
load:load1m-proc-24:: 100
load:load10m-proc-24:: 100
load:load-proc-25:: 100
load:load1m-proc-25:: 100
load:load10m-proc-25:: 100
load:load-proc-26:: 100
load:load1m-proc-26:: 100
load:load10m-proc-26:: 100
load:load-proc-27:: 100
load:load1m-proc-27:: 100
load:load10m-proc-27:: 100
load:load-proc-28:: 100
load:load1m-proc-28:: 100
load:load10m-proc-28:: 100
load:load-proc-29:: 100
load:load1m-proc-29:: 100
load:load10m-proc-29:: 100
load:load-proc-30:: 100
load:load1m-proc-30:: 100
load:load10m-proc-30:: 100
load:load-proc-31:: 100
load:load1m-proc-31:: 100
load:load10m-proc-31:: 100
load:load-proc-32:: 100
load:load1m-proc-32:: 100
load:load10m-proc-32:: 100
load:load-proc-33:: 100
load:load1m-proc-33:: 100
load:load10m-proc-33:: 100
load:load-proc-34:: 0
load:load1m-proc-34:: 0
load:load10m-proc-34:: 0
load:load-proc-35:: 0
load:load1m-proc-35:: 0
load:load10m-proc-35:: 0
load:load-proc-36:: 100
load:load1m-proc-36:: 100
load:load10m-proc-36:: 100
load:load-proc-37:: 0
load:load1m-proc-37:: 0
load:load10m-proc-37:: 0
load:load-proc-38:: 0
load:load1m-proc-38:: 0
load:load10m-proc-38:: 0
load:load-proc-39:: 0
load:load1m-proc-39:: 0
load:load10m-proc-39:: 0
load:load-proc-40:: 0
load:load1m-proc-40:: 0
load:load10m-proc-40:: 0
load:load-proc-41:: 0
load:load1m-proc-41:: 0
load:load10m-proc-41:: 0
load:load-proc-42:: 100
load:load1m-proc-42:: 100
load:load10m-proc-42:: 100
load:load-proc-43:: 0
load:load1m-proc-43:: 0
load:load10m-proc-43:: 0
core:rcv_requests:: 50509
core:rcv_replies:: 31039
core:fwd_requests:: 59
core:fwd_replies:: 0
core:drop_requests:: 0
core:drop_replies:: 0
core:err_requests:: 0
core:err_replies:: 0
core:bad_URIs_rcvd:: 0
core:unsupported_methods:: 0
core:bad_msg_hdr:: 0
core:timestamp:: 33581
shmem:total_size:: 33554432
shmem:used_size:: 5929544
shmem:real_used_size:: 6187752
shmem:max_used_size:: 7568672
shmem:free_size:: 27366680
shmem:fragments:: 8137
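The interesting pattern in the dump above is that processes 24-33 (one whole UDP listener group) plus 36 and 42 are pegged at 100% across all three load windows. A small sketch for pulling that out of the `get_statistics load:` output programmatically (the parsing and function names are my own, based only on the line format shown above):

```python
# Sketch: filter "opensipsctl fifo get_statistics load:" output for
# worker processes pegged at `threshold` on all three load windows
# (instant, 1m, 10m). Based on the "load:load10m-proc-N:: V" line
# format pasted above; helper names are hypothetical.
import re
from collections import defaultdict

LINE = re.compile(r"load:load(1m|10m)?-proc-(\d+):: (\d+)")

def pegged_procs(stats_text: str, threshold: int = 100):
    """Return process IDs whose instant, 1m and 10m load all reach threshold."""
    loads = defaultdict(dict)                 # proc id -> {window: value}
    for m in LINE.finditer(stats_text):
        window = m.group(1) or "now"          # missing suffix = instant load
        loads[int(m.group(2))][window] = int(m.group(3))
    return sorted(p for p, w in loads.items()
                  if len(w) == 3 and min(w.values()) >= threshold)

sample = """
load:load-proc-23:: 0
load:load1m-proc-23:: 0
load:load10m-proc-23:: 0
load:load-proc-24:: 100
load:load1m-proc-24:: 100
load:load10m-proc-24:: 100
"""
print(pegged_procs(sample))   # -> [24]
```

Run against the full dump above, this would report processes 24-33, 36 and 42 — which maps exactly onto the 172.31.38.207:5060 receivers plus a TCP receiver and the Timer handler in the `fifo ps` listing.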

As OpenSIPS is not crashing, I also don't know how to get a core dump here.

_______________________________________________
Users mailing list
[email protected]
http://lists.opensips.org/cgi-bin/mailman/listinfo/users


--
Răzvan Crainea
OpenSIPS Core Developer
  http://www.opensips-solutions.com
