Which memory manager do you use, the default one, or do you set one via
the -x CLI parameter?
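
For reference, the manager can be selected at startup with -x; a minimal
sketch (the memory sizes and config path here are illustrative, not a
recommendation):

    # default memory manager is fm (f_malloc); qm (q_malloc) keeps extra
    # bookkeeping that helps when chasing leaks
    kamailio -x qm -m 8192 -M 64 -f /etc/kamailio/kamailio.cfg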

Cheers,
Daniel

On 18.07.24 17:09, Ihor Olkhovskyi wrote:
> Daniel,
>
> thanks, shm_status is returning A LOT of data and I am not sure how to
> analyze it. Regarding shm_summary, I am getting the same result, so I
> will just quote part of my previous message:
>
> corex.shm_summary returns the following:
>
> https://pastebin.com/tFYwygjU
>
> I see a lot of memory allocated to
>
> core/tcp_main.c: tcpconn_new(1201) (around 100 MB count= 1225),
> tls: tls_init.c: ser_malloc(364) (66 MB, count = 69815)
>
> But kamctl stats shmem is still showing 2 GB of used memory.
>
>
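As a side note, one way to condense that kind of output is to aggregate the per-allocation totals and sort them by size. A minimal Python sketch, assuming lines of the form `name(line): bytes` as in the kamcmd mod.stats output (the function name and sample values are illustrative):

```python
import re

def top_allocations(text, limit=5):
    """Return the largest 'name(line): bytes' entries, biggest first."""
    entries = [
        (m.group(1), int(m.group(2)))
        for m in re.finditer(r"(\w+\(\d+\)):\s*(\d+)", text)
    ]
    return sorted(entries, key=lambda e: e[1], reverse=True)[:limit]

# sample lines in the same shape as "kamcmd mod.stats all shm" output
sample = """\
tcpconn_new(1201): 130669280
wsconn_add(198): 54139856
ser_malloc(364): 87246112
"""

for label, size in top_allocations(sample):
    print(f"{label}: {size / (1024 * 1024):.1f} MB")
```

Feeding it the full mod.stats dump would surface the handful of allocation sites that actually matter instead of thousands of fragment lines.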
> Le jeu. 18 juil. 2024 à 09:53, Daniel-Constantin Mierla
> <[email protected]> a écrit :
>
>     Hello,
>
>     set the memlog value lower than or equal to the one for debug, and
>     memdbg to a higher value than debug.
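>
>     For example (a sketch, assuming the debug=2 from your config):
>
>     debug=2
>     memlog=2
>     memdbg=3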
>
>     Cheers,
>     Daniel
>
>     On 12.07.24 21:23, Ihor Olkhovskyi wrote:
>>
>>     Daniel,
>>
>>     Thanks for the answer,
>>
>>     Just a question: what should the options be for
>>
>>     debug=2
>>     log_stderror=no
>>
>>     memdbg=5
>>     memlog=5
>>
>>     mem_join=1
>>     mem_safety=1
>>
>>     to get the most out of
>>
>>     kamctl rpc corex.shm_summary?
>>
>>     Because when I set
>>
>>     kamcmd corex.debug 5 
>>
>>     I get the whole list of fragments, which is really not easy
>>     to read.
>>
>>     corex.shm_summary returns the following:
>>
>>     https://pastebin.com/tFYwygjU
>>
>>     I see a lot of memory allocated to
>>
>>     core/tcp_main.c: tcpconn_new(1201) (around 100 MB count= 1225),
>>     tls: tls_init.c: ser_malloc(364) (66 MB, count = 69815)
>>
>>     But kamctl stats shmem is still showing 2 GB of used memory.
>>
>>     Thanks in advance!
>>
>>     Le 10/07/2024 à 14:11, Daniel-Constantin Mierla a écrit :
>>>
>>>     Hello,
>>>
>>>     first, the value for -M is too high; I cannot easily think of a
>>>     case where one needs 1 GB of private memory for each Kamailio process.
>>>
>>>     You can try to run the RPC command shm.stats and look in the
>>>     syslog for the report, to check whether it is different from what
>>>     you get with mod.mem_stats.
>>>
>>>     Cheers,
>>>     Daniel
>>>
>>>     On 10.07.24 11:36, Ihor Olkhovskyi via sr-users wrote:
>>>>     Hello!
>>>>
>>>>     I'm trying to find out where all the SHM memory has gone. For the moment
>>>>     I'm using these settings:
>>>>     -m 8192 -M 1024
>>>>     which means 8 GB of SHM memory and 1 GB of PKG memory per process.
>>>>     Kamailio parameters are the following
>>>>
>>>>     fork=yes
>>>>     children=8
>>>>     tcp_children=12
>>>>     enable_tls=yes
>>>>     enable_sctp=no
>>>>     tls_threads_mode=2
>>>>     tcp_accept_no_cl=yes
>>>>     tcp_max_connections=63536
>>>>     tls_max_connections=63536
>>>>     tcp_accept_aliases=no
>>>>     tcp_async=yes
>>>>     tcp_connect_timeout=10
>>>>     tcp_conn_wq_max=63536
>>>>     tcp_crlf_ping=yes
>>>>     tcp_delayed_ack=yes
>>>>     tcp_fd_cache=yes
>>>>     tcp_keepalive=yes
>>>>     tcp_keepcnt=3
>>>>     tcp_keepidle=30
>>>>     tcp_keepintvl=10
>>>>     tcp_linger2=30
>>>>     tcp_rd_buf_size=80000
>>>>     tcp_send_timeout=10
>>>>     tcp_wq_blk_size=2100
>>>>     tcp_wq_max=10485760
>>>>     open_files_limit=63536
>>>>
>>>>     And with ~1000 TLS/WSS clients, I'm getting this after 3 days
>>>>     of running:
>>>>     {
>>>>       "jsonrpc":  "2.0",
>>>>       "result": [
>>>>         "shmem:fragments = 2796",
>>>>         "shmem:free_size = 3737947072", (3.7 G)
>>>>         "shmem:max_used_size = 4857418512", (4.8 G)
>>>>         "shmem:real_used_size = 4851987520",
>>>>         "shmem:total_size = 8589934592",
>>>>         "shmem:used_size = 4838988096"  (4.8 G)
>>>>       ],
>>>>       "id": 984479
>>>>     }
>>>>     That means half of the SHM memory is gone.
>>>>
>>>>     When running kamcmd mod.stats all shm, I get the following
>>>>     (output truncated to the largest values):
>>>>
>>>>     Module: core
>>>>     {
>>>>           ...
>>>>             tcpconn_new(1201): 130669280 (0.13 G)
>>>>           ...
>>>>             Total: 131657632 (0.13 G)
>>>>     }
>>>>
>>>>     Module: sl
>>>>     {
>>>>      ...
>>>>             Total: 7520
>>>>     }
>>>>
>>>>     Module: siptrace
>>>>     {
>>>>      ...
>>>>             Total: 13520
>>>>     }
>>>>
>>>>     Module: rr
>>>>     {
>>>>             Total: 0
>>>>     }
>>>>
>>>>     Module: dialog
>>>>     {
>>>>     ...
>>>>             Total: 146080
>>>>     }
>>>>
>>>>     Module: permissions
>>>>     {
>>>>     ...
>>>>             Total: 62144
>>>>     }
>>>>
>>>>     Module: htable
>>>>     {
>>>>       ....
>>>>             Total: 3359552 (3.3 M)
>>>>     }
>>>>
>>>>     Module: rtpengine
>>>>     {
>>>>      ...
>>>>             Total: 31456
>>>>     }
>>>>
>>>>     Module: textopsx
>>>>     {
>>>>             Total: 0
>>>>     }
>>>>
>>>>     Module: tsilo
>>>>     {
>>>>      ...
>>>>             Total: 75072
>>>>     }
>>>>
>>>>     Module: tm
>>>>     {
>>>>      ....
>>>>             Total: 3459504 (3.4 M)
>>>>     }
>>>>
>>>>     Module: usrloc
>>>>     {
>>>>      ....
>>>>             Total: 1217616
>>>>     }
>>>>
>>>>     Module: pua_dialoginfo
>>>>     {
>>>>      ...
>>>>             Total: 8720
>>>>     }
>>>>
>>>>     Module: pua
>>>>     {
>>>>      ...
>>>>             Total: 150848
>>>>     }
>>>>
>>>>     Module: pike
>>>>     {
>>>>      ...
>>>>             Total: 9504
>>>>     }
>>>>
>>>>     Module: websocket
>>>>     {
>>>>             wsconn_add(198): 54139856 (54 M)
>>>>      ...
>>>>             Total: 54148096 (54 M)
>>>>     }
>>>>
>>>>     Module: debugger
>>>>     {
>>>>      ...
>>>>             Total: 21872
>>>>     }
>>>>
>>>>     Module: tmx
>>>>     {
>>>>             Total: 512
>>>>     }
>>>>
>>>>     Module: kex
>>>>     {
>>>>            ....
>>>>             Total: 1888
>>>>     }
>>>>
>>>>     Module: tls
>>>>     {
>>>>     ...
>>>>             ser_malloc(364): 87246112 (87 M)
>>>>      ...
>>>>             Total: 87997168 (87 M)
>>>>     }
>>>>
>>>>     Module: secfilter
>>>>     {
>>>>     ...
>>>>             Total: 768
>>>>     }
>>>>
>>>>     Module: exec
>>>>     {
>>>>      ...
>>>>             Total: 16
>>>>     }
>>>>
>>>>     Module: dispatcher
>>>>     {
>>>>      ...
>>>>             Total: 2992
>>>>     }
>>>>
>>>>     Module: cfgutils
>>>>     {
>>>>      ...
>>>>             Total: 48
>>>>     }
>>>>
>>>>     Module: app_python3
>>>>     {
>>>>      ...
>>>>             Total: 32
>>>>     }
>>>>
>>>>     So I'm wondering: where has all the memory gone? Is there any way
>>>>     to get more detailed info on SHM usage?
>>>>
>>>>     Kamailio 5.8.2
>>>>
>>>>     -- 
>>>>     Thanks in advance,
>>>>     Ihor
>>>>
>>>>     __________________________________________________________
>>>>     Kamailio - Users Mailing List - Non Commercial Discussions
>>>>     To unsubscribe send an email to [email protected]
>>>>     Important: keep the mailing list in the recipients, do not reply only 
>>>> to the sender!
>>>>     Edit mailing list options or unsubscribe:
>>>     -- 
>>>     Daniel-Constantin Mierla (@ asipto.com <http://asipto.com>)
>>>     twitter.com/miconda <http://twitter.com/miconda> -- 
>>> linkedin.com/in/miconda <http://linkedin.com/in/miconda>
>>>     Kamailio Consultancy, Training and Development Services -- asipto.com 
>>> <http://asipto.com>
>
>     -- 
>     Daniel-Constantin Mierla (@ asipto.com <http://asipto.com>)
>     twitter.com/miconda <http://twitter.com/miconda> -- 
> linkedin.com/in/miconda <http://linkedin.com/in/miconda>
>     Kamailio Consultancy, Training and Development Services -- asipto.com 
> <http://asipto.com>
>
>
>
> -- 
> Best regards,
> Ihor (Igor)

-- 
Daniel-Constantin Mierla (@ asipto.com)
twitter.com/miconda -- linkedin.com/in/miconda
Kamailio Consultancy, Training and Development Services -- asipto.com