Hi,

When I set the additional parameter “--multiplier 10” (or any number other than the default 1), all calls succeed.

/usr/share/clearwater/bin/run_stress clearwater.opnfv 1000 10 --sipp-output 
--icscf-target 10.67.79.12:5052 --scscf-target 10.67.79.12:5054 --base-number 
2010000000 --multiplier 10  --ccf 10.67.79.11

Total calls: 1083
Successful calls: 1083 (100.0%)
Failed calls: 0 (0.0%)
Unfinished calls: 0

Two questions:
1) What is “multiplier” used for?
2) What does “ccf” refer to? The CCF IP I pass here is the address where the ralf service runs in my deployment. Is this option necessary?


Thanks,
Linda
________________________________
From: wang wulin <[email protected]>
Sent: 28 March 2018 14:40
To: [email protected]
Subject: [clearwater] Some calls failed with 480 when doing SIPp testing


Hi clearwater team,



The successful call rate is quite low when I run SIPp testing via the command:
/usr/share/clearwater/bin/run_stress clearwater.opnfv 100 30 --sipp-output --icscf-target sprout.clearwater.local:5052 --scscf-target sprout.clearwater.local:5054

After 7 successful calls, the 480 errors start occurring.

Here is one capture while running:

------------------------------ Scenario Screen -------- [1-9]: Change Screen --
  Call-rate(length)   Port   Total-time  Total-calls  Remote-host
0.0(5000 ms)/1.000s   5062    1234.12 s           22  10.67.79.12:5054(TCP)

  0 new calls during 1.004 s period      1 ms scheduler resolution
  0 calls (limit 1)                      Peak was 1 calls, after 55 s
  1 Running, 3 Paused, 3 Woken up
  0 dead call msg (discarded)            0 out-of-call msg (discarded)
  2 open sockets

                                 Messages  Retrans   Timeout   Unexpected-Msg
      INVITE ---------->         22        0         0
         100 <----------         22        0         0         0
         183 <----------         12        0         0         10
       PRACK ---------->         12        0
         200 <----------         12        0         0         0
      UPDATE ---------->         12        0
         201 <----------         12        0         0         0
         180 <----------  E-RTD1 12        0         0         0
         201 <----------         0         0         0         0
         200 <----------         12        0         0         0
         ACK ---------->         12        0
       Pause [   5000ms]         12                            0
         BYE ---------->         12        0         0
         200 <----------         12        0         0         0

------ [+|-|*|/]: Adjust rate ---- [q]: Soft exit ---- [p]: Pause traffic -----

Last Error: Aborting call on unexpected message for Call-Id '21-14814@10...


2018-03-28      06:33:54.190938 1522218834.190938: Aborting call on unexpected 
message for Call-Id '[email protected]': while expecting '183' (index 2), 
received 'SIP/2.0 480 Temporarily Unavailable
Via: SIP/2.0/TCP 
10.67.79.24:29014;received=10.67.79.24;branch=z9hG4bK-14814-17-0
Record-Route: 
<sip:scscf.sprout.clearwater.local:5054;transport=TCP;lr;billing-role=charge-term>
Record-Route: 
<sip:scscf.sprout.clearwater.local:5054;transport=TCP;lr;billing-role=charge-orig>
Call-ID: [email protected]
From: <sip:[email protected]>;tag=14814SIPpTag0017
To: <sip:[email protected]>;tag=z9hG4bK-14814-17-0
CSeq: 1 INVITE
P-Charging-Vector: 
icid-value="14814SIPpTag0017";orig-ioi=clearwater.opnfv;term-ioi=clearwater.opnfv
P-Charging-Function-Addresses: ccf=0.0.0.0
Content-Length:  0

'.




1) I deployed Clearwater via a test case named "cloudify_ims" from the opnfv/functest project, which runs 3 steps:

   * deploy a VNF orchestrator (Cloudify)
   * deploy a Clearwater vIMS (IP Multimedia Subsystem) VNF from this orchestrator based on a TOSCA blueprint defined in [1]
   * run a suite of signaling tests on top of this VNF

[1]: 
https://github.com/Orange-OpenSource/opnfv-cloudify-clearwater/archive/master.zip



8 instances were created, and I also created a new instance named "stress-node" according to this guidance:
http://clearwater.readthedocs.io/en/stable/Clearwater_stress_testing.html

bash-4.4# openstack server list
+--------------------------------------+-------------------------------------------------------+--------+---------------------------------------------------------------------------------------+----------------------+-----------+
| ID                                   | Name                                                  | Status | Networks                                                                              | Image                | Flavor    |
+--------------------------------------+-------------------------------------------------------+--------+---------------------------------------------------------------------------------------+----------------------+-----------+
| df2d9f82-8aa2-4514-9f07-27974267b590 | stress-node                                           | ACTIVE | cloudify_ims_network-6c79129a-8384-4069-81f7-a024738102cd=10.67.79.24                 | ubuntu_14.04         | m1.small  |
| 48a41da3-c55d-410c-a8f3-ef26bc409aef | server_clearwater-opnfv_bono_host_llp4mt              | ACTIVE | cloudify_ims_network-6c79129a-8384-4069-81f7-a024738102cd=10.67.79.23, 192.168.36.113 | ubuntu_14.04         | m1.small  |
| a992435c-a397-4622-b0d3-b8e515ebab51 | server_clearwater-opnfv_homer_host_vz97vi             | ACTIVE | cloudify_ims_network-6c79129a-8384-4069-81f7-a024738102cd=10.67.79.19                 | ubuntu_14.04         | m1.small  |
| 283fef46-78e4-4d0a-9e91-ffeb14e7bfb9 | server_clearwater-opnfv_sprout_host_nnoxye            | ACTIVE | cloudify_ims_network-6c79129a-8384-4069-81f7-a024738102cd=10.67.79.12                 | ubuntu_14.04         | m1.small  |
| bfe5111d-ed7d-4c65-80c2-20144935ee8c | server_clearwater-opnfv_ellis_host_bo7auh             | ACTIVE | cloudify_ims_network-6c79129a-8384-4069-81f7-a024738102cd=10.67.79.14, 192.168.36.111 | ubuntu_14.04         | m1.small  |
| b83e1727-b5a6-430e-8b5e-b3f0e6421675 | server_clearwater-opnfv_vellum_host_sh0ocy            | ACTIVE | cloudify_ims_network-6c79129a-8384-4069-81f7-a024738102cd=10.67.79.6                  | ubuntu_14.04         | m1.small  |
| 8b03d8fa-5873-489b-994b-5f1de24a85c0 | server_clearwater-opnfv_bind_host_b9f49w              | ACTIVE | cloudify_ims_network-6c79129a-8384-4069-81f7-a024738102cd=10.67.79.16, 192.168.36.110 | ubuntu_14.04         | m1.small  |
| 0ca72f55-7e05-4477-ad2e-2661ad49f7ce | server_clearwater-opnfv_dime_host_xryxen              | ACTIVE | cloudify_ims_network-6c79129a-8384-4069-81f7-a024738102cd=10.67.79.11                 | ubuntu_14.04         | m1.small  |
| 7d51e479-24b6-4db4-9b8f-3ef42be0b87e | server_clearwater-opnfv_proxy_host_rqr7bo             | ACTIVE | cloudify_ims_network-6c79129a-8384-4069-81f7-a024738102cd=10.67.79.5                  | ubuntu_14.04         | m1.small  |
| 82672984-b967-49ff-8f00-09ffe3f7fc87 | cloudify_manager-6c79129a-8384-4069-81f7-a024738102cd | ACTIVE | cloudify_ims_network-6c79129a-8384-4069-81f7-a024738102cd=10.67.79.13, 192.168.36.105 | cloudify_manager_4.0 | m1.medium |
+--------------------------------------+-------------------------------------------------------+--------+---------------------------------------------------------------------------------------+----------------------+-----------+


2) I checked the ralf log:
root@dime-au6gte:/var/log/ralf# vim ralf_20180328T060000Z.txt
28-03-2018 06:25:57.063 UTC Debug dnscachedresolver.cpp:543: Create and execute 
DNS query transaction
28-03-2018 06:25:57.063 UTC Debug dnscachedresolver.cpp:556: Wait for query 
responses
28-03-2018 06:25:57.064 UTC Debug dnscachedresolver.cpp:704: Received DNS 
response for _diameter._tcp.clearwater.opnfv type SRV
28-03-2018 06:25:57.064 UTC Warning dnscachedresolver.cpp:846: Failed to 
retrieve record for _diameter._tcp.clearwater.opnfv: Domain name not found
28-03-2018 06:25:57.064 UTC Debug dnscachedresolver.cpp:946: Adding 
_diameter._tcp.clearwater.opnfv to cache expiry list with deletion time of 
1522218957
28-03-2018 06:25:57.064 UTC Debug dnscachedresolver.cpp:704: Received DNS 
response for _diameter._sctp.clearwater.opnfv type SRV
28-03-2018 06:25:57.064 UTC Warning dnscachedresolver.cpp:846: Failed to 
retrieve record for _diameter._sctp.clearwater.opnfv: Domain name not found
28-03-2018 06:25:57.064 UTC Debug dnscachedresolver.cpp:946: Adding 
_diameter._sctp.clearwater.opnfv to cache expiry list with deletion time of 
1522218957
28-03-2018 06:25:57.064 UTC Debug dnscachedresolver.cpp:560: Received all query 
responses
28-03-2018 06:25:57.064 UTC Debug dnscachedresolver.cpp:588: Pulling 0 records 
from cache for _diameter._tcp.clearwater.opnfv SRV
28-03-2018 06:25:57.064 UTC Debug dnscachedresolver.cpp:588: Pulling 0 records 
from cache for _diameter._sctp.clearwater.opnfv SRV
28-03-2018 06:25:57.064 UTC Debug diameterresolver.cpp:131: TCP SRV record 
_diameter._tcp.clearwater.opnfv returned 0 records
28-03-2018 06:25:57.064 UTC Debug diameterresolver.cpp:134: SCTP SRV record 
_diameter._sctp.clearwater.opnfv returned 0 records
28-03-2018 06:25:57.064 UTC Error diameterstack.cpp:864: No Diameter peers have 
been found
28-03-2018 06:25:57.068 UTC Verbose httpstack.cpp:345: Process request for URL 
/ping, args (null)
28-03-2018 06:25:57.068 UTC Verbose httpstack.cpp:93: Sending response 200 to 
request for URL /ping, args (null)
28-03-2018 06:25:58.027 UTC Verbose httpstack.cpp:345: Process request for URL 
/call-id/2010000020%2F%2F%2F14990-8708%4010.67.79.24, args (null)
28-03-2018 06:25:58.027 UTC Debug handlers.cpp:128: Handling request, body:
{
    "peers": {
        "ccf": [
            ""
        ]
    },
    "event": {
        "Accounting-Record-Type": 1,
        "Event-Timestamp": 1522218357,
        "Service-Information": {
            "Subscription-Id": [
                {
                    "Subscription-Id-Type": 2,
                    "Subscription-Id-Data": "sip:[email protected]"
                }
            ],
            "IMS-Information": {
                "Event-Type": {
                    "SIP-Method": "REGISTER",
                    "Expires": 3600
                },
                "Role-Of-Node": 0,
                "Node-Functionality": 1,
                "User-Session-Id": "2010000020///[email protected]",
                "Called-Party-Address": "sip:[email protected]",
                "Associated-URI": [
                    "sip:[email protected]"
                ],
                "Time-Stamps": {
                    "SIP-Request-Timestamp": 1522218357,
                    "SIP-Request-Timestamp-Fraction": 960,
                    "SIP-Response-Timestamp": 1522218357,
                    "SIP-Response-Timestamp-Fraction": 967
                },
                "IMS-Charging-Identifier": "",
                "Cause-Code": -1,
                "From-Address": 
"<sip:[email protected]>;tag=8708SIPpTag0014990",
                "Route-Header-Received": 
"<sip:clearwater.opnfv;transport=TCP;lr>",
                "Route-Header-Transmitted": 
"<sip:icscf.sprout.clearwater.local:5052;transport=TCP;lr;orig>",
                "Instance-Id": "<urn:uuid:00000000-0000-0000-0000-000000000001>"
            }
        }
    }
}
28-03-2018 06:25:58.027 UTC Debug handlers.cpp:244: Adding CCF
28-03-2018 06:25:58.027 UTC Debug handlers.cpp:82: Handle the received message
28-03-2018 06:25:58.027 UTC Debug peer_message_sender.cpp:84: Sending message 
to  (number 0)
28-03-2018 06:25:58.027 UTC Debug rf.cpp:63: Building an Accounting-Request
28-03-2018 06:25:58.027 UTC Debug freeDiameter: No Session-Id AVP found in 
message 0x7f764c008d50
28-03-2018 06:25:58.027 UTC Verbose diameterstack.cpp:1470: Sending Diameter 
message of type 271 on transaction 0x7f764c008d00
28-03-2018 06:25:58.027 UTC Verbose httpstack.cpp:93: Sending response 200 to 
request for URL /call-id/2010000020%2F%2F%2F14990-8708%4010.67.79.24, args 
(null)
28-03-2018 06:25:58.027 UTC Debug diameterstack.cpp:397: Routing out callback 
from freeDiameter
28-03-2018 06:25:58.027 UTC Error diameterstack.cpp:293: Routing error: 'No 
remaining suitable candidate to route the message to' for message with 
Command-Code 271, Destination-Host  and Destination-Realm clearwater.opnfv
28-03-2018 06:25:58.027 UTC Debug freeDiameter: Iterating on rules of COMMAND: 
'(generic error format)'.
28-03-2018 06:25:58.027 UTC Debug freeDiameter: Calling callback registered 
when query was sent (0x43cb30, 0x7f764c008d00)
28-03-2018 06:25:58.027 UTC Verbose diameterstack.cpp:1130: Got Diameter 
response of type 271 - calling callback on transaction 0x7f764c008d00
28-03-2018 06:25:58.027 UTC Warning peer_message_sender.cpp:125: Failed to send 
ACR to  (number 0)
28-03-2018 06:25:58.027 UTC Error peer_message_sender.cpp:145: Failed to 
connect to all CCFs, message not sent
28-03-2018 06:25:58.027 UTC Warning session_manager.cpp:414: Session for 2010000020///[email protected] received error from CDF
28-03-2018 06:25:58.032 UTC Verbose httpstack.cpp:345: Process request for URL 
/call-id/2010000020%2F%2F%2F14990-8708%4010.67.79.24, args (null)
28-03-2018 06:25:58.032 UTC Debug handlers.cpp:128: Handling request, body:
{
    "peers": {
        "ccf": [
            ""
        ]
    },
    "event": {
        "Accounting-Record-Type": 1,
        "Event-Timestamp": 1522218357,
        "Service-Information": {
            "Subscription-Id": [
                {
                    "Subscription-Id-Type": 2,
                    "Subscription-Id-Data": "sip:[email protected]"
                }
            ],
            "IMS-Information": {
                "Event-Type": {
                    "SIP-Method": "REGISTER",
                    "Expires": 3600
                },
                "Role-Of-Node": 0,
                "Node-Functionality": 1,
                "User-Session-Id": "2010000020///[email protected]",
                "Called-Party-Address": "sip:[email protected]",
                "Associated-URI": [
                    "sip:[email protected]"
                ],
                "Time-Stamps": {
                    "SIP-Request-Timestamp": 1522218357,
                    "SIP-Request-Timestamp-Fraction": 968,
                    "SIP-Response-Timestamp": 1522218357,
                    "SIP-Response-Timestamp-Fraction": 974
                },
                "IMS-Charging-Identifier": "",
                "Cause-Code": -1,
                "From-Address": 
"<sip:[email protected]>;tag=8708SIPpTag0014990",
                "Route-Header-Received": 
"<sip:clearwater.opnfv;transport=TCP;lr>",
                "Route-Header-Transmitted": 
"<sip:icscf.sprout.clearwater.local:5052;transport=TCP;lr;orig>",
                "Instance-Id": "<urn:uuid:00000000-0000-0000-0000-000000000001>"
            }
        }
    }
}
28-03-2018 06:25:58.032 UTC Debug handlers.cpp:244: Adding CCF
28-03-2018 06:25:58.032 UTC Debug handlers.cpp:82: Handle the received message
28-03-2018 06:25:58.032 UTC Debug peer_message_sender.cpp:84: Sending message 
to  (number 0)
28-03-2018 06:25:58.032 UTC Debug rf.cpp:63: Building an Accounting-Request
28-03-2018 06:25:58.032 UTC Debug freeDiameter: No Session-Id AVP found in 
message 0x7f764c001310
28-03-2018 06:25:58.032 UTC Verbose diameterstack.cpp:1470: Sending Diameter 
message of type 271 on transaction 0x7f764c006f30
28-03-2018 06:25:58.032 UTC Verbose httpstack.cpp:93: Sending response 200 to 
request for URL /call-id/2010000020%2F%2F%2F14990-8708%4010.67.79.24, args 
(null)
28-03-2018 06:25:58.032 UTC Debug diameterstack.cpp:397: Routing out callback 
from freeDiameter
28-03-2018 06:25:58.032 UTC Error diameterstack.cpp:293: Routing error: 'No 
remaining suitable candidate to route the message to' for message with 
Command-Code 271, Destination-Host  and Destination-Realm clearwater.opnfv
28-03-2018 06:25:58.032 UTC Debug freeDiameter: Iterating on rules of COMMAND: 
'(generic error format)'.
28-03-2018 06:25:58.032 UTC Debug freeDiameter: Calling callback registered 
when query was sent (0x43cb30, 0x7f764c006f30)
28-03-2018 06:25:58.032 UTC Verbose diameterstack.cpp:1130: Got Diameter 
response of type 271 - calling callback on transaction 0x7f764c006f30
28-03-2018 06:25:58.032 UTC Warning peer_message_sender.cpp:125: Failed to send 
ACR to  (number 0)
28-03-2018 06:25:58.032 UTC Error peer_message_sender.cpp:145: Failed to 
connect to all CCFs, message not sent
28-03-2018 06:25:58.032 UTC Warning session_manager.cpp:414: Session for 2010000020///[email protected] received error from CDF

Have I missed something in shared_config?

root@dime-au6gte:/var/log/ralf# cat /etc/clearwater/shared_config
# Deployment definitions
home_domain=clearwater.opnfv
sprout_hostname=sprout.clearwater.local
chronos_hostname=10.67.79.16:7253
hs_hostname=hs.clearwater.local:8888
hs_provisioning_hostname=hs-prov.clearwater.local:8889
sprout_impi_store=vellum.clearwater.local
sprout_registration_store=vellum.clearwater.local
cassandra_hostname=vellum.clearwater.local
chronos_hostname=vellum.clearwater.local
ralf_session_store=vellum.clearwater.local
ralf_hostname=ralf.clearwater.local:10888
xdms_hostname=homer.clearwater.local:7888
signaling_dns_server=10.67.79.16
snmp_ip=10.67.79.11
reg_max_expires=1800
bgcf=0



# Email server configuration
smtp_smarthost=localhost
smtp_username=username
smtp_password=password
[email protected]

# Keys
signup_key=secret
turn_workaround=secret
ellis_api_key=secret
ellis_cookie_key=secret
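Given the "Domain name not found" warnings in the ralf log, I suspect the signaling DNS zone has no Diameter SRV records. One way to confirm from the dime node (assuming dig is installed; 10.67.79.16 is signaling_dns_server and clearwater.opnfv is home_domain, both from this shared_config):

```shell
# ralf locates Diameter (Rf/CDF) peers via these two SRV lookups; if both
# come back empty, the "No Diameter peers have been found" error follows.
dig @10.67.79.16 SRV _diameter._tcp.clearwater.opnfv +short
dig @10.67.79.16 SRV _diameter._sctp.clearwater.opnfv +short
```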


root@dime-au6gte:~# cat /etc/clearwater/local_config
local_ip=10.67.79.11
public_ip=
public_hostname=dime-au6gte.clearwater.local



homestead seems fine, except for one line, "Failed TWO read for get_row. Try ONE":

root@dime-au6gte:/var/log/homestead# vim homestead_20180328T060000Z.txt
28-03-2018 06:30:58.045 UTC Debug baseresolver.cpp:425: Attempt to parse 
vellum.clearwater.local as IP address
28-03-2018 06:30:58.045 UTC Verbose dnscachedresolver.cpp:486: Check cache for 
vellum.clearwater.local type 1
28-03-2018 06:30:58.045 UTC Debug dnscachedresolver.cpp:588: Pulling 1 records 
from cache for vellum.clearwater.local A
28-03-2018 06:30:58.045 UTC Debug baseresolver.cpp:366: Found 1 A/AAAA records, 
creating iterator
28-03-2018 06:30:58.045 UTC Debug baseresolver.cpp:819: 10.67.79.6:9160 
transport 6 has state: WHITE
28-03-2018 06:30:58.045 UTC Debug baseresolver.cpp:819: 10.67.79.6:9160 
transport 6 has state: WHITE
28-03-2018 06:30:58.045 UTC Debug baseresolver.cpp:1004: Added a whitelisted 
server, now have 1 of 1
28-03-2018 06:30:58.045 UTC Debug connection_pool.h:231: Request for connection 
to IP: 10.67.79.6, port: 9160
28-03-2018 06:30:58.045 UTC Debug connection_pool.h:244: Found existing 
connection 0x7fed8c0115a0 in pool
28-03-2018 06:30:58.045 UTC Debug cassandra_store.cpp:159: Generated Cassandra 
timestamp 1522218658045331
28-03-2018 06:30:58.045 UTC Debug cache.cpp:350: Issuing get for key 
sip:[email protected]
28-03-2018 06:30:58.045 UTC Debug cassandra_store.cpp:731: Failed TWO read for 
get_row. Try ONE
28-03-2018 06:30:58.045 UTC Debug cache.cpp:370: Retrieved XML column with TTL 0 and value <?xml version='1.0' encoding='UTF-8'?><IMSSubscription xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="CxDataType.xsd"><PrivateID>[email protected]</PrivateID><ServiceProfile><InitialFilterCriteria><TriggerPoint><ConditionTypeCNF>0</ConditionTypeCNF><SPT><ConditionNegated>0</ConditionNegated><Group>0</Group><Method>INVITE</Method><Extension /></SPT></TriggerPoint><ApplicationServer><ServerName>sip:mmtel.clearwater.opnfv</ServerName><DefaultHandling>0</DefaultHandling></ApplicationServer></InitialFilterCriteria><PublicIdentity><BarringIndication>1</BarringIndication><Identity>sip:[email protected]</Identity></PublicIdentity></ServiceProfile></IMSSubscription>
28-03-2018 06:30:58.045 UTC Debug cache.cpp:388: Retrieved is_registered column 
with value False and TTL 0
28-03-2018 06:30:58.045 UTC Debug baseresolver.cpp:830: Successful response 
from  10.67.79.6:9160 transport 6
28-03-2018 06:30:58.045 UTC Debug connection_pool.h:267: Release connection to 
IP: 10.67.79.6, port: 9160 to pool
28-03-2018 06:30:58.045 UTC Debug handlers.cpp:1245: Got IMS subscription from 
cache
28-03-2018 06:30:58.045 UTC Debug handlers.cpp:1260: TTL for this database 
record is 0, IMS Subscription XML is not empty, registration state is 
UNREGISTERED, and the charging addresses are empty
28-03-2018 06:30:58.045 UTC Debug handlers.cpp:1510: Rejecting deregistration 
for user who was not registered
28-03-2018 06:30:58.045 UTC Verbose httpstack.cpp:93: Sending response 400 to 
request for URL /impu/sip%3A2010000021%40clearwater.opnfv/reg-data, args (null)
28-03-2018 06:30:58.046 UTC Debug cache.cpp:370: Retrieved XML column with TTL 0 and value <?xml version='1.0' encoding='UTF-8'?><IMSSubscription xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="CxDataType.xsd"><PrivateID>[email protected]</PrivateID><ServiceProfile><InitialFilterCriteria><TriggerPoint><ConditionTypeCNF>0</ConditionTypeCNF><SPT><ConditionNegated>0</ConditionNegated><Group>0</Group><Method>INVITE</Method><Extension /></SPT></TriggerPoint><ApplicationServer><ServerName>sip:mmtel.clearwater.opnfv</ServerName><DefaultHandling>0</DefaultHandling></ApplicationServer></InitialFilterCriteria><PublicIdentity><BarringIndication>1</BarringIndication><Identity>sip:[email protected]</Identity></PublicIdentity></ServiceProfile></IMSSubscription>
28-03-2018 06:30:58.046 UTC Debug cache.cpp:388: Retrieved is_registered column 
with value False and TTL 0
28-03-2018 06:30:58.046 UTC Debug baseresolver.cpp:830: Successful response 
from  10.67.79.6:9160 transport 6
28-03-2018 06:30:58.046 UTC Debug connection_pool.h:267: Release connection to 
IP: 10.67.79.6, port: 9160 to pool
28-03-2018 06:30:58.046 UTC Debug handlers.cpp:1245: Got IMS subscription from 
cache
28-03-2018 06:30:58.046 UTC Debug handlers.cpp:1260: TTL for this database 
record is 0, IMS Subscription XML is not empty, registration state is 
UNREGISTERED, and the charging addresses are empty
28-03-2018 06:30:58.046 UTC Debug handlers.cpp:1286: Subscriber registering 
with new binding
28-03-2018 06:30:58.046 UTC Debug handlers.cpp:1457: Handling initial 
registration
28-03-2018 06:30:58.046 UTC Debug handlers.cpp:1654: Attempting to cache IMS 
subscription for public IDs
28-03-2018 06:30:58.046 UTC Debug handlers.cpp:1658: Got public IDs to cache 
against - doing it
28-03-2018 06:30:58.046 UTC Debug handlers.cpp:1663: Public ID 
sip:[email protected]
28-03-2018 06:30:58.046 UTC Debug cassandra_store.cpp:159: Generated Cassandra 
timestamp 1522218658046440
28-03-2018 06:30:58.046 UTC Debug a_record_resolver.cpp:80: 
ARecordResolver::resolve_iter for host vellum.clearwater.local, port 9160, 
family 2
28-03-2018 06:30:58.046 UTC Debug baseresolver.cpp:425: Attempt to parse 
vellum.clearwater.local as IP address
28-03-2018 06:30:58.046 UTC Verbose dnscachedresolver.cpp:486: Check cache for 
vellum.clearwater.local type 1
28-03-2018 06:30:58.046 UTC Debug dnscachedresolver.cpp:588: Pulling 1 records 
from cache for vellum.clearwater.local A
28-03-2018 06:30:58.046 UTC Debug baseresolver.cpp:366: Found 1 A/AAAA records, 
creating iterator
28-03-2018 06:30:58.046 UTC Debug baseresolver.cpp:819: 10.67.79.6:9160 
transport 6 has state: WHITE
28-03-2018 06:30:58.046 UTC Debug baseresolver.cpp:819: 10.67.79.6:9160 
transport 6 has state: WHITE
28-03-2018 06:30:58.046 UTC Debug baseresolver.cpp:1004: Added a whitelisted 
server, now have 1 of 1
28-03-2018 06:30:58.046 UTC Debug connection_pool.h:231: Request for connection 
to IP: 10.67.79.6, port: 9160
28-03-2018 06:30:58.046 UTC Debug connection_pool.h:244: Found existing 
connection 0x7fed8c0115a0 in pool
28-03-2018 06:30:58.046 UTC Debug cassandra_store.cpp:612: Constructing 
cassandra put request with timestamp 1522218658046440 and per-column TTLs
28-03-2018 06:30:58.046 UTC Debug cassandra_store.cpp:629:   ims_subscription_xml => <?xml version='1.0' encoding='UTF-8'?><IMSSubscription xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="CxDataType.xsd"><PrivateID>[email protected]</PrivateID><ServiceProfile><InitialFilterCriteria><TriggerPoint><ConditionTypeCNF>0</ConditionTypeCNF><SPT><ConditionNegated>0</ConditionNegated><Group>0</Group><Method>INVITE</Method><Extension /></SPT></TriggerPoint><ApplicationServer><ServerName>sip:mmtel.clearwater.opnfv</ServerName><DefaultHandling>0</DefaultHandling></ApplicationServer></InitialFilterCriteria><PublicIdentity><BarringIndication>1</BarringIndication><Identity>sip:[email protected]</Identity></PublicIdentity></ServiceProfile></IMSSubscription> (TTL 0)
28-03-2018 06:30:58.046 UTC Debug cassandra_store.cpp:629:   is_registered =>  
(TTL 0)
28-03-2018 06:30:58.046 UTC Debug cassandra_store.cpp:650: Executing put 
request operation
28-03-2018 06:30:58.047 UTC Debug baseresolver.cpp:830: Successful response 
from  10.67.79.6:9160 transport 6
28-03-2018 06:30:58.047 UTC Debug connection_pool.h:267: Release connection to 
IP: 10.67.79.6, port: 9160 to pool
28-03-2018 06:30:58.047 UTC Debug handlers.cpp:1567: Sending 200 response (body 
was {"reqtype": "reg", "server_name": 
"sip:scscf.sprout.clearwater.local:5054;transport=TCP"})
28-03-2018 06:30:58.047 UTC Verbose httpstack.cpp:93: Sending response 200 to 
request for URL /impu/sip%3A2010000021%40clearwater.opnfv/reg-data, args 
private_id=2010000021%40clearwater.opnfv



The error "Could not get subscriber data from HSS" occasionally appears on the sprout node.

root@sprout-2tl8g3:/var/log/sprout# vim sprout_20180328T060000Z.txt
28-03-2018 06:36:11.741 UTC Error httpclient.cpp:712: cURL failure with cURL 
error code 0 (see man 3 libcurl-errors) and HTTP error code 400
28-03-2018 06:36:11.741 UTC Error hssconnection.cpp:704: Could not get 
subscriber data from HSS
28-03-2018 06:36:25.875 UTC Status alarm.cpp:62: sprout issued 1004.1 alarm




root@dime-au6gte:/var/log/homestead# monit summary
Monit 5.18.1 uptime: 2h 27m
 Service Name                     Status                      Type
 node-dime-au6gte.clearwater....  Running                     System
 snmpd_process                    Running                     Process
 ralf_process                     Running                     Process
 ntp_process                      Running                     Process
 nginx_process                    Running                     Process
 homestead_process                Running                     Process
 homestead-prov_process           Running                     Process
 clearwater_queue_manager_pro...  Running                     Process
 etcd_process                     Running                     Process
 clearwater_diags_monitor_pro...  Running                     Process
 clearwater_config_manager_pr...  Running                     Process
 clearwater_cluster_manager_p...  Running                     Process
 ralf_uptime                      Status ok                   Program
 poll_ralf                        Status ok                   Program
 nginx_ping                       Status ok                   Program
 nginx_uptime                     Status ok                   Program
 monit_uptime                     Status ok                   Program
 homestead_uptime                 Status ok                   Program
 poll_homestead                   Status ok                   Program
 check_cx_health                  Status ok                   Program
 poll_homestead-prov              Status ok                   Program
 clearwater_queue_manager_uptime  Status ok                   Program
 etcd_uptime                      Status ok                   Program
 poll_etcd_cluster                Status ok                   Program
 poll_etcd                        Status ok                   Program


Any help/suggestion would be much appreciated.




Thanks,

Linda
_______________________________________________
Clearwater mailing list
[email protected]
http://lists.projectclearwater.org/mailman/listinfo/clearwater_lists.projectclearwater.org
