Hi list,

Finally, my problem has evolved and is, I hope, now solved.
Ellie, you correctly pinpointed that the [sprout]
/etc/clearwater/cluster_settings file had been wrongly populated by etcd with
my Bono host address!

However, to get my cluster working again, stopping all the clearwater-*
services was not enough; I had to remove and reinstall all the clearwater-*
packages (from
http://clearwater.readthedocs.org/en/stable/Manual_Install/index.html).
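
For the record, the sequence I'd now recommend on each node is roughly the
following (a sketch only; substitute the package for the node's role, e.g.
bono, homestead, homer or ellis instead of sprout):

  sudo monit unmonitor all            # stop monit restarting processes mid-removal
  sudo apt-get remove --purge sprout  # remove the role package and its config
  sudo apt-get autoremove             # the step I had skipped (see the note below)
  sudo apt-get install sprout         # reinstall per the manual-install page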

Also, since I had not run an apt-get autoremove, some leftover files were
still on my systems.
Indeed, running "python create_numbers.py --start 6505550000 --count 1000"
reported that the numbers were already in the database (Created 0 numbers,
1000 already present in database).

So I ran "python create_numbers.py --start 6505552000 --count 1000" that
returned the following: Created 1000 numbers, 0 already present in database.

Now 90% of the tests pass!
I still have some 403 errors, but that's another story.
I now have to work out how to tell whether the sipp testing passed or
not.
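
My current plan is to rely on sipp's documented exit code and error traces
(a sketch; the scenario file and target here are placeholders):

  sipp -sf scenario.xml <bono-ip> -trace_err -trace_stat
  echo $?   # sipp documents 0 = all calls passed, 1 = at least one call failed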

Thank you so much for your time Ellie!

Yours sincerely,
Austin


2015-10-02 12:56 GMT+02:00 Austin Marston <[email protected]>:

> Hello Ellie,
>
> The local_config file is ok.
> The cluster_setting file is now well configured automatically with etcd.
>
>
> Here are the results of the command :
> *[sprout]* user@cw-002:~$ sudo monit summary
>
> The Monit daemon 5.8.1 uptime: 1h 34m
>
> Process 'sprout_process'            Running
> Program 'poll_sprout_sip'           Status ok
> Program 'poll_sprout_http'          Status ok
> Process 'snmpd_process'             Running
> Process 'ntp_process'               Running
> System 'node-cw-002.svc.laas.fr'    Running
> Process 'memcached_process'         Running
> Program 'poll_memcached'            Status ok
> Process 'etcd_process'              Running
> Program 'poll_etcd_cluster'         Waiting
> Program 'poll_etcd'                 Status ok
> Process 'clearwater_diags_monitor_process' Running
> Process 'clearwater_config_manager' Running
> *Process 'clearwater_cluster_manager' Execution failed*
> Process 'chronos_process'           Running
> Program 'poll_chronos'              Status ok
> Process 'astaire_process'           Running
>
> *[homestead]* user@cw-004:~$ sudo monit summary
> [sudo] password for user:
> The Monit daemon 5.8.1 uptime: 1h 38m
>
> Process 'snmpd_process'             Running
> Process 'ntp_process'               Running
> System 'node-cw-004.svc.laas.fr'    Running
> Process 'nginx_process'             Running
> Process 'homestead_process'         Running
> Program 'poll_homestead'            Status ok
> Process 'homestead-prov_process'    Running
> Program 'poll_homestead-prov'       Status ok
> *Process 'etcd_process'              Does not exist*
> Program 'poll_etcd_cluster'         Initializing
> Program 'poll_etcd'                 Initializing
> Process 'clearwater_diags_monitor_process' Running
> Process 'clearwater_config_manager' Running
> *Process 'clearwater_cluster_manager' Running*
> Process 'cassandra_process'         Running
>
> The cluster_manager has also failed on Ellis and Bono.
>
> Austin
>
> 2015-10-02 12:02 GMT+02:00 Eleanor Merry <[email protected]>:
>
>> The cluster_settings file is created from the values in
>> /etc/clearwater/local_config. Can you check that the values in your
>> local_config file are correct for the Sprout node, and change the cluster
>> settings file to use Sprout’s local IP, not Bono’s.
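>>
>> For illustration, a single-node Sprout cluster_settings is just one line
>> pointing at Sprout's own memcached; using the Sprout IP you gave below,
>> that would be:
>>
>>   servers=172.16.1.21:11211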
>>
>>
>>
>> Also, what’s the result of running ‘sudo monit summary’ on your Sprout
>> node?
>>
>>
>>
>> Ellie
>>
>>
>>
>> *From:* Austin Marston [mailto:[email protected]]
>> *Sent:* 01 October 2015 16:23
>>
>> *To:* Eleanor Merry
>> *Subject:* Re: [Clearwater] Manual installation does not pass any live
>> test
>>
>>
>>
>> I am not using clustering for the moment and I did not configure
>> clearwater-cluster-manager.
>>
>> I followed the simplest deployment from this page
>> http://clearwater.readthedocs.org/en/stable/Manual_Install/index.html without
>> following the Larger Scale Deployment part.
>>
>>
>>
>> The /etc/clearwater/cluster_settings file was created automatically and
>> with the 172.16.1.20 address like you found, which is my Bono host.
>>
>>
>>
>>
>>
>> Austin
>>
>>
>>
>> 2015-10-01 17:02 GMT+02:00 Eleanor Merry <[email protected]>:
>>
>> Hi Austin,
>>
>>
>>
>> You’re seeing repeated 401s in the tests because Sprout can’t write the
>> authentication vectors to memcached.
>>
>>
>>
>> On an initial register, Sprout gets an authentication vector from
>> Homestead, stores it in memcached, then responds with a 401 challenge to
>> the client. The client then should send it a new REGISTER with an
>> authorization header. Sprout will then test the supplied authorisation
>> credentials against the authentication vector it’s just stored, and if it
>> succeeds respond with a 200.
>>
>>
>>
>> In this case, Sprout doesn’t have a stored authentication vector (as the
>> write to memcached failed), so it repeats the first step.
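>>
>> Schematically, the expected exchange looks like this (a sketch, not a
>> real trace):
>>
>>   Client                    Sprout                     Homestead
>>     |-- REGISTER (no creds)-->|-- fetch auth vector ----->|
>>     |                         |<-- AV (digest) -----------|
>>     |                         |   write AV to memcached
>>     |<-- 401 challenge -------|
>>     |-- REGISTER + creds ---->|   read AV from memcached, verify
>>     |<-- 200 OK --------------|
>>
>> The failing step for you is 'write AV to memcached', which is why the
>> second REGISTER finds nothing stored and gets another 401.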
>>
>>
>>
>> Looking at the logs in detail, we see:
>>
>> 01-10-2015 12:36:06.700 UTC Debug memcachedstore.cpp:916: ADD/CAS
>> returned rc = 5 (WRITE FAILURE)
>>
>> (19914928) CONNECTION FAILURE(Connection refused),  host:
>> 172.16.1.20:11211 -> libmemcached/connect.cc:131
>>
>>
>>
>> This is attempting to connect to host 172.16.1.20 – this doesn’t match
>> the Sprout IP address you’ve put below. Can you check that the values in
>> /etc/clearwater/cluster_settings are correct for your current system (you
>> can see more details at
>> http://clearwater.readthedocs.org/en/latest/Old_Manual_Install/index.html#larger-scale-deployments,
>> Clustering Sprout).
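>>
>> A quick way to sanity-check this from the Sprout node (a sketch; assumes
>> the standard memcached port 11211 and a netcat that supports -q):
>>
>>   cat /etc/clearwater/cluster_settings   # servers= should list Sprout's IP, not Bono's
>>   printf 'stats\r\n' | nc -q 1 172.16.1.21 11211 | head -3   # memcached should answer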
>>
>>
>>
>> Also, are you using our automatic clustering tool
>> (clearwater-cluster-manager)?
>>
>>
>>
>> Ellie
>>
>>
>>
>> *From:* Austin Marston [mailto:[email protected]]
>> *Sent:* 01 October 2015 15:13
>> *To:* Eleanor Merry
>> *Subject:* Re: [Clearwater] Manual installation does not pass any live
>> test
>>
>>
>>
>> Yes, I ran the create_numbers.py script.
>>
>> Indeed, I can create users manually if I want to, via Ellis's interface.
>>
>> I created 2 users as my first tests before trying the live tests.
>>
>>
>>
>> Actually, the access logs are quite small, but the sprout_<date> and
>> homestead_<date> files are huge, so they are truncated here.
>>
>>
>>
>> Thanks,
>>
>> Austin
>>
>>
>>
>> 2015-10-01 15:18 GMT+02:00 Eleanor Merry <[email protected]>:
>>
>> The file is a bit too large. Can you just send the homestead_<date> and
>> sprout_<date> logs (so not the access logs)?
>>
>>
>>
>> Also, the clearwater-live-tests create their own subscribers – you
>> shouldn’t create any yourself (although you do need to run the
>> create_numbers.py script on Ellis so that it has a pool of numbers it can
>> provision subscribers from during the live tests – see
>> https://github.com/Metaswitch/ellis/blob/dev/docs/create-numbers.md for
>> more details).
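>>
>> For example, on the Ellis node (adjust the starting number and count as
>> needed):
>>
>>   python create_numbers.py --start 6505550000 --count 1000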
>>
>>
>>
>> Ellie
>>
>>
>>
>> *From:* Austin Marston [mailto:[email protected]]
>> *Sent:* 01 October 2015 14:10
>> *To:* Eleanor Merry
>> *Subject:* Re: [Clearwater] Manual installation does not pass any live
>> test
>>
>>
>>
>>
>> Hi Ellie,
>>
>>
>>
>> You'll find in Google Drive, as attachments, the Homestead and Sprout logs
>> corresponding to a new run of the live tests.
>>
>> They are the most recent log files (4 files in total) from both the Sprout
>> and Homestead VMs.
>>
>> -The test was run between 12:36 and 12:42 on 10/01/2015.
>>
>> -172.16.1.28 is my test client
>>
>> -172.16.1.21 is Sprout.
>>
>> -172.16.1.23 is Homestead.
>>
>> -172.16.1.29 is the DNS server.
>>
>>
>>
>> Debug logging (log level 5) was already enabled on Homestead but not on Sprout.
>>
>>
>>
>> Two more notes:
>>
>> -> I just noticed that, while running all the tests, at the very end of the
>> run I see this output on my test client:
>>
>> Failed to delete leaked number, check Ellis logs
>>
>> -> I also have issues running the sipp testing. Even though I created the
>> users following all the steps of the "Bulk-provisioning numbers" page and
>> ran the scripts, I get some 403 and 401 errors in Bono and nothing is
>> created in the Ellis database.
>>
>>
>>
>>
>>
>> Thank you so much for your help!
>>
>>
>>
>> Austin
>>
>>
>>
>> 2015-10-01 12:42 GMT+02:00 Eleanor Merry <[email protected]>:
>>
>> Hi Austin,
>>
>>
>>
>> Can you please send me the full Sprout and Homestead logs for this (as
>> attachments). Can you also turn on debug logging - to do so create/edit the
>> file /etc/clearwater/user_settings, add log_level=5 and then restart
>> Sprout/Homestead (service <sprout/homestead> stop – they’re automatically
>> restarted by monit).
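>>
>> For example, on Sprout (assuming log_level isn't already set in the file):
>>
>>   sudo sh -c 'echo log_level=5 >> /etc/clearwater/user_settings'
>>   sudo service sprout stop   # monit brings it straight back up with debug logging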
>>
>>
>>
>> Thanks,
>>
>>
>>
>> Ellie
>>
>>
>>
>> *From:* Clearwater [mailto:[email protected]]
>> *On Behalf Of *Austin Marston
>> *Sent:* 30 September 2015 09:04
>> *To:* [email protected]
>> *Subject:* [Clearwater] Manual installation does not pass any live test
>>
>>
>>
>> Hi list,
>>
>>
>> I have just manually deployed, on a VMware platform, five VMs running bono,
>> homer, sprout, homestead, and ellis respectively, with no HSS lookups and no
>> I-CSCF function.
>> I tried to test it with clearwater-live-test, but every test fails with
>> the following error:
>> RuntimeError thrown: - Expected 200, got 401
>>
>>
>> I understood from old messages from the list that "401s are expected. The
>> first time a client registers with Clearwater it will not provide
>> authentication credentials. Sprout rejects the REGISTER message with a 401
>> response, containing an authentication challenge. The client then sends
>> another REGISTER with credentials."
>> But I still cannot see what I'm missing here, even with level 5 logs.
>> Indeed, Bono seems to fail with both 403 and 401 errors.
>>
>>
>> In the remainder of this email I include parts of the logs from each VM to
>> illustrate my problem.
>>
>>
>> *Sprout logs:*
>> 29-09-2015 16:31:33.994 UTC Error hssconnection.cpp:147: Failed to get
>> Authentication Vector for [email protected]
>> 29-09-2015 16:31:34.005 UTC Error httpconnection.cpp:743: cURL failure
>> with cURL error code 0 (see man 3 libcurl-errors) and HTTP error code 404
>> 29-09-2015 16:31:34.005 UTC Error hssconnection.cpp:147: Failed to get
>> Authentication Vector for [email protected]
>>
>> [...]
>> 29-09-2015 16:31:34.145 UTC Error hssconnection.cpp:147: Failed to get
>> Authentication Vector for [email protected]
>> 29-09-2015 16:31:39.414 UTC Status load_monitor.cpp:237: Maximum incoming
>> request rate/second increased to 1675.491699 (based on a smoothed mean
>> latency of 2810 and 0 upstream overload responses)
>> 29-09-2015 16:31:41.596 UTC Error memcachedstore.cpp:719: Failed to write
>> data for av\\[email protected]\0826351d69f0f550 to 1 replicas
>> 29-09-2015 16:31:41.596 UTC Error avstore.cpp:84: Failed to write
>> Authentication Vector for private_id [email protected]
>> 29-09-2015 16:31:41.830 UTC Error memcachedstore.cpp:514: Failed to read
>> data for av\\[email protected]\0826351d69f0f550 from 1 replicas
>> 29-09-2015 16:31:41.830 UTC Warning authentication.cpp:242: Received an
>> authentication request for [email protected] with nonce
>> 0826351d69f0f550, but no matching AV found
>>
>>
>>
>>
>> *Homer logs:*
>> 29-09-2015 16:31:33.566 UTC DEBUG base.py:283: Writing response body:
>> {}
>> 29-09-2015 16:31:33.566 UTC INFO base.py:269: Sending 200 response to
>> localhost for PUT
>> http://http_homer/org.etsi.ngn.simservs/users/sip%3A6505550980%40my.local.domain/simservs.xml
>> 29-09-2015 16:31:39.797 UTC INFO base.py:256: Received request from
>> localhost - PUT
>> http://http_homer/org.etsi.ngn.simservs/users/sip%3A6505550758%40my.local.domain/simservs.xml
>> 29-09-2015 16:31:39.798 UTC INFO xsd.py:76: Performing XSD
>> validation
>> 29-09-2015 16:31:39.801 UTC DEBUG base.py:283: Writing response body:
>> {}
>> 29-09-2015 16:31:39.802 UTC INFO base.py:269: Sending 200 response to
>> localhost for PUT
>> http://http_homer/org.etsi.ngn.simservs/users/sip%3A6505550758%40my.local.domain/simservs.xml
>> 29-09-2015 16:31:43.436 UTC INFO base.py:256: Received request from
>> localhost - PUT
>> http://http_homer/org.etsi.ngn.simservs/users/sip%3A6505550048%40my.local.domain/simservs.xml
>> 29-09-2015 16:31:43.436 UTC INFO xsd.py:76: Performing XSD
>> validation
>> 29-09-2015 16:31:43.439 UTC DEBUG base.py:283: Writing response body:
>> {}
>>
>>
>>
>>
>> *Homestead-prov logs:*
>> 29-09-2015 16:31:33.538 UTC INFO base.py:269: Sending 200 response to
>> localhost for PUT
>> http://http_homestead_prov/irs/1004b727-c478-4123-98f6-9b63802c6a8c/service_profiles/19e5a738-f3ec-4664-bd2b-958ab5a8310a/filter_criteria
>> 29-09-2015 16:31:34.183 UTC INFO base.py:256: Received request from
>> localhost - PUT
>> http://http_homestead_prov/private/6505550473%40my.local.domain
>> 29-09-2015 16:31:34.184 UTC DEBUG models.py:303: Create private ID
>> [email protected]
>> 29-09-2015 16:31:34.185 UTC DEBUG cache.py:85: Put private ID 
>> '[email protected]'
>> into cache with AV: {'digest': {'ha1': u'33accced67a669ca7b3d7e96640ef7f4',
>> 'realm': u'my.local.domain', 'qop': 'auth'}}
>> 29-09-2015 16:31:34.187 UTC INFO base.py:269: Sending 200 response to
>> localhost for PUT
>> http://http_homestead_prov/private/6505550473%40my.local.domain
>>
>>
>> *Homestead logs:*
>> 29-09-2015 16:31:33.999 UTC Debug cassandra_store.cpp:731: Failed ONE
>> read for get_columns. Try QUORUM
>> 29-09-2015 16:31:34.000 UTC Debug cassandra_store.cpp:432: Cassandra
>> request failed: rc=2, Row [email protected] not present in
>> column_family impi
>> 29-09-2015 16:31:34.000 UTC Debug handlers.cpp:196: Cache query failed -
>> reject request
>> 29-09-2015 16:31:34.000 UTC Debug handlers.cpp:201: No cached av found
>> for private ID [email protected], public ID
>> sip:[email protected] - reject
>> 29-09-2015 16:31:34.000 UTC Verbose httpstack.cpp:69: Sending response
>> 404 to request for URL /impi/2010018950%40my.local.domain/av, args
>> impu=sip%3A2010018950%40my.local.domain
>> 29-09-2015 16:31:34.000 UTC Verbose httpstack.cpp:286: Process request
>> for URL /impi/2010070984%40my.local.domain/av, args
>> impu=sip%3A2010070984%40my.local.domain
>> 29-09-2015 16:31:34.000 UTC Debug handlers.cpp:147: Parsed HTTP request:
>> private ID [email protected], public ID
>> sip:[email protected], scheme Unknown, authorization
>> 29-09-2015 16:31:34.000 UTC Debug handlers.cpp:167: Querying cache for
>> authentication vector for
>> [email protected]/sip:[email protected]
>> 29-09-2015 16:31:34.000 UTC Debug cassandra_store.cpp:279: Getting
>> thread-local Client
>> 29-09-2015 16:31:34.000 UTC Debug cache.cpp:673: Looking for
>> authentication vector for [email protected]
>> 29-09-2015 16:31:34.000 UTC Debug cache.cpp:686: Checking public ID
>> sip:[email protected]
>> 29-09-2015 16:31:34.000 UTC Debug cache.cpp:696: Issuing cache
>> query
>> 29-09-2015 16:31:34.000 UTC Debug cassandra_store.cpp:731: Failed ONE
>> read for get_columns. Try QUORUM
>> 29-09-2015 16:31:34.001 UTC Verbose httpstack.cpp:286: Process request
>> for URL /impi/2010052484%40my.local.domain/av, args
>> impu=sip%3A2010052484%40my.local.domain
>> 29-09-2015 16:31:34.001 UTC Debug handlers.cpp:147: Parsed HTTP request:
>> private ID [email protected], public ID
>> sip:[email protected], scheme Unknown, authorization
>>
>>
>>
>>
>> *Bono logs:*
>> 29-09-2015 16:31:33.996 UTC Call-Disconnected: CALL_ID=2010076236///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.008 UTC Call-Disconnected: CALL_ID=2010018950///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.008 UTC Call-Disconnected: CALL_ID=2010070984///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.008 UTC Call-Disconnected: CALL_ID=2010052484///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.013 UTC Call-Disconnected: CALL_ID=2010023276///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.013 UTC Call-Disconnected: CALL_ID=2010004256///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.014 UTC Call-Disconnected: CALL_ID=2010076066///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.014 UTC Call-Disconnected: CALL_ID=2010040994///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.020 UTC Call-Disconnected: CALL_ID=2010047262///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.021 UTC Call-Disconnected: CALL_ID=2010070408///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.024 UTC Call-Disconnected: CALL_ID=2010001186///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.030 UTC Call-Disconnected: CALL_ID=2010080220///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.037 UTC Call-Disconnected: CALL_ID=2010026340///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.049 UTC Call-Disconnected: CALL_ID=2010059662///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.051 UTC Call-Disconnected: CALL_ID=2010007964///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.059 UTC Call-Disconnected: CALL_ID=2010001412///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.084 UTC Call-Disconnected: CALL_ID=2010048846///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.085 UTC Call-Disconnected: CALL_ID=2010031554///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.092 UTC Call-Disconnected: CALL_ID=2010083556///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.092 UTC Call-Disconnected: CALL_ID=2010025210///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.110 UTC Call-Disconnected: CALL_ID=2010060732///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.112 UTC Call-Disconnected: CALL_ID=2010020690///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.120 UTC Call-Disconnected: CALL_ID=2010065600///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.123 UTC Call-Disconnected: CALL_ID=2010058752///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.129 UTC Call-Disconnected: CALL_ID=2010070682///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.129 UTC Call-Disconnected: CALL_ID=2010046438///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.142 UTC Call-Disconnected: CALL_ID=2010049994///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.144 UTC Call-Disconnected: CALL_ID=2010070470///
>> [email protected] REASON=403
>> 29-09-2015 16:31:34.146 UTC Call-Disconnected: CALL_ID=2010024050///
>> [email protected] REASON=403
>> 29-09-2015 16:31:41.598 UTC Call-Disconnected:
>> CALL_ID=17745eb3f9e50ecb8155e83aea81edae REASON=401
>> 29-09-2015 16:31:41.835 UTC Call-Disconnected:
>> CALL_ID=17745eb3f9e50ecb8155e83aea81edae REASON=401
>> 29-09-2015 16:31:42.136 UTC Call-Disconnected:
>> CALL_ID=17745eb3f9e50ecb8155e83aea81edae REASON=401
>> 29-09-2015 16:31:42.437 UTC Call-Disconnected:
>> CALL_ID=17745eb3f9e50ecb8155e83aea81edae REASON=401
>> 29-09-2015 16:31:45.312 UTC Call-Disconnected:
>> CALL_ID=88a59417552efe9582128c1c9d134862 REASON=401
>> 29-09-2015 16:31:45.320 UTC Call-Disconnected:
>> CALL_ID=88a59417552efe9582128c1c9d134862 REASON=401
>> 29-09-2015 16:31:45.327 UTC Call-Disconnected:
>> CALL_ID=88a59417552efe9582128c1c9d134862 REASON=401
>> 29-09-2015 16:31:45.334 UTC Call-Disconnected:
>> CALL_ID=88a59417552efe9582128c1c9d134862 REASON=401
>> 29-09-2015 16:31:51.173 UTC Call-Disconnected:
>> CALL_ID=a716eed59a862336ae71d2a329b77400 REASON=401
>> 29-09-2015 16:31:51.465 UTC Call-Disconnected:
>> CALL_ID=a716eed59a862336ae71d2a329b77400 REASON=401
>> 29-09-2015 16:31:51.766 UTC Call-Disconnected:
>> CALL_ID=a716eed59a862336ae71d2a329b77400 REASON=401
>>
>>
>>
>>
>> *Ellis logs:*
>> 29-09-2015 16:31:32.727 UTC DEBUG utils.py:94: Still expecting 0
>> callbacks
>> 29-09-2015 16:31:34.324 UTC DEBUG numbers.py:121: Number allocation API
>> call (PSTN = false)
>> 29-09-2015 16:31:34.650 UTC DEBUG numbers.py:134: Allocating a non-PSTN
>> number
>> 29-09-2015 16:31:34.653 UTC DEBUG numbers.py:148: Fetched
>> e7d0bf0c-e00a-41c8-8bc7-5377aa76dc97
>> 29-09-2015 16:31:35.426 UTC DEBUG numbers.py:160: Updated the
>> owner
>> 29-09-2015 16:31:35.427 UTC DEBUG numbers.py:130: SIP URI sip:6505550758
>> @my.local.domain
>> 29-09-2015 16:31:39.686 UTC DEBUG numbers.py:159: Populating other
>> servers...
>> 29-09-2015 16:31:39.686 UTC DEBUG numbers.py:170: About to create private
>> ID at Homestead
>> 29-09-2015 16:31:39.686 UTC WARNING homestead.py:279: Passing SIP
>> password in the clear over http
>> 29-09-2015 16:31:39.698 UTC WARNING homestead.py:279: Passing SIP
>> password in the clear over http
>> 29-09-2015 16:31:39.704 UTC WARNING homestead.py:279: Passing SIP
>> password in the clear over http
>> 29-09-2015 16:31:39.720 UTC DEBUG numbers.py:175: Created private ID at
>> Homestead
>> 29-09-2015 16:31:39.721 UTC WARNING homestead.py:279: Passing SIP
>> password in the clear over http
>> 29-09-2015 16:31:39.794 UTC DEBUG utils.py:84: OK HTTP response.
>> HTTPResponse(code=200,request_time=0.015350818634033203,buffer=<_io.BytesIO
>> object at
>> 0x7ff1a10d6f50>,_body=None,time_info={},request=<tornado.httpclient.HTTPRequest
>> object at 0x7ff1a1070d50>,effective_url='
>> http://hs.my.local.domain:8889/private/6505550758%40my.local.domain/associated_implicit_registration_sets/1ed1c2eb-745d-4d53-8be2-68116882268e',headers={'Date':
>> 'Tue, 29 Sep 2015 16:31:39 GMT', 'Content-Length': '0', 'Content-Type':
>> 'text/html; charset=UTF-8', 'Connection': 'close', 'Server': 'nginx/1.4.6
>> (Ubuntu)'},error=None)
>> 29-09-2015 16:31:39.794 UTC DEBUG utils.py:94: Still expecting 2
>> callbacks
>> 29-09-2015 16:31:39.803 UTC DEBUG utils.py:94: Still expecting 1
>> callbacks
>> 29-09-2015 16:31:39.823 UTC DEBUG utils.py:84: OK HTTP response.
>> HTTPResponse(code=200,request_time=0.030350208282470703,buffer=<_io.BytesIO
>> object at
>> 0x7ff1a10d6d10>,_body=None,time_info={},request=<tornado.httpclient.HTTPRequest
>> object at 0x7ff1a10705d0>,effective_url='
>> http://hs.my.local.domain:8889/irs/1ed1c2eb-745d-4d53-8be2-68116882268e/service_profiles/113003da-f6f3-4efa-8612-bcb4f5be908b/filter_criteria',headers={'Date':
>> 'Tue, 29 Sep 2015 16:31:39 GMT', 'Content-Length': '0', 'Content-Type':
>> 'text/html; charset=UTF-8', 'Connection': 'close', 'Server': 'nginx/1.4.6
>> (Ubuntu)'},error=None)
>> 29-09-2015 16:31:39.823 UTC DEBUG utils.py:92: All requests
>> successful.
>>
>>
>> Many thanks for your help!
>>
>>
>> PS: Let me know if the logs are not clear enough or if I should send them
>> as attachments.
>>
>>
>>
>>
>>
>>
>>
>
>
_______________________________________________
Clearwater mailing list
[email protected]
http://lists.projectclearwater.org/mailman/listinfo/clearwater_lists.projectclearwater.org
