The reload takes about 5 seconds, but I don't think that is the cause,
because the nginx error log reports the connection being reset while
reading the response header from upstream. For example:

2013/12/06 17:49:46 [error] 14090#0: *63748476 recv() failed (104: Connection reset by peer) while reading response header from upstream, client: 115.210.50.98, server: service.mkey.163.com, request: "POST /WSszq1twyG/api/v3/fetch_messages HTTP/1.1", upstream: "http://127.0.0.1:4004/api/v3/fetch_messages", host: "service.mkey.163.com"
2013/12/06 17:49:46 [error] 14090#0: *63748453 recv() failed (104: Connection reset by peer) while reading response header from upstream, client: 221.194.31.132, server: service.mkey.163.com, request: "POST /WSszq1twyG/api/v3/check_for_client_update HTTP/1.1", upstream: "http://127.0.0.1:4004/api/v3/check_for_client_update", host: "service.mkey.163.com"
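
For reference, the nginx side of this looks roughly like the sketch
below. This is illustrative, not my exact config: the upstream URL in
the log suggests proxy_pass to the uWSGI http-socket, and the timeout
values here are placeholders.

    # minimal sketch, assuming proxy_pass to uWSGI's http-socket
    location / {
        proxy_pass http://127.0.0.1:4004;
        # per Roberto's point: if a graceful reload takes longer than
        # the connect timeout (~5s reload here), requests will fail
        proxy_connect_timeout 30s;
        proxy_read_timeout 60s;
    }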

On Fri, Dec 6, 2013 at 6:08 PM, Roberto De Ioris <[email protected]> wrote:
>
>> There is only connection reset error now.  No connection refused.
>
> How much time the reload takes ? If it is higher than
> uwsgi_connect_timeout in nginx you will have problems.
>
>>
>> On Fri, Dec 6, 2013 at 6:02 PM, Roberto De Ioris <[email protected]> wrote:
>>>
>>>> Here is my uwsgi reload log:
>>>>
>>>> ...gracefully killing workers...
>>>> Gracefully killing worker 2 (pid: 14095)...
>>>> Gracefully killing worker 1 (pid: 14094)...
>>>> Gracefully killing worker 4 (pid: 14097)...
>>>> Gracefully killing worker 8 (pid: 14101)...
>>>> Gracefully killing worker 5 (pid: 14098)...
>>>> Gracefully killing worker 7 (pid: 14100)...
>>>> Gracefully killing worker 12 (pid: 14105)...
>>>> Gracefully killing worker 3 (pid: 14096)...
>>>> Gracefully killing worker 19 (pid: 14112)...
>>>> Gracefully killing worker 15 (pid: 14108)...
>>>> Gracefully killing worker 6 (pid: 14099)...
>>>> Gracefully killing worker 9 (pid: 14102)...
>>>> Gracefully killing worker 20 (pid: 14113)...
>>>> Gracefully killing worker 14 (pid: 14107)...
>>>> Gracefully killing worker 17 (pid: 14110)...
>>>> Gracefully killing worker 13 (pid: 14106)...
>>>> Gracefully killing worker 23 (pid: 14116)...
>>>> Gracefully killing worker 22 (pid: 14115)...
>>>> Gracefully killing worker 21 (pid: 14114)...
>>>> Gracefully killing worker 26 (pid: 14119)...
>>>> Gracefully killing worker 25 (pid: 14118)...
>>>> Gracefully killing worker 27 (pid: 14120)...
>>>> Gracefully killing worker 32 (pid: 14125)...
>>>> Gracefully killing worker 16 (pid: 14109)...
>>>> Gracefully killing worker 33 (pid: 14126)...
>>>> Gracefully killing worker 35 (pid: 14128)...
>>>> Gracefully killing worker 34 (pid: 14127)...
>>>> Gracefully killing worker 29 (pid: 14122)...
>>>> Gracefully killing worker 18 (pid: 14111)...
>>>> Gracefully killing worker 36 (pid: 14129)...
>>>> Gracefully killing worker 24 (pid: 14117)...
>>>> Gracefully killing worker 46 (pid: 14139)...
>>>> Gracefully killing worker 42 (pid: 14135)...
>>>> Gracefully killing worker 31 (pid: 14124)...
>>>> Gracefully killing worker 50 (pid: 14143)...
>>>> Gracefully killing worker 39 (pid: 14132)...
>>>> Gracefully killing worker 47 (pid: 14140)...
>>>> Gracefully killing worker 28 (pid: 14121)...
>>>> Gracefully killing worker 52 (pid: 14145)...
>>>> Gracefully killing worker 40 (pid: 14133)...
>>>> Gracefully killing worker 51 (pid: 14144)...
>>>> Gracefully killing worker 37 (pid: 14130)...
>>>> Gracefully killing worker 56 (pid: 14149)...
>>>> Gracefully killing worker 58 (pid: 14151)...
>>>> Gracefully killing worker 55 (pid: 14148)...
>>>> Gracefully killing worker 45 (pid: 14138)...
>>>> Gracefully killing worker 43 (pid: 14136)...
>>>> Gracefully killing worker 59 (pid: 14152)...
>>>> Gracefully killing worker 49 (pid: 14142)...
>>>> Gracefully killing worker 61 (pid: 14154)...
>>>> Gracefully killing worker 44 (pid: 14137)...
>>>> Gracefully killing worker 10 (pid: 14103)...
>>>> Gracefully killing worker 53 (pid: 14146)...
>>>> Gracefully killing worker 48 (pid: 14141)...
>>>> Gracefully killing worker 54 (pid: 14147)...
>>>> Gracefully killing worker 63 (pid: 14156)...
>>>> Gracefully killing worker 11 (pid: 14104)...
>>>> Gracefully killing worker 57 (pid: 14150)...
>>>> Gracefully killing worker 38 (pid: 14131)...
>>>> Gracefully killing worker 30 (pid: 14123)...
>>>> Gracefully killing worker 41 (pid: 14134)...
>>>> Gracefully killing worker 62 (pid: 14155)...
>>>> Gracefully killing worker 60 (pid: 14153)...
>>>> Gracefully killing worker 64 (pid: 14157)...
>>>> worker 1 buried after 1 seconds
>>>> worker 2 buried after 1 seconds
>>>> worker 3 buried after 1 seconds
>>>> worker 4 buried after 1 seconds
>>>> worker 5 buried after 1 seconds
>>>> worker 6 buried after 1 seconds
>>>> worker 7 buried after 1 seconds
>>>> worker 8 buried after 1 seconds
>>>> worker 9 buried after 1 seconds
>>>> worker 10 buried after 1 seconds
>>>> worker 11 buried after 1 seconds
>>>> worker 12 buried after 1 seconds
>>>> worker 13 buried after 1 seconds
>>>> worker 14 buried after 1 seconds
>>>> worker 15 buried after 1 seconds
>>>> worker 16 buried after 1 seconds
>>>> worker 17 buried after 1 seconds
>>>> worker 18 buried after 1 seconds
>>>> worker 19 buried after 1 seconds
>>>> worker 20 buried after 1 seconds
>>>> worker 21 buried after 1 seconds
>>>> worker 22 buried after 1 seconds
>>>> w
>>>
>>> all is fine, are you sure after moving to http-socket the error is
>>> always
>>> "connection refused" in nginx ?
>>>
>>> --
>>> Roberto De Ioris
>>> http://unbit.it
>>> _______________________________________________
>>> uWSGI mailing list
>>> [email protected]
>>> http://lists.unbit.it/cgi-bin/mailman/listinfo/uwsgi
>>
>
>
> --
> Roberto De Ioris
> http://unbit.it
