[ 
https://issues.apache.org/jira/browse/THRIFT-4591?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

James E. King III closed THRIFT-4591.
-------------------------------------

> lua client uses two write() calls per framed message send
> ---------------------------------------------------------
>
>                 Key: THRIFT-4591
>                 URL: https://issues.apache.org/jira/browse/THRIFT-4591
>             Project: Thrift
>          Issue Type: Improvement
>          Components: Lua - Library
>    Affects Versions: 0.11.0
>            Reporter: allen_lee
>            Assignee: James E. King III
>            Priority: Minor
>             Fix For: 0.12.0
>
>         Attachments: 9090.pcap, 9090_1.pcap
>
>   Original Estimate: 4h
>  Remaining Estimate: 4h
>
> 1) Implement a Thrift server in C++ using TNonblockingServer.
> 2) Implement a Thrift client with the Lua library, using the framed transport.
> 3) Calling a remote interface fails: the client prints "TTransportException:0: 
> Default (unknown)" and the server logs "TConnection::workSocket(): 
> THRIFT_EAGAIN (unavailable resources)".
> 4) Investigating with tcpdump, the capture in attachment 9090.pcap shows that 
> the framed message does not contain the frame size field, while the correct 
> capture in attachment 9090_1.pcap shows the framed message carrying 4 bytes 
> (00 00 00 25) before the protocol id field.
> 5) Digging into the fault, the root cause is a defect in the 
> TFramedTransport:flush function in TFramedTransport.lua. The original 
> implementation is:
> -----
> function TFramedTransport:flush()
>   if self.doWrite == false then
>     return self.trans:flush()
>   end
>   -- If the write fails we still want wBuf to be clear
>   local tmp = self.wBuf
>   self.wBuf = ''
>   local frame_len_buf = libluabpack.bpack("i", string.len(tmp))
>   self.trans:write(frame_len_buf)
>   self.trans:write(tmp)
>   self.trans:flush()
> end
> -----
> which sends the frame size field and the message content in two independent 
> write() calls.
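>
> A minimal sketch of a single-write flush (reusing the same libluabpack 
> helper; illustrative only, not necessarily the exact committed patch):
> -----
> function TFramedTransport:flush()
>   if self.doWrite == false then
>     return self.trans:flush()
>   end
>   -- If the write fails we still want wBuf to be clear
>   local tmp = self.wBuf
>   self.wBuf = ''
>   local frame_len_buf = libluabpack.bpack("i", string.len(tmp))
>   -- Concatenate the 4-byte frame size and the payload so the framed
>   -- message goes out in a single write() call on the underlying transport.
>   self.trans:write(frame_len_buf .. tmp)
>   self.trans:flush()
> end
> -----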



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)