Hi, Kevin

I can import fastbinary.

In [4]: from cadaapi.stock.ttypes import fastbinary

In [5]: fastbinary
Out[5]: <module 'thrift.protocol.fastbinary' from
'/Users/vincent/cada/lib/python2.6/site-packages/thrift/protocol/fastbinary.so'>
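For what it's worth, the .so suffix in that path is itself the telltale: the module resolved to a compiled extension rather than a pure-Python fallback. A generic check for that (illustrated here on the stdlib's optional C accelerator for json, so it runs even without thrift installed; `is_compiled` is just an illustrative helper name, and it assumes a modern Python with `__spec__`):

```python
import importlib

def is_compiled(module_name):
    """Return True if the named module resolved to a C extension
    (a shared object / .pyd file, or a module built into the interpreter)."""
    mod = importlib.import_module(module_name)
    origin = getattr(getattr(mod, "__spec__", None), "origin", None)
    return origin == "built-in" or (origin or "").endswith((".so", ".pyd"))

# The json package is pure Python, but its _json accelerator is compiled.
print(is_compiled("json"), is_compiled("_json"))
```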


On Wed, Jan 9, 2013 at 8:21 AM, Kevin Clark <[email protected]> wrote:

> Vincent - are you sure the fastbinary extension is being used? In the
> python shell:
>
> from cadaapi.stock.ttypes import fastbinary
>
> And check to see if 'fastbinary' is None or not. The lib tries to
> optionally include the fairly fast C bindings. Without them, it's
> going to be an order of magnitude slower. 200k shouldn't matter - I've
> pushed megs through the ruby implementation without problems, and the
> ruby C extension is based on the python version.
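As far as I can tell, a successful import alone doesn't prove the C path is taken: the generated code only calls into fastbinary when the accelerated binary protocol is in use. A minimal sketch of forcing that, assuming the Apache Thrift Python package is installed and `obj` is a generated struct instance:

```python
def serialize_accelerated(obj):
    # Sketch: route serialization through the accelerated binary protocol,
    # which is the code path that actually calls into fastbinary.
    from thrift.TSerialization import serialize
    from thrift.protocol.TBinaryProtocol import TBinaryProtocolAcceleratedFactory
    return serialize(obj, protocol_factory=TBinaryProtocolAcceleratedFactory())
```

(As I understand it, the serialize() helper wraps the struct in a memory-buffer transport, which satisfies the buffered-transport requirement of the accelerated protocol.)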
>
> On Tue, Jan 8, 2013 at 3:34 PM, Henrique Mendonça <[email protected]>
> wrote:
> > Hi Vincent,
> >
> > Do you have any reason to use test_json(rawdata) instead of
> > test_json(data)?
> > It looks like the object is a lot smaller on rawdata...
> > Anyway, you're trying to compare a built-in serialisation function with
> > the binary implementation, and in this case I would expect some loss of
> > performance.
> > You can also try the compact protocol, as it's also supposed to be a
> > little faster, and/or the new JSON protocol. When you have a benchmark
> > on that, please report back to us.
> >
> > Regards,
> > Henrique
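To make the compact-protocol suggestion concrete, a hedged sketch (again assuming the Apache Thrift Python package and a generated struct instance `obj`):

```python
def serialize_compact(obj):
    # Sketch: select the compact protocol via TSerialization's
    # protocol_factory argument instead of the default binary protocol.
    from thrift.TSerialization import serialize
    from thrift.protocol.TCompactProtocol import TCompactProtocolFactory
    return serialize(obj, protocol_factory=TCompactProtocolFactory())
```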
> >
> > On 6 January 2013 09:43, Vincent <[email protected]> wrote:
> >
> >> Hi, all
> >>
> >> I'm using Thrift in Python, and I found that serializing structured
> >> data with thrift-python is very slow.
> >>
> >> I wrote a serialization test comparing Thrift and JSON.
> >>
> >> *testing thrift definition:*
> >> https://gist.github.com/4465825
> >> https://gist.github.com/4465826
> >>
> >> *python testing code:*
> >> https://gist.github.com/4465830
> >>
> >> *testing data:*
> >> https://gist.github.com/4465834
> >>
> >> *testing results:*
> >> https://gist.github.com/4465853
> >>
> >>
> >> testing results:
> >>
> >> > Test thrift
> >> > start: 1357457502.17
> >> >       File length: 796500
> >> >       File length: 796500
> >> >       File length: 796500
> >> >       File length: 796500
> >> >       File length: 796500
> >> >       File length: 796500
> >> >       File length: 796500
> >> >       File length: 796500
> >> >       File length: 796500
> >> >       File length: 796500
> >> > end: 1357457509.93
> >> > elapse: 7.7634768486
> >> >
> >> > Test json
> >> > start: 1357457509.93
> >> >       File length: 217252
> >> >       File length: 217252
> >> >       File length: 217252
> >> >       File length: 217252
> >> >       File length: 217252
> >> >       File length: 217252
> >> >       File length: 217252
> >> >       File length: 217252
> >> >       File length: 217252
> >> >       File length: 217252
> >> > end: 1357457510.01
> >> > elapse: 0.0743980407715
> >> >
> >> Given the results above, I can't accept Thrift serialization being
> >> this slow.
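For reference, a self-contained sketch of the kind of timing harness used above (stdlib only; the dict here is only a stand-in for the real test payload, which lives in the gists):

```python
import json
import timeit

# Stand-in payload; the actual test data is in the gists linked above.
data = {"quotes": [{"id": i, "price": i * 0.5, "sym": "S%04d" % i}
                   for i in range(1000)]}

def bench(fn, iterations=10):
    # Time `iterations` calls of fn and return the total seconds elapsed.
    return timeit.timeit(fn, number=iterations)

json_elapsed = bench(lambda: json.dumps(data))
print("json x10: %.6fs" % json_elapsed)
```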
> >>
> >> And my questions are:
> >>
> >>    - Am I using Thrift incorrectly?
> >>    - Or was Thrift not designed to transport large data (200k; is
> >>      that big?)
> >>
> >>
> >> Thanks.
> >> --
> >> Vincent.Wen
> >>
>
>
>
> --
> Kevin Clark
> http://glu.ttono.us
>



-- 
Vincent.Wen
