To follow up on this old thread I started: since I had no success integrating the available gzip plug-in into my scenario, I came up with a different implementation:
- It was based on the gzip/gunzip code provided by the ESI plug-in.
- The main plug-in was forked from null-transform and uses the same model.
- I couldn't resist making some parts more C++ friendly (vectors instead of strings, fewer mallocs, etc.).

In fact I have two plug-ins: one to first inflate the incoming response content, and another to deflate it again after the transformation pipeline has been applied. Both plug-ins support streaming (a rough sketch of the chunk-level deflate loop is attached after the quoted thread below).

Unlike the gzip plug-in, they don't prevent the origin from deflating the original content (by hiding the Accept-Encoding request field). I believe that decision is up to the origin: there may be cases where the proxy is far from the origin and applying gzip between the two is beneficial, so it is up to the origin to honor the field or not.

It is worth mentioning that I am using the plug-ins in a rather untraditional way: I dynamically open them from my transformation plug-in and include them in the transformation pipeline only when the content will in fact be subjected to additional transformations (a run-time decision); a sketch of that loading step is also attached at the very end. They also work when listed in plugin.config. Care was taken to apply each plug-in only once per request (similar to what was done in the original gzip plug-in).

A subset of this approach (only suboptimal gzip) can already be seen when accessing any page under https://search.yahoo.com/. The implementation described above should be available on the same domain soon.

More work needs to be done to make the plug-in broader (support deflate, etc.), more debuggable, and more configurable. This is the model I will follow for the next transformation plug-ins I write. I would be glad to open source the work so others can benefit from it, if there is interest.

thanks,

On Tue, Nov 26, 2013 at 9:25 AM, Daniel Morilha <[email protected]> wrote:

> I think I got it. I tried to use the gzip plug-in with a simple
> configuration as exemplified in the docs, but I wasn't able to stream the
> content any longer. It looks like the plug-in buffers everything so it can
> compress and flush at once. Is there a way to make the gzip plug-in
> compress and flush the chunks? That's a requirement for my use case.
>
> Thanks guys!
>
>
> On Tue, Nov 26, 2013 at 6:16 AM, Igor Galić <[email protected]> wrote:
>
>> > Also, I lied about the headers.. but, really, this should be a feature.
>>
>> That was what I was hinting at as well, and I think it should work in its
>> current form. Other plugins should be able to control compression through
>> the Accept-Encoding header.
>>
>> > See, ugh.. Otto, this
>> > https://trafficserver.readthedocs.org/en/latest/reference/plugins/gzip.en.html
>> > needs some polishing!
>>
>> I see :-) I'll fix that.
>>
>> Thanks!
>>
>> --
>> Igor Galić
>>
>> Tel: +43 (0) 664 886 22 883
>> Mail: [email protected]
>> URL: http://brainsware.org/
>> GPG: 8716 7A9F 989B ABD5 100F 4008 F266 55D6 2998 1641
>
> --
> Daniel Morilha ([email protected])

--
Daniel Morilha ([email protected])
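
[Sketch 1 - streaming deflate]

For anyone curious about the streaming part, here is roughly what the chunk-level deflate loop looks like. This is only a sketch, not the plug-in code itself: it assumes plain zlib, the class name and buffer sizes are made up, and the glue that feeds it from the transformation's input buffer is omitted. The point is simply that flushing each chunk with Z_SYNC_FLUSH lets compressed data leave the proxy immediately instead of being buffered until the end of the response (which is what the current gzip plug-in appears to do).

#include <zlib.h>
#include <cstring>
#include <stdexcept>
#include <vector>

// Compresses data chunk by chunk; each call to push() returns gzip bytes
// that can be written downstream right away.
class StreamingDeflater {
public:
  StreamingDeflater() {
    std::memset(&strm_, 0, sizeof(strm_));
    // windowBits 15 + 16 asks zlib for a gzip (not raw deflate) wrapper.
    if (deflateInit2(&strm_, Z_DEFAULT_COMPRESSION, Z_DEFLATED,
                     15 + 16, 8, Z_DEFAULT_STRATEGY) != Z_OK) {
      throw std::runtime_error("deflateInit2 failed");
    }
  }

  ~StreamingDeflater() { deflateEnd(&strm_); }

  // Compress one chunk; 'last' switches from Z_SYNC_FLUSH to Z_FINISH.
  std::vector<unsigned char> push(const char *data, size_t len, bool last) {
    std::vector<unsigned char> out;
    strm_.next_in  = reinterpret_cast<Bytef *>(const_cast<char *>(data));
    strm_.avail_in = static_cast<uInt>(len);
    unsigned char buf[16 * 1024];           // arbitrary scratch buffer
    const int flush = last ? Z_FINISH : Z_SYNC_FLUSH;
    do {
      strm_.next_out  = buf;
      strm_.avail_out = sizeof(buf);
      const int ret = deflate(&strm_, flush);
      if (ret == Z_STREAM_ERROR) {
        throw std::runtime_error("deflate failed");
      }
      out.insert(out.end(), buf, buf + (sizeof(buf) - strm_.avail_out));
    } while (strm_.avail_out == 0);         // loop only if buf filled up
    return out;
  }

private:
  z_stream strm_;
};

In the transform handler, each time upstream data becomes available you would push() it and write the returned bytes to the downstream buffer, so the output stays chunked end to end.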

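[Sketch 2 - run-time loading]

And this is roughly how the run-time decision to include the compress/decompress stages is wired up. Again just a sketch: the library path and the exported symbol name ("register_transform", and its signature) are hypothetical placeholders for illustration, not the real interface of the plug-ins.

#include <dlfcn.h>
#include <ts/ts.h>   // Apache Traffic Server plug-in API

// Hypothetical entry point each loadable transform is assumed to export:
// it attaches itself to the given transaction's response transform hook.
using register_transform_fn = void (*)(TSHttpTxn txnp);

static register_transform_fn
load_transform(const char *path, const char *symbol) {
  void *handle = dlopen(path, RTLD_NOW | RTLD_GLOBAL);
  if (handle == nullptr) {
    TSError("dlopen(%s) failed: %s", path, dlerror());
    return nullptr;
  }
  return reinterpret_cast<register_transform_fn>(dlsym(handle, symbol));
}

// Called only once we know the response will really be transformed.
static void
attach_compression(TSHttpTxn txnp) {
  // Path and symbol name are illustrative, not the actual ones.
  static register_transform_fn deflate_stage =
      load_transform("/usr/local/libexec/trafficserver/deflate_transform.so",
                     "register_transform");
  if (deflate_stage != nullptr) {
    deflate_stage(txnp);  // plug-in adds its own response transform hook
  }
}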