List: roxen.lists.roxen.general
Subject: Re: [PATCH 14/17] New module: gzip-on-the-fly
Author: Martin Stjernholm <mast[at]roxen[dot]com>
Date: 21-01-2009

"Arjan van Staalduijnen" <<Arjan.van.Staalduijnen[at]rtl.nl>> wrote:

> It might depend on the kind or amount of data whether or not it is
> worthwhile compressing a dynamic answer, and on the definition of
> "dynamic".

For me "dynamic" means no caching at all possible. In normal cases,
that means essentially any page with rxml, since you pretty soon run
into a NOCACHE() when doing rxml. Of course, you speed freaks out
there are caching your rxml responses anyway.

Here are some assorted thoughts of mine on the subject:

o  Needless to say, compress only selected content types. I'd say
   text/* and a few more special cases. Marty made it configurable,
   which is nice. (A sketch of such a filter follows after the list.)

o  At the very least there should be an option to compress only
   responses that go into the protocol cache. That automatically rules
   out big responses and dynamic responses. This is suitable for
   servers that are CPU bound (not uncommon for us); other config
   options can exist to cater for network-bound cases.

o  Compression releases the interpreter lock, so the work can be
   parallelized. Just make sure not to use the backend thread for it,
   though.

o  It might be good to populate the cache with an uncompressed
   response immediately and then replace it with a compressed one
   later, when the on-the-fly compression is done. The compression
   delays might otherwise become a factor if the cache mutation rate
   is high, and not more than one thread should be devoted to the job
   (there's a sketch of this after the list).

o  A vary callback is not really a good way to go - we don't want to
   risk splitting the cache hits into uncompressed/compress/gzip/etc.
   variants. Better to choose one compression method and always use it
   (when compression is considered at all). Maybe one could add
   decompression to be able to send a cached response to a client that
   doesn't support the compression method (see the last sketch below),
   but otoh such clients might be so few that it's not really worth
   the bother.
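
To make the first point concrete, here's a minimal sketch of the
filtering such a module would do before bothering to compress. It's in
Python rather than Pike, purely to illustrate the idea; the type list,
size threshold and function name are made up and not taken from
Marty's patch:

   import gzip

   # Hypothetical configuration; the real module makes this configurable.
   COMPRESSIBLE_TYPES = ("text/", "application/xml", "application/javascript")
   MIN_SIZE = 1024      # don't bother compressing tiny bodies
   GZIP_LEVEL = 6

   def maybe_compress(content_type, body, accept_encoding):
       """Return (body, content_encoding); gzip the body only when it is a
       selected content type, large enough, and the client accepts gzip."""
       if "gzip" not in accept_encoding:
           return body, None
       if not content_type.startswith(COMPRESSIBLE_TYPES):
           return body, None
       if len(body) < MIN_SIZE:
           return body, None
       compressed = gzip.compress(body, compresslevel=GZIP_LEVEL)
       # A compressed copy that turns out larger isn't worth sending.
       if len(compressed) >= len(body):
           return body, None
       return compressed, "gzip"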
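
The "populate the cache uncompressed first" idea could look roughly
like this: the entry goes into the cache right away so hits are served
without waiting, and a single dedicated worker swaps in the compressed
copy when it's ready. Again Python, with a plain dict and lock
standing in for the protocol cache; all the names are made up:

   import gzip
   import queue
   import threading

   protocol_cache = {}             # stand-in for the protocol cache
   cache_lock = threading.Lock()
   work_queue = queue.Queue()

   def cache_response(key, headers, body):
       """Store the uncompressed entry immediately, then hand it to the
       compressor thread."""
       with cache_lock:
           protocol_cache[key] = (headers, body, None)   # None = no encoding
       work_queue.put((key, headers, body))

   def compressor_thread():
       """The single dedicated worker; never more than one, so compression
       can't eat all threads when the cache mutation rate is high."""
       while True:
           key, headers, body = work_queue.get()
           compressed = gzip.compress(body)
           with cache_lock:
               # The entry may have been replaced or evicted meanwhile;
               # only swap in the compressed copy if it's still the same.
               current = protocol_cache.get(key)
               if current and current[1] is body:
                   protocol_cache[key] = (headers, compressed, "gzip")

   threading.Thread(target=compressor_thread, daemon=True).start()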
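
And if only one compression method is ever stored, the fallback for a
client that doesn't accept it is simply to inflate the cached body
before sending - no Vary splitting of the cache needed. A sketch of
that decision, with the same caveats as above and gzip assumed as the
one method:

   import gzip

   def body_for_client(cached_body, cached_encoding, accept_encoding):
       """Given a cached body that may already be gzipped, return what to
       send to this client: the gzipped bytes if it accepts gzip,
       otherwise the inflated original."""
       if cached_encoding == "gzip" and "gzip" not in accept_encoding:
           return gzip.decompress(cached_body), None
       return cached_body, cached_encoding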