cache.ram() with a high expire time, or Redis via the contrib module,
will make that possible.

Obviously the "BIG chunk of information" needs to be serialized
(written to cache) and deserialized (read from cache), so don't go too
far with "too big" (keep it under ~5 MB).



On Feb 6, 1:06 am, Bruno Rocha <[email protected]> wrote:
> Redis or MongoDB seems to be what you are looking for!
>
> On Sun, Feb 5, 2012 at 9:58 PM, Sebastian E. Ovide <
> [email protected]> wrote:
> > Hi All,
>
> > Is it possible to keep a BIG chunk of information in RAM
> > permanently (without reloading it for each new session/request) so that
> > different sessions (from X different users using the system at the same
> > time) can access (R/W) it concurrently? For the kind of processing that
> > needs to be done on the data, ALL of it must be in memory at the same time,
> > and due to its size, it is not practical to load/save it for each
> > request (too slow).
>
> > thanks
>
> > --
> > Sebastian E. Ovide
>
> --
>
> Bruno Rocha
> [http://rochacbruno.com.br]
