On Oct 16, 19:18, Jeremy Evans <[email protected]> wrote:
> On Oct 16, 10:07 am, Florent <[email protected]> wrote:
>
> > I saw that Sequel has a caching feature, but it's using memcache-
> > client.
> > I prefer the memcached version (http://github.com/fauna/memcached),
> > which is known to be a lot faster. The problem is that the memcached
> > API is a bit different from the memcache-client API.
>
> In that case, I recommend writing a proxy object that translates the
> former API to the latter and using that with the existing plugin, or
> copying the existing plugin and modifying it to use the other API.

The proxy solution sounds great but seems too difficult for my Ruby
skills... :(
So I will create a new plugin.
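For reference, the proxy idea can be sketched in a few lines. This is only
a rough sketch, not tested against the real gems: it assumes the wrapped
client behaves like the memcached gem (get raises an exception such as
Memcached::NotFound on a miss), and translates that to the nil-on-miss
style that memcache-client (and thus Sequel's caching plugin) expects.
The miss exception class is injected so the proxy itself has no hard
dependency on either gem.

```ruby
# Translates memcached's exception-on-miss API into memcache-client's
# nil-on-miss API, so the existing Sequel caching plugin can use it.
class MemcachedProxy
  # cache:      e.g. Memcached.new('localhost:11211')
  # miss_error: e.g. Memcached::NotFound
  def initialize(cache, miss_error)
    @cache = cache
    @miss_error = miss_error
  end

  # The caching plugin expects nil on a miss, not an exception.
  def get(key)
    @cache.get(key)
  rescue @miss_error
    nil
  end

  def set(key, value, ttl = 0)
    @cache.set(key, value, ttl)
  end

  def delete(key)
    @cache.delete(key)
  rescue @miss_error
    nil  # deleting a missing key is a no-op, as with memcache-client
  end
end
```

If this works, it could presumably be plugged in as
`Album.plugin :caching, MemcachedProxy.new(Memcached.new('localhost:11211'), Memcached::NotFound)`,
though I haven't tried that.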

>
> > I have an application that inserts/updates a lot, and if I clear the
> > cache on every insert/update, there will be too many cache misses on
> > finds for it to be useful.
>
> That makes sense.  The caching plugin was designed with a mostly read
> mindset, which is where you benefit the most from caching.

The current plugin works this way:
- Find request: look up in memcached; if present, return the cached
  value, otherwise return the DB value and store it in the cache.
- Insert/Update request: update the DB and clear the cache.

I think that updating the cache on insert/update requests is about as
expensive as clearing it, and would suit both needs (heavy-read and
heavy-insert/update use cases). Or maybe add an option to the plugin
to choose between the two behaviours (clear the cache on update, or
set the cache on update).
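To make the two behaviours concrete, here is a toy sketch (plain hashes
standing in for memcached and the DB, nothing to do with the actual
plugin code): :clear invalidates on write, so the next find misses and
goes to the DB; :write_through stores the fresh row, which keeps reads
warm under heavy inserts/updates.

```ruby
# Toy model of the two cache-invalidation strategies discussed above.
class CachedFinder
  def initialize(db, cache, strategy = :clear)
    @db, @cache, @strategy = db, cache, strategy
  end

  def find(id)
    @cache[id] ||= @db[id]            # read-through lookup
  end

  def save(id, row)
    @db[id] = row
    if @strategy == :write_through
      @cache[id] = row                # refresh the cached copy
    else
      @cache.delete(id)               # clear; next find repopulates
    end
  end
end
```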

>
> > I've noticed that I can only cache a Sequel model _before_ the save
> > or update call; otherwise it can't be cached: "TypeError: can't dump
> > hash with default proc"
>
> Hmm, weird.  Can you send a backtrace for this error?

I have tested with two memcached clients to be sure it was not my
memcached client's fault.
Here are the two related gists: http://gist.github.com/213348

I think it's normal that memcached can't marshal a proc inside a Ruby
object, but what I don't understand is where these procs come from
between before and after the save call.
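For what it's worth, the error itself is easy to reproduce outside Sequel:
Marshal refuses any Hash carrying a default proc, while a plain copy of
the same pairs dumps fine. (This shows why memcached rejects the object,
not where Sequel picks up the proc during save.)

```ruby
h = Hash.new { |hash, key| hash[key] = [] }  # hash with a default proc

begin
  Marshal.dump(h)
rescue TypeError => e
  puts e.message          # "can't dump hash with default proc"
end

plain = {}.merge(h)       # same pairs, but no default proc
Marshal.dump(plain)       # dumps without error
```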

>
> > It's not likely a problem to cache the model before save/update (the
> > values are the same), but when you cache a new model (not yet created
> > in the database), you get it back from the cache and it's still
> > considered "new" from the model's point of view.
>
> That's one of the reasons the current plugin works the way it does.

In my code, I need to know whether a Sequel model is new or not, so I
don't cache it when it's new (but that causes a cache miss on the next
find request). It's not very convenient, but it works.
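The guard itself is just a one-line check on new? (which Sequel models do
provide); cache_unless_new below is a hypothetical helper I use to
illustrate it, not plugin code, and the hash-like cache is a stand-in for
memcached.

```ruby
# Only cache rows that already exist in the database, so a cached copy
# can never come back looking "new". Assumes the model responds to new?,
# as Sequel models do.
def cache_unless_new(cache, key, model)
  cache[key] = model unless model.new?  # skip unsaved records
  model
end
```

The cost is exactly the one described above: a freshly created record is
never cached, so the first find after a create always hits the DB.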

Florent
