Hey Robert,

Too often I find myself running a searchSomething() query against the database
and storing the results in a memory cache. This works fine for small sets,
where I can fetch the entire result with minimal overhead, but I may get into
trouble if the result set is too big. So I have to add pagination
support.

I had to deal with the same situation on a similar project.

Here's an overview of the approach in pseudo-code:
<code>

If (We do not have data in the cache); Then
  Process the data as normal.
  Store the results in cache.
  Data is stored in $data.
Else
  Fetch the data from the cache and store in $data.
EndIf

If (The user exceeds the number of pages provided); Then
  If ( We have more data to fetch ); Then
    Fetch more data, append to $data and store that.
  Else
    Return an empty array.
  EndIf
EndIf

Pass the $data into Zend_Paginator to paginate the result set as it was first requested.

</code>
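To make that concrete, here is a minimal runnable sketch of the same flow, written in Python rather than PHP since the pseudo-code above is language-neutral. A plain dict stands in for Zend_Cache, a list slice stands in for Zend_Paginator, and every name here (process_data, fetch_page, per_page) is hypothetical; the "fetch more data and append" branch is omitted for brevity.

```python
import math

# Hypothetical in-memory cache standing in for Zend_Cache.
cache = {}

def process_data(query):
    """Stand-in for the expensive search; returns the full result set."""
    return [f"{query}-row-{i}" for i in range(25)]

def fetch_page(query, page, per_page=10):
    # If we do not have data in the cache, process it and store the results.
    if query not in cache:
        cache[query] = process_data(query)
    # Otherwise, fetch the data from the cache.
    data = cache[query]

    # If the user exceeds the number of pages provided, return an empty list.
    total_pages = math.ceil(len(data) / per_page)
    if page > total_pages:
        return []

    # Paginate the result set (this is the part Zend_Paginator would handle).
    start = (page - 1) * per_page
    return data[start:start + per_page]
```

So `fetch_page("foo", 1)` pays the full processing cost once, and every later page of the same search is served straight from the cache.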

Sure, the first pageload might be a bit slow because it must process all of the data at once, but every subsequent pageload, including those that are able to share the results of the same search, will be much quicker. Returning the results as an array allows you to pass that to a view, which can render HTML, XML or whatever other format you may need for your project.

For the implementation I know there is Zend_Paginator, and I will investigate
that and how to couple it with Zend_Cache. If I understand correctly, I'll have
to perform the query two times: once with a COUNT statement to get only the
total number of items, and a second time to retrieve the current (or selected)
"page" of records.

I think you might only have to make one query for the entire result set, and then use the count() of that.
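In other words (a language-neutral sketch in Python, with hypothetical names): one fetch gives you both the total and any page, so the separate COUNT query disappears.

```python
def paginate(results, page, per_page=10):
    """One query for the whole result set; the total comes from len()."""
    total = len(results)                  # replaces the separate COUNT query
    start = (page - 1) * per_page
    return total, results[start:start + per_page]
```

The trade-off is memory for round-trips: this only makes sense when the whole result set is small enough to cache, which is exactly the case the flow above is built around.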

Should $c->searchSomething($queryParams) return just the number of items
(integer) and have another method $c->fetchPageResult($i) to get results
(array of objects) of the "page" being fetched?

I would think something along the lines of $c->searchTable($queries) to fetch the data, and $c->paginate($results) to paginate the result set. However, this can all be done at once when you search for your terms.

As far as generating a cache key, what you could do is serialize the query object, and/or the parameters used to generate the query, then hash the serialized result (using something other than MD5, of course :P). That way, if the same parameters are used to search for data, the cached result is fetched instead of a fresh one.
If the cache expires, then the model will fetch a fresh result set.
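A sketch of that cache-key idea in Python: the parameters are serialized deterministically (sorted keys, so the same search always produces the same key) and hashed with SHA-256 instead of MD5, per the suggestion above. The function name is hypothetical.

```python
import hashlib
import json

def cache_key(params):
    """Serialize the query parameters deterministically, then hash them."""
    serialized = json.dumps(params, sort_keys=True)
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()
```

Because the keys are sorted before hashing, `{"page": 2, "q": "foo"}` and `{"q": "foo", "page": 2}` map to the same cache entry.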

Hope this helps,
-Kizano
//-----
Information Security
eMail: [email protected]
http://www.markizano.net/
