I've done something similar, except I cache the big query for 5 minutes.

In addition, I store the long list of IDs as a client variable (in a 
database).  The cache thus trades the need to create sublists for a 
database read AND speeds up the paging queries (since the cache is in 
memory).  Works well, but on a busy site I'm concerned I'll run up against 
the limit of 100 cached queries.
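Concretely, that arrangement looks roughly like this (the datasource, table
and variable names are placeholders for my own):

```cfml
<!--- The big query, cached in memory for 5 minutes. --->
<cfquery name="bigQ" datasource="myDSN"
         cachedwithin="#createTimeSpan(0,0,5,0)#">
  SELECT id FROM somewhere WHERE somethingComplex
</cfquery>

<!--- Keep the full ID list in a client variable
      (client storage is set to a database). --->
<cfset client.resultIDList = valueList(bigQ.id)>
```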

I also wonder about the speed of lists: both the time it takes to create 
them and the performance of the resulting WHERE ID IN (#ID_List#), compared 
to using a JOIN on a table that contains the IDs as a column.
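For comparison, the JOIN variant I have in mind would be something like
this (the keys table and its columns are hypothetical):

```cfml
<cfquery name="pageQ" datasource="myDSN">
  SELECT t.*
  FROM somewhere t
  INNER JOIN result_keys k ON k.id = t.id
  WHERE k.session_id = '#session.sessionID#'
</cfquery>
```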

Anyone?

best,  paul

At 10:14 AM 8/4/00 +0100, you wrote:
>Ok here's my take on the problem, and actually what I use.
>
>Notes: we are always reloading the same page, results.cfm (i.e. the next
>& prev links link right back to results.cfm). There are three
>variables which will get passed: startRow, maxRows & resultIDList, all
>of which I assume are undefined when I first enter the page.  startRow
>& maxRows are passed via the URL; resultIDList is passed in some other
>fashion; client.resultIDList, or whatever takes your fancy (passing
>all three in hidden form fields is quite nice).
>
>Enter page.
>if startRow & maxRows are undefined, give them some value.
>if resultIDList is undefined then...
>   Do the QUERY, but return only the ID values:
>     SELECT id FROM somewhere WHERE somethingComplex
>   create resultIDList as an empty list, then loop over the query
>populating the list:
>   <cfset resultIDList = "">
>   <cfloop query="aboveQuery">
>     <cfset resultIDList = listAppend(resultIDList, id)>
>   </cfloop>
>endif
>
>What we have done here is check whether a list already existed; if it
>didn't, we ran the query to get all the records which match our
>search, returning only the ID values.
>The next time we enter this page, the list will already have been
>defined, so the query will not run again (handy for those 200-record
>returns)
>
>Now you create a subset of that list using startRow and maxRows;
>I'll leave that exercise up to the cunning of the reader. (subsetList
>= from startRow to startRow+maxRows, do listGetAt etc. etc.)
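>For what it's worth, a minimal sketch of that sublist step might look
>like this (assumes startRow is 1-based):
>
>```cfml
><!--- Build the sublist of IDs for the current page. --->
><cfset subList = "">
><cfset endRow = min(startRow + maxRows - 1, listLen(resultIDList))>
><cfloop index="i" from="#startRow#" to="#endRow#">
>  <cfset subList = listAppend(subList, listGetAt(resultIDList, i))>
></cfloop>
>```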
>
>Finally we can pull all we really need from the database using our
>subList:
>SELECT * FROM somewhere WHERE id IN (#subList#)
>
>So what happened here?  Well, we hit the database once for the big hit
>of 200 records or so, and then again for the 20 or so records we
>display on this page.  But once we move on to the next page, we don't
>run that big-assed query again, just the smaller 20-record one.
>
>Of course we could cache the first query <CFQUERY NAME="bigassedQ"
>DATASOURCE="myData" CACHEDWITHIN="#CreateTimeSpan(0,6,0,0)#">, so
>rerunning it on each page didn't really give us a big database
>slowdown, but here's the payoff...
>
>You can use the above method to run a number of queries the first time
>to enter the page, and append the results to the list.  For example 3
>queries...
>
>SELECT id FROM table WHERE a=x AND b=y AND c=z   (exact match)
>   returns records 6 & 7
>   add results to subList
>
>SELECT id FROM table WHERE a=x AND (b=y OR c=z)
>   AND id NOT IN (#subList#)                        (more relaxed)
>   returns records 2,3 & 12
>   add results to subList
>
>SELECT id FROM table WHERE (a=x OR b=y OR c=z)
>   AND id NOT IN (#subList#)                        (very relaxed)
>   returns records 1,4,5 & 9
>   add results to subList
>
>I end up with the list subList = 6,7,2,3,12,1,4,5,9
>
>Notice the "AND id NOT IN (#subList#)", in the above case, any records
>found in the first query would also be found in the second one, unless
>we tell it not to find the records we already have IDs for.
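>A sketch of the first two passes in CFML (the datasource, table and
>column names are placeholders):
>
>```cfml
><!--- Pass 1: exact match. --->
><cfquery name="exact" datasource="myData">
>  SELECT id FROM table WHERE a = x AND b = y AND c = z
></cfquery>
><cfset subList = valueList(exact.id)>
>
><!--- Pass 2: more relaxed, excluding IDs we already have. --->
><cfquery name="relaxed" datasource="myData">
>  SELECT id FROM table WHERE a = x AND (b = y OR c = z)
>  <cfif listLen(subList)>AND id NOT IN (#subList#)</cfif>
></cfquery>
><cfif relaxed.recordCount>
>  <cfset subList = listAppend(subList, valueList(relaxed.id))>
></cfif>
>```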
>
>I can now use this list to base the rest of my search on.
>SELECT * FROM table WHERE id IN (#subList#)
>
>One caveat: the database will not necessarily return the records in
>the order the IDs appear in the list; without an ORDER BY the row
>order is undefined.  To keep the closest matches first and the loose
>matches at the end, you have to re-order the results yourself to
>follow the list.
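>In case your database does not honour the IN-list order (most don't
>guarantee it), one way to re-order in CF, assuming a "name" column to
>display:
>
>```cfml
><!--- Index each row number by its id, then output in list order. --->
><cfset rowById = structNew()>
><cfloop query="results">
>  <cfset rowById[results.id] = results.currentRow>
></cfloop>
><cfloop index="thisId" list="#subList#">
>  <cfoutput>#results.name[rowById[thisId]]#<br></cfoutput>
></cfloop>
>```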
>
>I suspect there are certain issues which may need looking at.  The
>exact way you pass the large results list from page to page will
>depend on your situation and the total number of records you're
>dealing with.  If you're using strings instead of ints for your key
>IDs, you'll need to address some single-quote issues.
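>For string IDs, listQualify() will wrap each list element in quotes,
>and preserveSingleQuotes() stops CF escaping those quotes inside the
>query:
>
>```cfml
><cfset quotedList = listQualify(subList, "'")>
><cfquery name="getRecords" datasource="myData">
>  SELECT * FROM table WHERE id IN (#preserveSingleQuotes(quotedList)#)
></cfquery>
>```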
>
>I also suspect you can gain something from using an array instead of a
>list, as this will possibly speed up the creation of a sub list.  But
>I haven't even looked into it.
>
>Hope this is of help to someone.
>
>Dan.
>
>
>
>
>
>------------------------------------------------------------------------------
>Archives: http://www.mail-archive.com/[email protected]/
>To Unsubscribe visit 
>http://www.houseoffusion.com/index.cfm?sidebar=lists&body=lists/cf_talk or 
>send a message to [EMAIL PROTECTED] with 'unsubscribe' in 
>the body.
