Thanks, Larry, that makes me feel a bit better. The suggestion is to keep a separate table of applicable nodes, generate a random number from the number of records, and get that record. Your explanation seems to indicate that one is simply spreading the "overhead" out over time, and perhaps increasing it in total. Given that the block can be refreshed on every page load, and a new node is created relatively infrequently, I can also see that there may be some merit when the number of nodes gets fairly large. Maybe something like "SELECT n.* FROM {quotes_random} r INNER JOIN {node} n ON n.nid = r.nid WHERE r.row_num = FLOOR(1 + RAND() * (SELECT MAX(row_num) FROM {quotes_random}))". (Sorry for the D6 version; it would take me a much longer time to construct in D7.)
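A rough D7 equivalent might look like the sketch below - just a guess at one workable approach, assuming the same {quotes_random} lookup table with a nid column and gap-free, 1-based row_num values. Picking the random row number in PHP first also avoids RAND() being re-evaluated for every row of the table.

// Hypothetical D7 sketch; {quotes_random} is assumed to hold a nid column
// and sequential row_num values maintained elsewhere.
$max = db_query('SELECT MAX(row_num) FROM {quotes_random}')->fetchField();
if ($max) {
  // Choose the row in PHP so the query only has to match a single value.
  $row = mt_rand(1, $max);
  $nid = db_query('SELECT nid FROM {quotes_random} WHERE row_num = :row',
    array(':row' => $row))->fetchField();
  if ($nid) {
    $node = node_load($nid);
  }
}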
Nancy

Injustice anywhere is a threat to justice everywhere. -- Dr. Martin L. King, Jr.

________________________________
From: Larry Garfield

I don't know that I've ever tried it with 12k nodes, so I can't say from experience. Randomization is a tricky subject. The common method, which Views uses and which the ->orderRandom() method of DBTNG also uses, boils down to adding a random number as a new column for each record, ordering by that, and then (in this case) discarding all but the first record. That involves a linear-growth cost to create the new column (a fixed additional cost per record) and then an integer ordering. Assuming the sorting function in the SQL database is sane (and if it isn't, you need a new database), that should be an n*log n algorithm. (That's as fast as a comparison-based sorting algorithm can get.)
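To illustrate the approach described above, the random-column method boils down to something like the following D7 sketch; the exact SQL that DBTNG's orderRandom() emits may differ, and the 'quotes' content type name is just a placeholder.

// The database attaches a random value to every candidate row, sorts on
// that value, and then discards everything but the first row.
$nid = db_select('node', 'n')
  ->fields('n', array('nid'))
  ->condition('n.status', 1)
  ->condition('n.type', 'quotes')
  ->orderRandom()
  ->range(0, 1)
  ->execute()
  ->fetchField();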