Greg,

How about building a Searchable table with the record ID as the primary key and a text (or blob) field called SearchTerms? The first time it gets built, just concatenate the results from all 35 tables together and dump them into SearchTerms alongside the RecordID. If you have a lastUpdated field in your db for each record, then the next time you want to update SearchTerms, simply run a query that finds which records were updated in (say) the last 24 hours and run the 35-table join only for those records to update the Searchable table.
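A minimal sketch of that incremental update, in CFML. The datasource, table names, and columns (Records, Customers, Products, lastUpdated, etc.) are hypothetical stand-ins for your 35-table schema, and the date math assumes SQL Server syntax:

```cfm
<!--- Incremental pass: rebuild SearchTerms only for records touched
      in the last 24 hours. All names below are placeholders. --->
<cfquery name="qChanged" datasource="myDSN">
    SELECT  r.RecordID,
            /* concatenate the searchable text from the multi-table join */
            r.Title + ' ' + c.Notes + ' ' + p.Description AS AllTerms
    FROM    Records r
            INNER JOIN Customers c ON c.RecordID = r.RecordID
            INNER JOIN Products  p ON p.RecordID = r.RecordID
    WHERE   r.lastUpdated >= DATEADD(hh, -24, GETDATE())
</cfquery>

<cfloop query="qChanged">
    <cfquery datasource="myDSN">
        UPDATE  Searchable
        SET     SearchTerms = <cfqueryparam value="#AllTerms#"
                                  cfsqltype="cf_sql_longvarchar">
        WHERE   RecordID = <cfqueryparam value="#RecordID#"
                                  cfsqltype="cf_sql_integer">
    </cfquery>
</cfloop>
```

The first full build is the same idea without the WHERE clause, inserting rather than updating.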
Or is a new table out of the question?

HTH
George

On 12/7/05, Greg Johnson <[EMAIL PROTECTED]> wrote:
>
> Ok, I have a database with 35 tables in it. I need to index fields from
> each of the tables so that it can all be searched and return the record
> numbers to pull out all the details. But bringing back every record with
> 35 tables joined takes so long that things time out. So my plan was to
> query a table, add it to a collection, then do the next and add it to the
> same collection. IOW, each query appends to the body field of one
> collection with separate cfindex calls.
>
> However, each time I use an index statement, it erases what is currently
> there. Any ideas other than building the body in a variable or file and
> then indexing that?
>
> Thanks
> Greg

Message: http://www.houseoffusion.com/lists.cfm/link=i:4:226480
Archives: http://www.houseoffusion.com/cf_lists/threads.cfm/4
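On the overwriting in the quoted question: if each cfindex call runs with action="refresh", the collection is purged before reindexing, which would explain the erasing. Using action="update" adds or updates keyed records without clearing what is already there. A sketch, with hypothetical table list, datasource, and column names:

```cfm
<!--- Index each table's results into the same collection.
      action="update" appends/updates by key; action="refresh"
      would purge the collection first. Names are placeholders. --->
<cfloop list="orders,customers,products" index="tbl">
    <cfquery name="qIndex" datasource="myDSN">
        SELECT  RecordID, SearchText
        FROM    #tbl#
    </cfquery>
    <cfindex collection="siteSearch"
             action="update"
             type="custom"
             query="qIndex"
             key="RecordID"
             body="SearchText">
</cfloop>
```

That way each of the 35 queries contributes its records to the one collection without wiping out the earlier passes.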

