Hi - here's a question: how can you create an automated search index that
trawls other sites (e.g. ASP and CGI template sites) on a regular basis and
updates a local search index?

e.g. create a local Verity collection - create a schedule that runs a page -
write a page with CF code that re-indexes the collection - the search on the
site gets updated. EASY

BUT

Where to hold the data?
1.) Create copies of the ASP pages (e.g. search.asp?ver=xxx&doobie=dah): go
to the other site, find the page, save it as an .htm file with the file name
search.asp?ver=xxx&doobie=dah.htm in a local directory, and index that, I
guess - but I need to automate the process.
2.) Create a database of HTTP refs that holds the links, then create a
CFHTTP doo-dah that GETs the pages from the refs in the database and
downloads them - then index THAT - but will this work?
3.) There must be some WDDX way of doing this?
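Option 2 above could be sketched like this: CFHTTP fetches each stored link, CFFILE writes the response to disk, and CFINDEX re-indexes the directory. Datasource, table, column names, and paths are all assumptions - an untested outline, not a working spider:

```cfml
<!--- fetch each stored link, save the response to disk, then re-index --->
<cfquery name="getLinks" datasource="spider">
    SELECT  linkid, url
    FROM    links
</cfquery>

<cfloop query="getLinks">
    <!--- GET the remote page; resolveurl fixes relative links
          so the saved copy still renders --->
    <cfhttp url="#getLinks.url#" method="GET" resolveurl="yes"></cfhttp>

    <!--- write the page under a filesystem-safe name based on the link id
          (query strings like ?ver=xxx aren't legal in Windows file names) --->
    <cffile action="write"
            file="c:\inetpub\wwwroot\spider\pages\page#getLinks.linkid#.htm"
            output="#CFHTTP.FileContent#">
</cfloop>

<!--- re-index the collection over the freshly saved files --->
<cfindex collection="remotepages"
         action="refresh"
         type="path"
         key="c:\inetpub\wwwroot\spider\pages\"
         extensions=".htm"
         recurse="no">
```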

....help...

James

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Structure your ColdFusion code with Fusebox. Get the official book at 
http://www.fusionauthority.com/bkinfo.cfm

Archives: http://www.mail-archive.com/[email protected]/
Unsubscribe: http://www.houseoffusion.com/index.cfm?sidebar=lists
