Not that I know of.  Google's bot only crawls webservers.  That's where 
the "painless" part comes in -- just copy the directory structure to a 
webserver that gets crawled now, FTP all of the .txt files over, create a 
script that builds an index page for each directory, and 
voila  --  searchable patch descriptions.  Then all you need to do is 
set up a process to FTP the new .txt files over every so often.
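For what it's worth, here is a rough sketch of that index-page script in 
Python.  The mirror path is just a placeholder, and it assumes the .txt 
files have already been copied under the webserver's document root:

    #!/usr/bin/env python
    # Sketch only: walk a local mirror of the patch descriptions and drop an
    # index.html into each directory linking to every .txt file, so a crawler
    # that reaches the directory page can follow through to the descriptions.
    import os

    MIRROR_ROOT = "/var/www/html/vista-patches"   # hypothetical document-root mirror

    for dirpath, dirnames, filenames in os.walk(MIRROR_ROOT):
        txt_files = sorted(f for f in filenames if f.lower().endswith(".txt"))
        if not txt_files:
            continue
        links = "\n".join('<li><a href="%s">%s</a></li>' % (f, f) for f in txt_files)
        page = ("<html><head><title>%s</title></head><body>\n<ul>\n%s\n</ul>\n</body></html>\n"
                % (os.path.basename(dirpath), links))
        with open(os.path.join(dirpath, "index.html"), "w") as out:
            out.write(page)

Run something like that from cron after each FTP copy and the new files 
get picked up in the indexes automatically.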

Of course, if these files were just copied to a public webserver at the same 
time they were put on the FTP server....  Better yet, just have a public 
webserver that can mount the FTP directories as a remote directory; then we 
can get everything via HTTP instead of FTP, and the search engines can 
crawl all they wish.
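Short of an actual mount, even the copying could be automated with a few 
lines of ftplib.  A sketch, assuming the descriptions sit as flat .txt 
files in one directory (both paths below are illustrative):

    #!/usr/bin/env python
    # Sketch only: pull any .txt patch descriptions we don't already have
    # from the FTP site into the webserver's mirror directory.
    import os
    from ftplib import FTP

    REMOTE_DIR = "/vista"                          # illustrative remote path
    LOCAL_DIR = "/var/www/html/vista-patches"      # illustrative local mirror

    ftp = FTP("ftp.va.gov")
    ftp.login()                                    # anonymous login
    ftp.cwd(REMOTE_DIR)
    for name in ftp.nlst():
        if not name.lower().endswith(".txt"):
            continue
        local_path = os.path.join(LOCAL_DIR, name)
        if os.path.exists(local_path):
            continue                               # already mirrored
        with open(local_path, "wb") as out:
            ftp.retrbinary("RETR " + name, out.write)
    ftp.quit()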


At 05:01 PM 6/17/2006, Nancy wrote:
>The descriptions for the last few years of patches are in txt format on the
>ftp.va.gov/vista site and all of the patches are available by purchasing the
>FOIA CDs (or maybe DVDs).  Can one get google to search an ftp site?
>
>On Saturday 17 June 2006 15:50, Dan wrote:
>Anyone know if the VistA patch descriptions are available via the web and
>thus are being indexed by the search engines?
>
>If you've ever needed to find that new API sent out by patch XYZ having a
>searchable index would be helpful.  Should be relatively painless to
>  implement.



