On 15.04.2004 12:17, Markus Strickler wrote:
Hi-
I'm trying to use Cocoon to generate a static website. After some tweaking
and reading through the list archives I was able to get the CLI and Ant task
to work. However, all the examples I could find either specify only single
URIs or use link crawling with a starting URI.
What I would like to do is just specify a directory and then recurse
through all its subdirectories to process the XML files inside. Is this possible? Has someone already done this?
The CLI works URI-based, not directory-based. If your URIs map onto directories, you can use the directory generator to create a link-crawling starting document on the fly.
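For example, a sitemap snippet along these lines could expose such a starting document - untested, and the match pattern, source directory, depth and the dir2links.xsl stylesheet are placeholders you'd adapt to your own setup (it also assumes the directory generator is declared among your sitemap's components):

<map:match pattern="crawl-start.html">
  <!-- the directory generator emits dir:directory/dir:file elements
       (namespace http://apache.org/cocoon/directory/2.0) for content/
       and its subdirectories down to the given depth -->
  <map:generate type="directory" src="content/">
    <map:parameter name="depth" value="10"/>
  </map:generate>
  <!-- dir2links.xsl would be your own stylesheet, turning each dir:file
       entry into an <a href="..."> link pointing at that page's URI -->
  <map:transform src="stylesheets/dir2links.xsl"/>
  <map:serialize type="html"/>
</map:match>

You'd then give crawl-start.html to the CLI as the starting URI and let link crawling take it from there.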
That would work, but the crawler would then also follow the links on every page it fetches, not just the links on the directory generator page. You might be able to prevent that further crawling with <include> or <exclude>, but I don't know.
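If they do turn out to work for this, I'd expect something roughly like the following in cli.xconf - I haven't checked the exact syntax or placement of these elements, so treat it as a sketch rather than a recipe, and the patterns and paths are made up:

<uris name="site" follow-links="true">
  <!-- start crawling from the generated directory listing -->
  <uri type="append" src="crawl-start.html" dest="build/site/"/>
  <!-- only generate pages whose URIs fall under docs/ ... -->
  <include pattern="docs/**"/>
  <!-- ...but skip anything under docs/private/ that the crawler finds -->
  <exclude pattern="docs/private/**"/>
</uris>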
Let me know if this is enough to meet your needs - maybe there's some more functionality required here.
Regards, Upayavira
