Thorsten Scherler wrote:
On Wed, 2007-08-22 at 12:22 +1000, David Crossley wrote:
Ross Gardler wrote:

...

That depends on follow-links="true|false", but yes: if you turn off
link following, then only the project.start-uri and the links listed
in the project.urifile are crawled.
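
For reference, the urifile is, as far as I understand it, just a
plain-text list of URIs to crawl, one per line. The entries below are
only illustrative:

    index.html
    docs/changes.html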

If this is the case then we could write a script to create this uriFile from the last-modified dates of the files, and so solve the problem of regenerating every page in the content object.
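
A minimal sketch of such a script in Python: it compares file
modification times against a stamp file and writes the changed pages
to a urifile. The directory layout, the stamp file and the
source-to-URI mapping are all assumptions here, not Forrest's actual
behaviour:

    from pathlib import Path

    CONTENT_DIR = Path("src/documentation/content/xdocs")  # hypothetical layout
    STAMP = Path("build/.last-build")                       # hypothetical stamp file
    URIFILE = Path("build/uris.txt")                        # point project.urifile here

    last_build = STAMP.stat().st_mtime if STAMP.exists() else 0.0

    uris = []
    for src in CONTENT_DIR.rglob("*.xml"):
        if src.stat().st_mtime > last_build:
            # Naive source-to-output mapping; the real mapping
            # would have to go through the sitemap.
            uris.append(src.relative_to(CONTENT_DIR).with_suffix(".html").as_posix())

    URIFILE.parent.mkdir(parents=True, exist_ok=True)
    URIFILE.write_text("\n".join(uris) + "\n")
    STAMP.touch()

The build would then run with follow-links turned off, so that only
the listed URIs are regenerated.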

...

The non-trivial task is the uriFile itself. Since a source does not have
to be backed by a file on the file system, one would need to crawl the
whole site and store the HTTP headers relevant to caching.

Good point. But at least it works for local files; I suspect most people build only from local files.
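
For remote sources, a sketch of what storing those caching headers
could look like (plain Python standard library; the cache dictionary
and its shape are made up for illustration):

    import urllib.request

    def source_changed(url, cache):
        # Issue a HEAD request and compare ETag/Last-Modified
        # against what we recorded on the previous build.
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req) as resp:
            current = (resp.headers.get("ETag"), resp.headers.get("Last-Modified"))
        changed = current != cache.get(url)
        cache[url] = current
        return changed

Servers that send neither header would always look "changed" here,
which at least fails safe by forcing a regeneration.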

Further, when adding or removing nodes in site.xml, some parts of the
site need to be rebuilt even if the underlying source has not
changed. In HTML the reason is, at the moment, the menu, which
reflects the navigation.

Yes, when any of the config files have changed (e.g. site.xml, locationmap.xml, forrest.properties* and *.xmap), the whole site should be regenerated.
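
That check is cheap to script as well; a sketch (searching recursively
from the project root is an assumption about the layout):

    import glob
    from pathlib import Path

    def configs_changed(last_build):
        # The file names are the ones listed above; a change to
        # any of them forces a full rebuild.
        patterns = ["site.xml", "locationmap.xml", "forrest.properties*", "*.xmap"]
        return any(
            Path(f).stat().st_mtime > last_build
            for pat in patterns
            for f in glob.glob("**/" + pat, recursive=True)
        )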

Anyway, it's just a thought. I'm not about to do this, but next time a user asks for this functionality, let's point them to this thread and ask them to have a bash at implementing it.

Ross