On Sat, Jun 28, 2003 at 07:29:49AM +0100, Upayavira wrote:
> On 28 Jun 2003 at 11:59, Jeff Turner wrote:
...
> Okay. For the CLI, the cli.xconf file is the equivalent of the web.xml
> and the user agent.
> 
> Now, normally the user agent requests a URI, and that's it. It is up to
> the user agent as to what to do with that URI.

Oh I see.  Yep, makes sense that the 'user agent' be the one who decides
whether or not to chase down links.
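
(Concretely, that's what cli.xconf expresses. A minimal sketch, from
memory, so take the element and attribute names as approximate:

<cocoon verbose="true" follow-links="true">
  <context-dir>.</context-dir>
  <dest-dir>dest</dest-dir>
  <broken-links type="xml" file="brokenlinks.xml"/>
  <uris>
    <!-- starting point(s); with follow-links="true" the CLI
         crawls outward from here -->
    <uri src="index.html"/>
  </uris>
</cocoon>

so whether links get chased at all is purely a user-agent setting.)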

> Are you saying that you want to put the configuration as to where pages
> should be placed into the sitemap?

No, that's the user agent's (CLI's) business.

...
> Yup. The primary aim was to reduce the number of page generations. And
> there was an element of hack here - particularly in the
> 'hard-wired'ness of the LinkGatherer.
...

> Or an alternative would be to ask: can you always do your link view
> with a single XSLT stage? If so:
> 
> <map:match pattern="page.html">
>   <map:generate src="page.xml"/>
>   <map:transform src="before-links.xsl"/>
>   <map:transform type="gather-links" src="identify-links.xsl"/>
>   <map:transform src="after-links.xsl"/>
>   <map:serialize/>
> </map:match>
> 
> So there's no hidden link gatherer. And you've got a single XSLT to
> filter, etc. Not specifying src="xxx" skips the XSLT stage. The output
> of this XSLT would be XML conforming to a predefined namespace.

Having eliminated the dont-follow-these-links use-case, I don't see a
use-case for XSLT transformations, so it simplifies to:

<map:transform type="gather-links"/>

It certainly fixes the hard-wiredness problem you mention above (that
the 'content' is not the same as the XML just before the serializer).
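
Spelled out, with the transformer declared alongside the other sitemap
components, the whole thing would look roughly like this (the
implementing class name and the stylesheet name are invented for
illustration):

<map:components>
  <map:transformers default="xslt">
    <!-- hypothetical class name for whatever implements the
         link-gathering SAX transformer -->
    <map:transformer name="gather-links"
        src="org.apache.cocoon.transformation.LinkGatheringTransformer"/>
  </map:transformers>
</map:components>

<map:pipelines>
  <map:pipeline>
    <map:match pattern="page.html">
      <map:generate src="page.xml"/>
      <!-- records links as the SAX events stream past,
           without altering them -->
      <map:transform type="gather-links"/>
      <map:transform src="page2html.xsl"/>
      <map:serialize/>
    </map:match>
  </map:pipeline>
</map:pipelines>

Since it's declared explicitly, it gathers links at exactly the point
in the pipeline where the 'content' is, rather than at wherever the
serializer happens to sit.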


--Jeff

> 
> Regards, Upayavira
