Hi list,
I have a question about parsing, or rather traversing (however you want
to call it), Lenya’s output.
The sitemap in question is publication-sitemap.xmap* — at the end of
<map:match pattern="*/**.html"> everything is serialized to XHTML.
If this is changed to, e.g., xml, the entire output is rendered as XML
instead, so this has to be the very last computation step before the
page is displayed.
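For reference, the tail of that match looks roughly like this (a sketch,
not the exact file — the generator/transformer steps before the
serializer vary by publication):

```xml
<map:match pattern="*/**.html">
  <!-- ... generator and transformer steps omitted ... -->
  <!-- the very last step: serialize the SAX stream as XHTML -->
  <map:serialize type="xhtml"/>
</map:match>
```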
What I would like to do is open a database connection and parse the
output. Depending on the outcome of the parsing, different values would
have to be stored in the database.
I guess there are at least two ways to do this [hopefully I’m not
entirely wrong already at this point ;)].
Firstly, I think I could use an XSL file together with a transform
statement. The XSL file would then be applied to the pipeline’s current
SAX events without changing them, but would write the specific database
entries whenever there is a match.
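One caveat with that first approach: plain XSLT 1.0 has no side effects,
so the database write itself would need either an extension function of
the XSLT processor or a separate Cocoon component such as the
SQLTransformer. A minimal identity-transform sketch that passes all SAX
events through unchanged and only marks where the matching logic would
go (the div/class="price" match and the ext:store function are
hypothetical examples, not part of Lenya):

```xml
<?xml version="1.0"?>
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">

  <!-- Identity template: copy every node through unchanged,
       so the pipeline output stays exactly as it was. -->
  <xsl:template match="@*|node()">
    <xsl:copy>
      <xsl:apply-templates select="@*|node()"/>
    </xsl:copy>
  </xsl:template>

  <!-- Hypothetical match for the elements you are interested in.
       The actual database write would go through an extension
       function, e.g. a Xalan Java extension bound to "ext":
       <xsl:value-of select="ext:store(string(.))"/> -->
  <xsl:template match="div[@class='price']">
    <xsl:copy>
      <xsl:apply-templates select="@*|node()"/>
    </xsl:copy>
  </xsl:template>

</xsl:stylesheet>
```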
Secondly, I guess working with SAX or DOM directly should be possible
too. That should also make it more comfortable to move within the
“document” (if moving up, down, and sideways in the tree is required),
but I have no clue how to set this up.
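For that second approach, DOM is the easier model when you need to move
up, down, and sideways in the tree. Below is a minimal standalone sketch
of the DOM side using only JAXP (available in J2SE 1.4); in Cocoon this
logic would live inside a custom transformer in the pipeline. The
div/class="price" markup, the findPrice name, and the value stored are
hypothetical; the JDBC write is only indicated by a comment:

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class OutputInspector {

    /** Returns the text of the first div with class="price", or null. */
    public static String findPrice(String xhtml) throws Exception {
        DocumentBuilder builder =
            DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc =
            builder.parse(new ByteArrayInputStream(xhtml.getBytes("UTF-8")));

        NodeList divs = doc.getElementsByTagName("div");
        for (int i = 0; i < divs.getLength(); i++) {
            Element div = (Element) divs.item(i);
            if ("price".equals(div.getAttribute("class"))) {
                // DOM lets you move freely from here if needed:
                // div.getParentNode(), div.getNextSibling(), ...
                String value = div.getFirstChild().getNodeValue();
                // Here a JDBC PreparedStatement would store the value
                // in the database.
                return value;
            }
        }
        return null;
    }

    public static void main(String[] args) throws Exception {
        String xhtml =
            "<html><body><div class=\"price\">42</div></body></html>";
        System.out.println("would store: " + findPrice(xhtml));
    }
}
```

(getFirstChild().getNodeValue() is used instead of the DOM Level 3
getTextContent(), since the latter is not available on Java 1.4.)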
I’d be glad if someone could tell me how something like the above could
be achieved. It would also be great to hear whether my two guesses (or
at least one of them) are on the right track.
*You may have to take my Lenya version into account (not the default
one; see below), but that shouldn’t make any difference for answering
my question, since it is only about the sitemap structure described
above.
Using:
Lenya: unizh version 1.0.1, based on Lenya 1.2.3-dev, Cocoon 2.1.7
Tomcat: version 5.0.28
Java: j2sdk-1.4.2_10
With best regards,
Tom
---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]