On Feb 2, 2005, at 8:10 AM, Tod Harter wrote:

It really doesn't matter WHERE in the pipeline after the first step XSP is; the caching issue is that if the input to XSP changes, it is going to force a recompile of the generated Perl code, which is SLOW. When your XSP is at the start of the pipeline, it is (most often) the same every time, so it only gets compiled on the first page hit to your site; but (in your example) XSP #2 is based on the output of XSP #1, which being dynamic means EVERY page hit results in regeneration of Perl code and then recompilation of that code. This would be true regardless of what kind of processor comes first in the pipeline: if XSP isn't first, it is processing dynamic input, and it's not really much good for that...

Yes but no. I feel that if I explain a little further how I am trying to use XSP, you or someone else might jump in and explain that what I want to do is fine, but that perhaps I am going about it the wrong way.


The alternatives have been pretty well discussed as well. In your example you
could redesign your pipeline as follows:


XSP -> XSL -> XSL -> XSL -> output

and have XSL #1 merge in content that you now produce in XSP #2 via the
'document()' function. Or you can use XInclude within XSP #1, or other taglib-based
including functions. Or you could perhaps accomplish what you want by
writing a custom provider module.

The pipeline is certainly up for re-design. I have tried to keep everything dynamic at the top of the pipeline, and have therefore had to write a fair amount of XSL to accomplish things that would be far easier in Perl. I have a custom Content Provider which ISA File Provider; the only difference is an override of get_styles() that adds three stages of XSL to the end of the pipeline. I also have a custom Style Provider which serves up dynamic XSL to these three stages, based on information built by the taglib call in the initial content.
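
For what it's worth, the get_styles() override is roughly the following sketch (the package name and the stylesheet hrefs are placeholders, and I'm assuming the File provider's convention of returning an arrayref of { type => ..., href => ... } hashes):

package MyApp::Provider::Content;
# Sketch only: subclass the stock File provider and append three
# XSL stages to whatever styles it would normally return.
use strict;
use Apache::AxKit::Provider::File;
use vars qw(@ISA);
@ISA = ('Apache::AxKit::Provider::File');

sub get_styles {
    my $self = shift;

    # Let the File provider work out the styles from the
    # xml-stylesheet PIs first...
    my $styles = $self->SUPER::get_styles(@_);

    # ...then tack the three extra XSL stages onto the end
    # (in my setup these end up being served by the custom
    # Style Provider).
    push @$styles,
        { type => 'text/xsl', href => 'stage1.xsl' },
        { type => 'text/xsl', href => 'stage2.xsl' },
        { type => 'text/xsl', href => 'stage3.xsl' };

    return $styles;
}

1;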


The problem is when the XML generated by the XSP, before it gets passed to XSL processing, contains tags that should be handled by a taglib. If I am limited to having a single XSP stage at the beginning, then I would have to build a static page containing all the tags handled by their corresponding taglibs. What I am trying to do is use XSP to build a dynamic page that is then passed once again through XSP.

So far, I have a .xsp file like this:

<?xml-stylesheet href="NULL" type="application/x-xsp"?>
<xsp:page
    xmlns:xsp="http://www.apache.org/1999/XSP/Core"
    xmlns:param="http://axkit.org/NS/xsp/param/v1"
    xmlns:taglib1="http://www.neverintheoffice.com/xsp/taglib1/v1"
    >
    <xsp:content>
          <taglib1:build>
                <taglib1:action><param:action /></taglib1:action>
          </taglib1:build>
    </xsp:content>
</xsp:page>

When run against the XSP engine, taglib1 outputs:

<taglib1>
        <output>
                <taglib2:build>
                        <taglib2:action>...</taglib2:action>
                </taglib2:build>
        </output>
</taglib1>

For the XML to be complete, it has to be re-run through the XSP engine, so that the taglib2 elements can be handled by their taglib. Can the output be wrapped in an appropriate <xsp:page />? Would it need the appropriate processing instruction, or is it too late for that?

In my mind, it is not a matter of whether the Perl code behind the taglib needs to be re-compiled, as it's not changing. It is more a matter of how to structure the pipeline so that the XML can undergo multiple XML transforms via taglibs.

Should I be investigating SAX-based taglibs?
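
To make the question concrete, this is the sort of thing I have in mind: a plain XML::SAX::Base filter sitting downstream of the first XSP stage, picking the taglib2 elements out of the stream and replacing them with their expanded output. (Just a sketch, not wired into AxKit's taglib machinery; the namespace, element names and the expand() helper are hypothetical.)

package MyApp::Taglib2Filter;
# Sketch of a SAX filter that handles taglib2 elements produced
# by the first XSP stage.
use strict;
use XML::SAX::Base;
use vars qw(@ISA);
@ISA = ('XML::SAX::Base');

my $NS = 'http://www.neverintheoffice.com/xsp/taglib2/v1';

sub start_element {
    my ($self, $el) = @_;
    if (($el->{NamespaceURI} || '') eq $NS && $el->{LocalName} eq 'build') {
        # Start swallowing the taglib2:build subtree rather than
        # forwarding it downstream.
        $self->{_in_build} = 1;
        $self->{_action}   = '';
        return;
    }
    return $self->SUPER::start_element($el) unless $self->{_in_build};
}

sub characters {
    my ($self, $chars) = @_;
    if ($self->{_in_build}) {
        $self->{_action} .= $chars->{Data};
        return;
    }
    return $self->SUPER::characters($chars);
}

sub end_element {
    my ($self, $el) = @_;
    if (($el->{NamespaceURI} || '') eq $NS && $el->{LocalName} eq 'build') {
        $self->{_in_build} = 0;
        # expand() stands in for whatever Perl the taglib would run;
        # its result replaces the swallowed taglib2:build element.
        $self->SUPER::characters({ Data => expand($self->{_action}) });
        return;
    }
    return $self->SUPER::end_element($el) unless $self->{_in_build};
}

sub expand { my $action = shift; return "expanded: $action" }   # placeholder

1;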

To be honest I think most people using AxKit have come to the conclusion that
XSP (and other similar language

You must have gone into a tunnel :)

/dave

