Re: maximum (practical) size of $r->notes

2000-11-01 Thread Todd Finney

In continuing my work on this template system, I've run 
into another problem.

Any one or more of the components can be a script, and the 
problem with the existing system that I'm trying to solve 
is catching redirects from them.  The order in which I'm 
doing things now is:

( 4 part handler )
Part 1: Get the template set id (int), title, and meta designations
        from the requested file.  Put this list in pnotes.
Part 2: Add to that list the header, footer, toolbar, and content
        files as designated in the template file.
Part 3: Run through the list of components, and execute anything that
        needs to be executed.  Capture the output and put it in pnotes.
Part 4: Check for redirects, and send one if it exists.  If not, print
        the template filled in with the output from the list entries.
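
For the curious, here's a rough sketch of how these stages hand the 
list around (the handler package name and pnotes key are illustrative 
only, and this assumes stacked PerlHandlers are enabled in the build):

    # httpd.conf, roughly:
    #   PerlHandler My::Meta My::Templates My::Exec My::Print

    package My::Meta;
    use Apache::Constants qw(OK);

    sub handler {
        my $r = shift;
        # Part 1: build the list and stash it for the later stages
        my @list = ('set_id', 'title', 'meta');   # placeholder values
        $r->pnotes(component_list => \@list);
        return OK;
    }
    1;

Each later stage pulls the same reference back out with 
$r->pnotes('component_list') and adds to it.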

I had previously been executing component scripts like 
this:

($file is an entry in the list from Parts 1 and 2.  All entries 
in that list are full pathnames.  $args is a query string of 
parameters set up previously in the handler.)

if ( -x $file ) {
    # translate the filesystem path back into a URI, tack on any args
    $file =~ s!^/www/html!!;
    $file .= '?' . $args if $args;

    # run it as a sub-request with the cgi-script handler;
    # run() sends the script's output straight to the client
    my $subr = $r->lookup_uri($file);
    $subr->handler('cgi-script');
    $subr->run();
    return;
}

This doesn't work especially well anymore, as $subr->run 
outputs to the browser.  I've looked through past list 
messages, and from what I've read there's no way to 
suppress that.

Is there a reason that no one has come up with a method for 
doing this?  It seems like a few people have asked about 
it; are we all going about things the wrong way?  I'm not a 
C guy, so I'd probably need a little guidance, but I'd be 
happy to work on a patch if no one else wants to.

Barring that, what's the best way to do what I need to 
do?  exec'ing the scripts (or one of its cousins) doesn't 
sound pretty, but I'm at a loss on what else to try.
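
One blunt possibility I can see is running the script as a child 
process and splitting the CGI headers off its output myself - a 
minimal sketch, ignoring the environment setup a real subrequest 
would do for me ($file and $args as above, other names invented):

    local $ENV{QUERY_STRING} = $args if $args;
    my $output = `$file`;       # run the CGI, capture its stdout

    # split the CGI header block from the body
    my ($headers, $body) = split /\r?\n\r?\n/, $output, 2;
    my ($redirect) = $headers =~ /^Location:\s*(\S+)/mi;

    $r->pnotes(redirect => $redirect) if $redirect;
    $r->pnotes(body => \$body);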

thanks,
Todd





At 12:29 AM 10/30/00, Todd Finney wrote:
This is a follow-up on a question that I asked a couple of 
months ago.  The subject was "executing a cgi from within 
a handler (templating redux)", dated 8/23/00.

The gist of the matter is that we need a handler which 
will serve html pages ('content files') inside of other 
html files ('template files'), while sticking other files 
and scripts ('component files') into the template at the 
same time.  Quasi-frames, if you will.  We've already 
covered why we didn't pick one of the readily-available 
packages to do this.

We've had a working handler that does _almost_ everything 
it needs to do.  When the user requests a page, it figures 
out which template to use (based on the page requested), 
which component set to use (based on the user and the 
page), rolls the whole thing together and sends it along.

The wrapper can handle both static files and scripts as 
components or content files, and works really well most of 
the time.  However, we've run into a problem when a page 
needs to redirect after execution, such as a login page.

The problem is that when a component or content file is a 
script, the server executes that script when it encounters 
it in the template, à la

- hey, the user wants foo.html
- the user is a member of group 'coders', and their component path is
  /www/htdocs/components/coders/
- foo.html wants template set 1
- go get /www/htdocs/components/coders/tmpl_1, and open it
- begin printing the template file to the browser.  As the file goes
  by, watch for [tags] containing insertion points.
- hey, there's [head], print or execute
  /www/htdocs/components/coders/head_1
- hey, there's [tool], print or execute
  /www/htdocs/components/coders/tool_1
- hey, there's [cont], print or execute foo.html
- hey, there's [foot], print or execute
  /www/htdocs/components/coders/foot_1
- finish printing /www/htdocs/components/coders/tmpl_1 and close it

If /www/htdocs/components/coders/tool_1 has a redirect 
call in it, it's too late for the browser to actually do 
anything about it.

I managed to corner Nathan in New York (thanks, Nathan!). 
He recommended a two-stage handler, one that processes the 
components and content, and another that actually handles 
the printing.  Using $r->notes, the second handler could 
be aware of what it needed to do before printing 
anything.  This is a really good idea, but it's turning 
out to be more difficult than I anticipated.

The only way I can think of doing this is adding a third 
handler, in the middle, that executes any scripts and 
stores the output somewhere.  Then it would check the 
output for a Location: header and set something like 
$notes->{'redirect'} if it finds anything.  The third 
handler would then check that value before printing 
anything.  If it's there, do it; if not, grab the output 
and the static files and print them to the user.
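
For the redirect case itself, I imagine the check in that last 
handler would look something like this (a sketch only; the note 
name is invented, and it assumes Apache::Constants is loaded):

    use Apache::Constants qw(OK REDIRECT);

    if (my $url = $r->notes('redirect')) {
        $r->header_out(Location => $url);
        return REDIRECT;
    }
    # otherwise fall through and print the filled-in template
    return OK;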

I'm concerned about putting large amounts of data into 
$r->notes.  Some of our script output can be pretty 
heavy.  If $r->notes can only take simple strings, how 
large of a simple string is it safe to put in it?  Is there 
a better way to do this?

Re: maximum (practical) size of $r->notes

2000-10-30 Thread Matthew Byng-Maddick

On Mon, 30 Oct 2000, Matt Sergeant wrote:
> On Mon, 30 Oct 2000, Todd Finney wrote:
> > I'm concerned about putting large amounts of data into
> > $r->notes.  Some of our script output can be pretty
> > heavy.  If $r->notes can only take simple strings, how
> > large of a simple string is it safe to put in it?  Is there
> > a better way to do this?
> AxKit uses the notes table to store interim strings for template
> processing. I've not yet heard a bug related to it, but then I'm not
> delivering massive files. I'd imagine it would probably be limited by
> available memory.

This is basically correct. The notes table is tied to Apache::Table, which
is itself an ap_table (with C accessors ap_table_get() and
ap_table_set()). Like everything else in Apache, it is based on the pools
system of memory management, which will quite happily allocate up to the
amount of memory you have, and will then throw it away at the end of the
request.
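
At the Perl level the table only holds plain strings; a minimal
illustration (the note name and value are invented):

    $r->notes(redirect => 'http://www.example.com/login');  # stores a copy
    my $url = $r->notes('redirect');                         # gets it back
    # a reference stored here would just be stringified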

However, apache itself calls malloc() but never calls free(), because once
it has allocated the memory, it manages it itself. Given that the notes
table is allocated from the request pool (such that it is thrown away at
the end of a request), the allocation always happens in the child and can
never be shared - so if you have large numbers of apache processes, then
this might (if you're using a lot of memory) be able to take the machine
down.

But the memory management system in apache is only limited by the amount
of memory that the system will let it allocate - which means that you
should be OK.

MBM

-- 
Matthew Byng-Maddick   Home: [EMAIL PROTECTED]  +44 20  8981 8633  (Home)
http://colondot.net/   Work: [EMAIL PROTECTED] +44 7956 613942  (Mobile)
"It's today!" said Piglet.
"My favourite day," said Pooh.





Re: maximum (practical) size of $r->notes

2000-10-30 Thread G.W. Haywood

Hi all,

On Mon, 30 Oct 2000, Matthew Byng-Maddick wrote:

> On Mon, 30 Oct 2000, Matt Sergeant wrote:
> > On Mon, 30 Oct 2000, Todd Finney wrote:
> > AxKit uses the notes table to store interim strings for template
> > processing. I've not yet heard a bug related to it, but then I'm not
> > delivering massive files. I'd imagine it would probably be limited by
> > available memory.
>
> This is basically correct. The notes table is tied to Apache::Table,

If it's a huge amount of data and you don't want to bloat your
processes, why not pass a tempfile name/pointer/handle in $r->notes
and write the data to a ramdisk?  I wouldn't think you'd lose much
in opening and closing the file.
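
A minimal sketch of the idea (the ramdisk path and note name are only
examples; use whatever tmpfs/ramdisk mount you actually have):

    my $tmp = "/mnt/ramdisk/component.$$." . time;
    open(TMP, ">$tmp") or die "can't write $tmp: $!";
    print TMP $output;
    close TMP;

    $r->notes(output_file => $tmp);              # pass just the name around
    $r->register_cleanup(sub { unlink $tmp });   # tidy up after the request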

73,
Ged.




RE: maximum (practical) size of $r->notes

2000-10-30 Thread Geoffrey Young



> -----Original Message-----
> From: G.W. Haywood [mailto:[EMAIL PROTECTED]]
> Sent: Monday, October 30, 2000 7:29 AM
> To: Matthew Byng-Maddick
> Cc: [EMAIL PROTECTED]
> Subject: Re: maximum (practical) size of $r->notes
>
>
> Hi all,
>
> On Mon, 30 Oct 2000, Matthew Byng-Maddick wrote:
>
> > On Mon, 30 Oct 2000, Matt Sergeant wrote:
> > > On Mon, 30 Oct 2000, Todd Finney wrote:
> > > AxKit uses the notes table to store interim strings for template
> > > processing. I've not yet heard a bug related to it, but then I'm not
> > > delivering massive files. I'd imagine it would probably be limited by
> > > available memory.
> >
> > This is basically correct. The notes table is tied to Apache::Table,
>
> If it's a huge amount of data and you don't want to bloat your
> processes, why not pass a tempfile name/pointer/handle in $r->notes
> and write the data to a ramdisk?  I wouldn't think you'd lose much
> in opening and closing the file.

or (easier) just place a reference to a variable containing your data in
pnotes instead of notes - that way a reference, and not the data, is passed
around.  the data has to exist somewhere, but now you only have one copy of
it...


pnotes is (IMHO) the single greatest, but least well known method in
mod_perl - don't leave home without it...
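
a quick sketch of the difference (names invented):

    # notes: strings only, so the whole thing gets copied in
    $r->notes(output => $big_output);

    # pnotes: any perl value, so you can pass a reference instead
    $r->pnotes(output => \$big_output);

    # and in a later handler
    my $ref = $r->pnotes('output');
    $r->print($$ref) if $ref;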

--Geoff
 
> 73,
> Ged.
 



RE: maximum (practical) size of $r->notes

2000-10-30 Thread Geoffrey Young



> -----Original Message-----
> From: G.W. Haywood [mailto:[EMAIL PROTECTED]]
> Sent: Monday, October 30, 2000 10:06 AM
> To: Geoffrey Young
> Cc: [EMAIL PROTECTED]
> Subject: RE: maximum (practical) size of $r->notes
>
>
> Hi all,
>
> On Mon, 30 Oct 2000, Geoffrey Young wrote:
>
> > > From: G.W. Haywood [mailto:[EMAIL PROTECTED]]
> > > If it's a huge amount of data and you don't want to bloat your
> > > processes, why not pass a tempfile name/pointer/handle in $r->notes
> >
> > or (easier) just place a reference to a variable containing your data in
> > pnotes instead of notes - that way a reference, and not the data, is passed
> > around.  the data has to exist somewhere, but now you only have one copy of
> > it...
>
> If the data is in a Perl variable somewhere Perl will already have
> grabbed enough memory to store it.  Won't Perl then just keep that
> memory until the child dies, even if you undef the variable containing
> the data?

that is my understanding... I guess that my point was that if you are going
to have the data in perl somewhere the memory is going to be taken (for
example, putting it in a tempfile but then local $/ and slurp).  pnotes
allows for passing by reference, so it really doesn't matter when you read
it in and where you use it, you still only have one copy...

but then again, it's monday morning and my coffee was weak today...



> My idea was to avoid this cause of process bloat.
>
> If the data isn't in a Perl data structure is this always safe?

I dunno

--Geoff

 
> 73,
> Ged.
 
 



RE: maximum (practical) size of $r->notes

2000-10-30 Thread Matt Sergeant

On Mon, 30 Oct 2000, Geoffrey Young wrote:

> that is my understanding... I guess that my point was that if you are going
> to have the data in perl somewhere the memory is going to be taken (for
> example, putting it in a tempfile but then local $/ and slurp).  pnotes
> allows for passing by reference, so it really doesn't matter when you read
> it in and where you use it, you still only have one copy...

Of course there's no functional difference between using pnotes and
something like:

$Apache::Pnotes::myvar = \$string;

It's really just syntactic sugar to make you think that it's a notes table
for a perl data structure.
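
Concretely (the name is just an example; one practical difference is
housekeeping - pnotes is cleared for you at the end of the request,
whereas a package global you'd have to reset yourself):

    $r->pnotes(myvar => \$string);        # scoped to this request
    $Apache::Pnotes::myvar = \$string;    # lives on in the child until reset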

-- 
Matt/

** Director and CTO **  AxKit.com Ltd  ** XML Application Serving **
** http://axkit.org **  ** XSLT, XPathScript, XSP **
** Personal Web Site: http://sergeant.org/ **




RE: maximum (practical) size of $r->notes

2000-10-30 Thread G.W. Haywood

Hi Geoff,

On Mon, 30 Oct 2000, Geoffrey Young wrote:
> > Ged mumbled:
> > Won't Perl then just keep that memory until the child dies...?
>
> that is my understanding... I guess that my point was that if you
> are going to have the data in perl somewhere the memory is going to
> be taken (for example, putting it in a tempfile but then local $/
> and slurp).

Right.  I think people are much too fond of slurping in whole files in
that way.  It's OK in development, then some villain sends you twenty
megabytes instead of twenty bytes.  Clang.  It's a little more work to
break things up into some smaller pieces but you can gain a lot in
efficiency sometimes.
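
For example, instead of local $/ and a slurp, something along these
lines sends the file a piece at a time (the buffer size is arbitrary,
and $r is assumed to be the request object):

    my $buf;
    open(FH, $path) or die "can't open $path: $!";
    $r->print($buf) while read(FH, $buf, 8192);
    close FH;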

73,
Ged.