Re: Static linking of Sybase::DBlib to mod_perl?

2000-01-28 Thread Thomas Corte


Hello there,

On Thu, 27 Jan 2000, Doug MacEachern wrote:

 does apache-x.x.x/src/modules/perl/perlxsi.c have a DBlib reference?
 you should see something like:
 boot_Sybase__DBlib

No, it's not present. It only shows some Apache::* stuff and DynaLoader.
Indeed, 

cd apache_1.3.11
find . -type f -exec fgrep -il sybase {} \;

shows that Sybase::DBlib appears only in the various Makefiles
(on the line PERL_STATIC_EXTS=Apache ... Sybase::DBlib),
nowhere else.

I get the impression that Sybase::DBlib might not be the kind of
static extension that is supported by PERL_STATIC_EXTS.
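
The boot_Sybase__DBlib reference Doug mentions is generated by ExtUtils::Embed when the Apache build writes perlxsi.c. As a sanity check (a sketch only; the module list here is taken from the PERL_STATIC_EXTS line and is illustrative), you can regenerate the file by hand and look for the boot entry:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Regenerate a perlxsi.c by hand with an explicit static-extension list.
# If the output contains boot_Sybase__DBlib, the xsinit glue itself is
# fine and the problem lies elsewhere in the Apache build.
use ExtUtils::Embed;

xsinit('perlxsi.c', 0, ['Apache', 'Sybase::DBlib']);

open my $fh, '<', 'perlxsi.c' or die "can't read perlxsi.c: $!";
my $src = do { local $/; <$fh> };
print "found boot_Sybase__DBlib\n" if $src =~ /boot_Sybase__DBlib/;
```

If the boot reference shows up here but not in the Apache-generated perlxsi.c, the static-extension list is being lost somewhere in the Makefile plumbing.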

_

Thomas Corte
[EMAIL PROTECTED]



Re: make test fails

2000-01-28 Thread vinecent hong

Read http://perl.apache.org/guide/troubleshooting.html

It explains the error you mentioned, though its example is for
"Can't load '.../auto/DBI/DBI.so' for module DBI".

- Original Message -
From: Jeff Beard [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Friday, January 28, 2000 4:57 AM
Subject: make test fails


 Hi there,

 running make test fails and produces the errors listed at the end of this
 message. I searched the list archives and found a posting that suggested
 rebuilding Perl with the same compiler and tools that I use for apache and
 mod_perl. So I did but it didn't fix the problem. I did in fact build Perl
 the first time with gcc 2.8.1, then built gcc 2.95.2 from source. But I
 rebuilt Perl with the new compiler and get the same results. Any ideas,
 pointers, etc. in troubleshooting are appreciated.

 The environment, etc. is:

 Solaris 2.6 on an Ultra 1
 gcc 2.95.2
 Sun's build tools (ld, nm, ar, etc.)
 Perl 5.005_03
 apache 1.3.11
 mod_perl 1.21

 Other "3rd party" mods I'm including:
 php 4.0b3
 mod_ssl 2.5.0-1.3.11

 Appended are perl -V output and the errors from make test

 Thanks for your help.

 --Jeff

 Perl Version:

 Summary of my perl5 (5.0 patchlevel 5 subversion 3) configuration:
Platform:
  osname=solaris, osvers=2.6, archname=sun4-solaris
  uname='sunos wiggy 5.6 generic sun4u sparc sunw,ultra-1 '
  hint=recommended, useposix=true, d_sigaction=define
  usethreads=undef useperlio=undef d_sfio=undef
Compiler:
  cc='gcc -B/usr/ccs/bin/ -B/usr/ccs/bin/', optimize='-O',
 gccversion=2.95.2 19991024 (release)
  cppflags='-I/usr/local/include'
  ccflags ='-I/usr/local/include'
  stdchar='unsigned char', d_stdstdio=define, usevfork=false
  intsize=4, longsize=4, ptrsize=4, doublesize=8
  d_longlong=define, longlongsize=8, d_longdbl=define, longdblsize=16
  alignbytes=8, usemymalloc=y, prototype=define
Linker and Libraries:
  ld='gcc -B/usr/ccs/bin/ -B/usr/ccs/bin/', ldflags ='
-L/usr/local/lib'
  libpth=/usr/local/lib /lib /usr/lib /usr/ccs/lib
  libs=-lsocket -lnsl -ldl -lm -lc -lcrypt
  libc=, so=so, useshrplib=false, libperl=libperl.a
Dynamic Linking:
  dlsrc=dl_dlopen.xs, dlext=so, d_dlsymun=undef, ccdlflags=' '
  cccdlflags='-fPIC', lddlflags='-G -L/usr/local/lib'


 Characteristics of this binary (from libperl):
Built under solaris
Compiled at Jan 23 2000 14:15:33
@INC:
  /usr/local/lib/perl5/5.00503/sun4-solaris
  /usr/local/lib/perl5/5.00503
  /usr/local/lib/perl5/site_perl/5.005/sun4-solaris
  /usr/local/lib/perl5/site_perl/5.005


 make test errors:

 [Thu Jan 27 13:51:41 2000] [error] Can't load
 '/usr/local/lib/perl5/5.00503/sun4-solaris/auto/IO/IO.so' for module IO:
 ld.so.1: ../apache_1.3.11/src/httpd: fatal: relocation error: file
 /usr/local/lib/perl5/5.00503/sun4-solaris/auto/IO/IO.so: symbol main:
 referenced symbol not found at
 /usr/local/lib/perl5/5.00503/sun4-solaris/DynaLoader.pm line 169.



 Jeff Beard
 ___
 Web: www.cyberxape.com
 Phone: 303.443.9339
 Location: Boulder, CO, USA





__
FREE Personalized Email at Mail.com
Sign up at http://www.mail.com?sr=mc.mk.mcm.tag001



No Subject

2000-01-28 Thread vinecent hong

I thought you were going to install the Apache server with mod_perl and
mod_apache both enabled? If so, the README.configuration under the
apache-1.3.9 directory is clear enough about the install steps.

Also, what do you want to do with this line?
 perl Makefile.PL ADD_MODULE="src/module.php3/libphp3.a"

Why do you want to add libphp3.a to mod_perl?


Vincent

- Original Message -
From: Craig Sebenik [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Friday, January 28, 2000 11:20 AM
Subject: problem with configure?


 I ran into the following problem when trying to configure mod_perl with
 Apache and PHP:

  perl Makefile.PL ADD_MODULE="src/module.php3/libphp3.a"
 Configure mod_perl with ../apache-1.3.9/src ? [y]
 Shall I build httpd in ../apache-1.3.9/src for you? [y]
 /^#\s+(\w{0,3}Module\s+.*src/: unmatched () in regexp at -e line 1.
 Appending mod_perl to src/Configuration
 [snip]

 It is not critical... I was able to work around it, but I thought someone
 may want to take a look at it. The problem is "around" the routine
 "add_module" (in Makefile.PL).

 Anyway, before I start really looking into this I thought I would send out
 a message and see if I am doing something stupid... or maybe someone else
 has seen (and fixed?) this...

 If you need more data on my configuration (of perl, apache, solaris, etc.)
 or just in general what I am trying to do, just let me know.

 The following may be of interest:

  uname -a
 SunOS blah 5.7 Generic sun4u sparc SUNW,Ultra-5_10

  perl -V
 Summary of my perl5 (5.0 patchlevel 5 subversion 3) configuration:
   Platform:
 osname=solaris, osvers=2.7, archname=sun4-solaris
 uname='sunos dublin.hq.netapp.com 5.7 generic sun4u sparc
 sunw,ultra-5_10 '
 hint=recommended, useposix=true, d_sigaction=define
 usethreads=undef useperlio=undef d_sfio=undef
   Compiler:
 cc='gcc', optimize='-O', gccversion=2.8.1
 cppflags=''
 ccflags =''
 stdchar='char', d_stdstdio=define, usevfork=false
 intsize=4, longsize=4, ptrsize=4, doublesize=8
 d_longlong=define, longlongsize=8, d_longdbl=define, longdblsize=16
 alignbytes=8, usemymalloc=y, prototype=define
   Linker and Libraries:
 ld='gcc', ldflags =''
 libpth=/lib /usr/lib /usr/ccs/lib
 libs=-lsocket -lnsl -ldl -lm -lc -lcrypt
 libc=/lib/libc.so, so=so, useshrplib=false, libperl=libperl.a
   Dynamic Linking:
 dlsrc=dl_dlopen.xs, dlext=so, d_dlsymun=undef, ccdlflags=' '
 cccdlflags='-fPIC', lddlflags='-G'


 Characteristics of this binary (from libperl):
   Built under solaris
   Compiled at Jul 12 1999 19:53:08
   @INC:
 /usr/local/packages/perl/5.005_03/lib/5.00503/sun4-solaris
 /usr/local/packages/perl/5.005_03/lib/5.00503
 /usr/local/packages/perl/5.005_03/lib/site_perl/5.005/sun4-solaris
 /usr/local/packages/perl/5.005_03/lib/site_perl/5.005
 .



 TIA...

 Craig.







Re: make test fails

2000-01-28 Thread G.W. Haywood

Hi there,

On Thu, 27 Jan 2000, Jeff Beard wrote:

 running make test fails and produces the errors listed at the end of
 this message.  I searched the list archives and found a posting that
 suggested rebuilding Perl with the same compiler and tools that I
 use for apache and mod_perl. So I did but it didn't fix the
 problem. I did in fact build Perl the first time with gcc 2.8.1,
 then built gcc 2.95.2 from source. But I rebuilt Perl with the new
 compiler and get the same results.

I think the bit about using the same compiler means don't mix gcc and
ztcpp; you ought to get away with 2.8 and 2.95, but it's good advice.

You're obviously comfortable with compiling your tools, so you could
try a few more recompilations.  My first try would be a static build.
It seems dynamic linking is responsible for all kinds of problems.  I
built mySQL for a customer yesterday and the Perl interface wouldn't
run with dynamic linking of Msql-Mysql-modules, no matter what I did.
No problems at all with --static.

If that fails I'd try with a minimum set of modules (just mod_perl to
start with, leave out php/mod_ssl) and work up from there to see what
(if anything) triggers it.  There have been questions about Apache
1.3.11 with mod_perl.  Try Apache 1.3.9?

There are several other possibilities.  Where is apache_1.3.11?  I
found I had to put both the mod_perl and Apache directories in
/usr/local, i.e. /usr/local/apache_1.3.9 and /usr/local/mod_perl-1.21.
Did you delete everything before recompiling?  You should.  Have you
tried `make distclean'?

Let me know how you get on.

73,
Ged.


 The environment, etc. is:
 
 Solaris 2.6 on an Ultra 1
 gcc 2.95.2
 Sun's build tools (ld, nm, ar, etc.)
 Perl 5.005_03
 apache 1.3.11
 mod_perl 1.21
 
 Other "3rd party" mods I'm including:
 php 4.0b3
 mod_ssl 2.5.0-1.3.11
 
 Appended are perl -V output and the errors from make test
 
 Thanks for your help.
 
 --Jeff
 
 Perl Version:
 
 Summary of my perl5 (5.0 patchlevel 5 subversion 3) configuration:
Platform:
  osname=solaris, osvers=2.6, archname=sun4-solaris
  uname='sunos wiggy 5.6 generic sun4u sparc sunw,ultra-1 '
  hint=recommended, useposix=true, d_sigaction=define
  usethreads=undef useperlio=undef d_sfio=undef
Compiler:
  cc='gcc -B/usr/ccs/bin/ -B/usr/ccs/bin/', optimize='-O', 
 gccversion=2.95.2 19991024 (release)
  cppflags='-I/usr/local/include'
  ccflags ='-I/usr/local/include'
  stdchar='unsigned char', d_stdstdio=define, usevfork=false
  intsize=4, longsize=4, ptrsize=4, doublesize=8
  d_longlong=define, longlongsize=8, d_longdbl=define, longdblsize=16
  alignbytes=8, usemymalloc=y, prototype=define
Linker and Libraries:
  ld='gcc -B/usr/ccs/bin/ -B/usr/ccs/bin/', ldflags =' -L/usr/local/lib'
  libpth=/usr/local/lib /lib /usr/lib /usr/ccs/lib
  libs=-lsocket -lnsl -ldl -lm -lc -lcrypt
  libc=, so=so, useshrplib=false, libperl=libperl.a
Dynamic Linking:
  dlsrc=dl_dlopen.xs, dlext=so, d_dlsymun=undef, ccdlflags=' '
  cccdlflags='-fPIC', lddlflags='-G -L/usr/local/lib'
 
 
 Characteristics of this binary (from libperl):
Built under solaris
Compiled at Jan 23 2000 14:15:33
@INC:
  /usr/local/lib/perl5/5.00503/sun4-solaris
  /usr/local/lib/perl5/5.00503
  /usr/local/lib/perl5/site_perl/5.005/sun4-solaris
  /usr/local/lib/perl5/site_perl/5.005
 
 
 make test errors:
 
 [Thu Jan 27 13:51:41 2000] [error] Can't load 
 '/usr/local/lib/perl5/5.00503/sun4-solaris/auto/IO/IO.so' for module IO: 
 ld.so.1: ../apache_1.3.11/src/httpd: fatal: relocation error: file 
 /usr/local/lib/perl5/5.00503/sun4-solaris/auto/IO/IO.so: symbol main: 
 referenced symbol not found at 
 /usr/local/lib/perl5/5.00503/sun4-solaris/DynaLoader.pm line 169.
 
 
 
 Jeff Beard
 ___
 Web:  www.cyberxape.com
 Phone:303.443.9339
 Location: Boulder, CO, USA
 



Apache::DBI and Sybase

2000-01-28 Thread Vladimir Ivaschenko

Hello.

I'm having trouble with Apache::DBI and Sybase under mod_perl (Embperl in
this case). After some time, calls to DBI->connect start to fail with a
message from CT-Lib: "Net-Library operation terminated due to
disconnect". Maybe the ping() method from DBD::Sybase doesn't work?

Thanks,
Vladimir



Re: Apache::DBI and Sybase

2000-01-28 Thread Vladimir Ivaschenko

 Andre Landwehr wrote about "Re: Apache::DBI and Sybase":

  message from CT-Lib - "Net-Library operation terminated due to
  disconnect". Maybe the ping() method from DBD::Sybase doesn't work? 
 
 I can't help you with that problem but I can assure you that
 ping() from DBD::Sybase does work.

Yes, I looked at the source and obviously it works.

It occurred to me that maybe the limit on the maximum number of concurrent
connections to the Sybase server is being reached. I have raised it now and
will see if it helps.

Vladimir



Re: Apache::DBI and Sybase

2000-01-28 Thread Matt Sergeant

On Fri, 28 Jan 2000, Vladimir Ivaschenko wrote:
 Andre Landwehr wrote about "Re: Apache::DBI and Sybase":
 
   message from CT-Lib - "Net-Library operation terminated due to
   disconnect". Maybe the ping() method from DBD::Sybase doesn't work? 
  
  I can't help you with that problem but I can assure you that
  ping() from DBD::Sybase does work.
 
 Yes, I looked at the source and obviously it works.
 
 I have come to an idea that maybe the limit on the maximum of concurrent
 connections to the Sybase server is reached. I have raised it now and will
 see if it helps.

Also make sure you're not filling up tempdb or your log.

-- 
Matt/

Details: FastNet Software Ltd - XML, Perl, Databases.
Tagline: High Performance Web Solutions
Web Sites: http://come.to/fastnet http://sergeant.org
Available for Consultancy, Contracts and Training.



Re: Apache::DBI and Sybase

2000-01-28 Thread Michael Peppler

Matt Sergeant writes:
  On Fri, 28 Jan 2000, Vladimir Ivaschenko wrote:
   Andre Landwehr wrote about "Re: Apache::DBI and Sybase":
   
 message from CT-Lib - "Net-Library operation terminated due to
 disconnect". Maybe the ping() method from DBD::Sybase doesn't work? 

I can't help you with that problem but I can assure you that
ping() from DBD::Sybase does work.
   
   Yes, I looked at the source and obviously it works.
   
   I have come to an idea that maybe the limit on the maximum of concurrent
   connections to the Sybase server is reached. I have raised it now and will
   see if it helps.

Unfortunately I would guess not. The above message is not due to a
failure to connect, but rather to an existing connection being
terminated. I really don't see why this happens, but you should check
your Sybase server logs for any stack trace information or other error 
conditions.

  Also make sure you're not filling up tempdb or your log.

Again - if the log or tempdb fills up you should normally get an
explicit message to that effect.
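
For context on why ping() matters here: Apache::DBI caches one handle per child and revalidates it with ping() before handing it back, so a ping() that wrongly reported a dead handle as alive would surface exactly as connections failing mid-stream. A toy, self-contained model of that logic (FakeHandle and the function names are illustrative, not Apache::DBI's internals):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Toy version of Apache::DBI's per-child cache: one handle per DSN,
# revalidated with ping() before reuse. FakeHandle stands in for a real
# $dbh; the real module keys the cache on DSN plus attributes.
my %cache;

sub cached_connect {
    my ($dsn, $connect) = @_;
    my $dbh = $cache{$dsn};
    return $dbh if $dbh && $dbh->ping;   # reuse only live handles
    return $cache{$dsn} = $connect->();  # reconnect after a failed ping
}

package FakeHandle;
sub new  { my ($class, %a) = @_; bless {%a}, $class }
sub ping { $_[0]->{alive} }

package main;
my $connects = 0;
my $maker = sub { $connects++; FakeHandle->new(alive => 1) };

my $h1 = cached_connect('dbi:Sybase:server=X', $maker);
my $h2 = cached_connect('dbi:Sybase:server=X', $maker);  # cached, no new connect
$h1->{alive} = 0;                        # simulate a dropped connection
my $h3 = cached_connect('dbi:Sybase:server=X', $maker);  # ping fails, reconnect
print "connects=$connects\n";
```

If ping() works, a connection killed by the server is quietly replaced on the next request; if it lied, the dead handle would be reused and the CT-Lib error would propagate to the application.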

Michael
-- 
Michael Peppler -||-  Data Migrations Inc.
[EMAIL PROTECTED]-||-  http://www.mbay.net/~mpeppler
Int. Sybase User Group  -||-  http://www.isug.com
Sybase on Linux mailing list: [EMAIL PROTECTED]



Output in real time [Emberl 1.2.0]

2000-01-28 Thread Philippe Gobin



Hi 


I have a page which looks like this:

<HTML><HEAD></HEAD>
<BODY>

Some html code

[- while ( $condition ) {
     perl code which sends information to the output
} -]

Some html code
</BODY>
</HTML>

-

I want the browser to show the output of the perl code in real time,
so the user can see the progress without a timeout.

If I use [+ "The output" +] or [- print "The output" -], the
entire page is shown all at once (when everything is finished), i.e. no
real time.

If I use [- print STDOUT "The output" -], the text appears in real
time, but before the header and the html code:

 one line one line one line one line one line HTTP/1.1
200 OK Date: Fri, 28 Jan 2000 16:14:56 GMT Server: Apache/1.3.9 (Unix)
mod_perl/1.21 Content-Length: 8481
Keep-Alive: timeout=15, max=40 Connection: Keep-Alive Content-Type: text/html;
charset=iso-8859-1

 the body html code

I tried to use optEarlyHttpHeader; there's
no difference, and I don't want to use it for the whole page anyway.

How should I do this?

Thanks

Philippe GOBIN
France Telecom Hosting
Direction des Développements Applicatifs
Tel : 01 46 12 68 05
Fax : 01.46.12.67.00
Hotline : 0810 777 000
Site internet : http://www.fth.net
Les offres de FTH : http://extranet.fth.net


thoughts on cgi emulation layer

2000-01-28 Thread brian moseley

On Thu, 27 Jan 2000, Doug MacEachern wrote:

 I'm not convinced NG is the final solution, but
 something like it where the functionality is broken
 apart and re-usable, unlike the existing Registry.pm
 where everything is all lumped in together.

in order for this cgi emulation system to really make sense,
we need to re-examine the responsibilities of the registry
modules and perform a more explicit separation of behaviors
into classes. this will make it much easier for everybody to
understand their options: 1) use the strict cgi emulation
layer; 2) use the 'bare bones' cgi emulation layer; 3) write
your own cgi emulation layer.

seems to me we are looking at this set of classes, or at
least namespaces:

   registry registry bb  custom
   |  | |
   --
|  |
 script --- handler
compiler cache

by providing script compiler and handler cache subclasses,
we can provide ways for folks to do things like: extending
the script compiler to import additional symbols into their
handlers' namespaces; using a disk or shared memory handler
cache; etc.

this layering could prove very useful to application
framework developers. if you look at platypus
(http://www.maz.org/platypus) for instance, the presentation
subsystem is pretty independent of the rest of the
framework; you just have to provide classes that implement a
specific set of interfaces. i have some code that embeds
mason as the presentation subsystem for platypus, altho that
code isnt yet in the distribution. with the above layering,
i could subclass the script compiler and handler cache,
which would allow me to provide two presentation options out
of the box with platypus: template based and cgi emulation.

i'd really like to see this entire subsystem packaged
together (along with the registry loader) inside the
distribution, with its own distinct, coherent set of
documentation and examples. i think part of people's
confusion stems from the fact that the relationships aren't
obvious or well-documented between the various components of
the cgi emulation subsystem.

i have some time coming up in the next few weeks, so unless
somebody gets to it before me, i'll look more concretely
into this idea then.
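
The "script compiler" piece described above is essentially what Apache::Registry already does internally: wrap a script's text in a generated package and compile it once. A stripped-down, standalone sketch of that core trick (the names are illustrative, not a proposed API):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Compile a CGI-style script body into a handler sub living in its own
# package: the core of what a reusable "script compiler" class would do.
sub compile_script {
    my ($package, $body) = @_;
    my $code = "package $package;\nsub handler { $body }\n1;";
    eval $code;                          # compile once; later calls are cheap
    die "compile failed: $@" if $@;
    return $package->can('handler');
}

# A trivial "handler cache": compile on first use, reuse afterwards.
# A subclass could instead back this with disk or shared memory.
my %cache;
sub cached_handler {
    my ($package, $body) = @_;
    return $cache{$package} ||= compile_script($package, $body);
}

my $h = cached_handler('My::App::Hello', q{ return "hello world" });
print $h->(), "\n";
```

Splitting these two responsibilities into subclassable pieces is exactly what would let a framework swap in its own compilation or caching policy.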



No Subject

2000-01-28 Thread raptor

hi,

Check this :")
http://www.fenrus.demon.nl/



Re: problem with configure?

2000-01-28 Thread Craig Sebenik

Quoting Li Hong ([EMAIL PROTECTED]):

 I thought you are going install apache server with mod_perl and
 mod_apache both enable? If so ,the README.configuration under
 apache-1.3.9 directory is clear enough for you about install steps.

Yes, the README is clear. There still appears to be a problem with one of
the features in the building of mod_perl. What you suggested appears to be
my workaround, i.e. building everything "manually". You *should* be able
to just have the mod_perl build process build Apache for you. (See below.)


 Also,what do you want to do with this line?
  perl Makefile.PL ADD_MODULE="src/module.php3/libphp3.a"
 
 why want to add the libphp3.a  to mod_perl?

I think you misunderstood the purpose of the "ADD_MODULE" flag. It
allows mod_perl to pass some more info along to Apache. When you build
mod_perl it prompts you whether to configure Apache and whether to build it.
If you want to build Apache with support for other 3rd party stuff (like
PHP), and you want the mod_perl process to build it, then you need to pass
that info along. This appears to be done via the "ADD_MODULE" flag.


--
Craig Sebenik [EMAIL PROTECTED]

Wisdom has two parts:
1) having a lot to say, and
2) not saying it.



Novel technique for dynamic web page generation

2000-01-28 Thread Paul J. Lucas

I've implemented what I believe to be a novel technique for
dynamic web page generation.  Although it is explained in much more
detail here:

http://www.best.com/~pjl/software/html_tree/

essentially it parses an HTML file into a DOM-like tree where a
"visitor" function is called for every node in the tree.  The
parser is written in C++ using mmap(2) for speed, but there is
a Perl XS layer and an Apache mod_perl module.  Using the Apache
module, you can bind HTML CLASS attributes to functions via a
class map that is simply a Perl hash.

The novel thing about this is that pure, standard HTML files
are used.  This allows one to create a mock-up of the page
complete with dummy data to see how it will look and then take
that same page, without modification, and have it used to
generate dynamic content.  The code behind the page is in a
separate .pm file (that is cached similarly to the way
Apache::Registry does it).
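
The class-map idea can be illustrated with a toy, self-contained version (this is not the module's actual API, just a sketch of the dispatch mechanism): a tree of hash nodes, a hash mapping CLASS values to callbacks, and a visitor that walks the tree.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Toy node shape: { tag => ..., class => ..., text => ..., kids => [...] }.
# The class map binds CLASS attribute values to Perl callbacks, as in the
# module's description; everything else here is illustrative.
my %class_map = (
    'text::name' => sub { my ($node) = @_; $node->{text} = 'Jane Doe' },
);

sub visit {
    my ($node) = @_;
    if (my $cb = $class_map{ $node->{class} || '' }) {
        $cb->($node);                    # dispatch on the CLASS attribute
    }
    visit($_) for @{ $node->{kids} || [] };
}

my $tree = {
    tag  => 'ol', class => 'query_db',
    kids => [ { tag => 'span', class => 'text::name', text => 'John Smith' } ],
};
visit($tree);
print $tree->{kids}[0]{text}, "\n";      # mock-up text replaced by callback
```

The mock-up content ("John Smith") lives only in the HTML; the callback replaces it with live data wherever that CLASS happens to sit in the tree.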

- Paul



Re: httpd.conf parameter affect mod_perl performance?

2000-01-28 Thread Ask Bjoern Hansen

On Fri, 28 Jan 2000, vinecent hong wrote:

 KeepAliveTimeout
 MaxKeepAliveRequests

These are for the HTTP/1.1 KeepAlive things - not for the apache child
lifetime.

 KeepAlive On
 #I am kinda unclear on this. The Slashdot config sets it to Off, and I think
 Off means once an Apache child finishes serving a request, it will die? So not
 allow

No.

The Slashdot folks probably have it off because it would otherwise keep
some children occupied waiting for the next request. If you can afford the
extra memory, keeping it on will increase your responsiveness.

 MaxRequestsPerChild
 #Apache said: We reccomend you leave this number high, for maximum
 performance.
 #I also think so. But why does Slashdot.org set it to just 150? To prevent an
 occasional bug in a script from using all the memory?

Exactly.

I tend to keep it so each child will live approximately 24 hours, but it
really depends on your memory usage patterns. For my servers it's usually
quite stable.
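
Put together, the settings discussed in this thread would look like this in httpd.conf (the values are illustrative; tune MaxRequestsPerChild so a child lives roughly a day under your own load):

```apache
# Keep-alive ties up one idle child per open connection, but improves
# responsiveness if memory allows.
KeepAlive On
KeepAliveTimeout 15
MaxKeepAliveRequests 100

# Recycle children periodically as insurance against slow leaks;
# a low value like Slashdot's 150 trades speed for safety.
MaxRequestsPerChild 5000
```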


  - ask

-- 
ask bjoern hansen - http://www.netcetera.dk/~ask/
more than 60M impressions per day, http://valueclick.com



Re: overriding document root dynamically?

2000-01-28 Thread Sean Chittenden

This is more along the lines of a cautionary note than anything else,
but mod_perl isn't exactly a secure system, so I'd be very careful with this
setup if you're doing what I think you're doing. Do you trust your users? Are
they allowed to execute Perl code? If so, they could gain access to other
people's files in their www directories.

Are there any modules that are similar to user_dir that are more
configurable?  --SC

On Fri, 28 Jan 2000, Jonathan Swartz wrote:
 I have an application where I want the effective DocumentRoot to change 
 based on dynamic properties in the request.
 
 In particular, we are creating a number of domain aliases, pointing to the 
 same IP address, so that each user can view their own version of the web 
 site. e.g.
 
joe.dev.mysite.com would have doc root /home/joe/www
   dave.dev.mysite.com would have doc root /home/dave/www
 
 etc. I could do this with a set of virtual servers, but then I have to 
 change the httpd.conf and restart the server every time a user is
 added, which is undesirable.
 
 Here's what I wanted to work:
 
 sub trans_handler
 {
     my ($r) = @_;
     my ($user) = ($r->header_in('Host') =~ /^([^.]+)/);
     $r->document_root("/home/$user/www");
     return DECLINED;
 }
 
 PerlTransHandler trans_handler
 
 but I got
 
 [error] Usage: Apache::document_root(r) at handler.pl line 41
 
 so document_root ain't writable.
 
 Any other suggestions?  I'm loathe to recreate the entire default Apache 
 directory handler in my trans_handler (looking for index.html, etc.)
 
 Jon
-- 
Sean Chittenden p. 650.473.1805
auctia.com, Inc.f. 650.329.9651



Re: overriding document root dynamically?

2000-01-28 Thread Randal L. Schwartz

 "Jonathan" == Jonathan Swartz [EMAIL PROTECTED] writes:

Jonathan Sure, but then index.html files won't get found,

Uh, why not?  No module after the Trans phase looks at document root,
that I'm aware of.  index.html is handled by mod_autoindex during the
content phase upon noticing that it's a MAGIC_DIR_TYPE, causing an
internal (or external, if no slash) redirect.  And *that'll* come back
through your Trans handler to get the same treatment.

Jonathan  Alias
Jonathan directives won't get processed.

Well, that's true, because those are Trans phase items.  But do you
really want an alias to be run?  It'll be wrong!  Besides, you can do
the same aliasing at the time you come up with the $r->filename.

Another option is to do a subrequest, setting $r->pnotes with a value
that will cause *your* handler to quickly DECLINE, and letting the
rest of the stuff trigger.  Then look at the $r->filename from that
subrequest, and use that to decide if you need to edit your
$r->filename on the main request, and return OK.  Weird, but cool. :)

Jonathan  etc. Basically I don't want
Jonathan to have to reimplement Apache's entire default file handler
Jonathan in Perl.

I don't think you need to.  I think only the Trans handlers need to
worry about docroot.  Of course, I'm possibly wrong. :)
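
A sketch of that Trans-phase approach, rewriting $r->filename directly since document_root isn't writable (the package name and the host-to-directory mapping are illustrative; in a real handler you would import OK/DECLINED from Apache::Constants rather than hardcode them):

```perl
package My::TransHandler;   # hypothetical module name
use strict;
use warnings;

# Pure helper: derive the per-user docroot from a Host header value,
# per the joe.dev.mysite.com -> /home/joe/www scheme in the thread.
sub docroot_for {
    my ($host) = @_;
    my ($user) = ($host || '') =~ /^([^.]+)/;
    return defined $user ? "/home/$user/www" : undef;
}

# mod_perl 1.x PerlTransHandler: rewrite the filename instead of the
# (read-only) document root; returning OK ends the translation phase.
sub handler {
    my ($r) = @_;
    my $root = docroot_for($r->header_in('Host'));
    return -1 unless $root;              # DECLINED (Apache::Constants)
    $r->filename($root . $r->uri);
    return 0;                            # OK
}

1;
```

Since the handler returns OK with $r->filename set, Alias-style mappings can be folded into docroot_for() instead of relying on the core Trans handlers.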

-- 
Randal L. Schwartz - Stonehenge Consulting Services, Inc. - +1 503 777 0095
[EMAIL PROTECTED] URL:http://www.stonehenge.com/merlyn/
Perl/Unix/security consulting, Technical writing, Comedy, etc. etc.
See PerlTraining.Stonehenge.com for onsite and open-enrollment Perl training!



Re: Looking for Higher Ed mod_perl'ers

2000-01-28 Thread Brad Cox

I might be interested. I did the virtual school project in mod_perl. 
Must have been doing something right... I won the $25,000 Paul Allen 
competition with the taming the electronic frontier course ;)

However, I'm not sure our goals are aligned. I gave up trying to
"put lecture materials online" (note your website) when I realized
that wasn't consistent with the "use the right tool for the job"
philosophy.

It is impossible to beat brick-and-mortar lectures at what lectures
are great at: delivering non-experiential information from prof to
students, with retention relatively unimportant, gauged only
occasionally via quizzes.

It was only when I redefined the goal in student-centric terms, as
"delivering top-quality learning to students wherever they might
be", that things started clicking. I finally resigned when it became
clear that this was completely inconsistent with the goals of
higher education, certainly at GMU if not in general.

For more about this, see http://virtualschool.edu/98c and especially 
http://virtualschool.edu/heu.


At 12:17 PM -0500 01/22/2000, Gerd Kortemeyer wrote:
Hi,

Anybody out there who is using mod_perl for online teaching and learning
applications in higher education? Looking for possible future collaborators on
an open source project, http://www.lite.msu.edu/kortemeyer/lon/ .

- Gerd.


---
Dr. Brad Cox [EMAIL PROTECTED]
Phone: 703 361 4751 Fax: 703 995 0422 Cellular: 703 919-9623
http://virtualschool.edu A Project with Paradoxical Goals
PGP Signature: E194 C6E5 92D8 B8FB 20E8  8667 929A 95A0 FCB6 7C62



Re: Novel technique for dynamic web page generation

2000-01-28 Thread Paul J. Lucas

On Fri, 28 Jan 2000, Perrin Harkins wrote:

 Looks almost exactly like XMLC from http://www.enhydra.org/.

I hadn't heard of that, but, from a quick look, enhydra is
XML/Java not HTML/Perl.  It also seems like a much more
"involved" solution.

 It's an interesting idea to use straight HTML, since it enables you to take
 HTML generated by authoring tools and use it immediately,

That's the exact point, yes.  :-)

Another small benefit is it allows the web page author to see a
mock-up page without having to have any code behind it.  Hence,
a web site can be designed in mock-up first.

 but it seems like it does tie your program closely to the structure of the
 documents.

It does somewhat, but much less so than existing techniques:

1. Conventional CGI (a print-statement-laden Perl script): this
   tightly intertwines code and markup.

2. Embedded Perl (e.g., ePerl): variables are placed at
   specific points in the markup.

3. Non-standard tags: placed at specific points in the markup.
   (Another downside: DreamWeaver doesn't understand them.)

So, to me, all those techniques tightly tie the document
structure to the code.

However, with this technique, you can move CLASS names around
freely and NOT have to touch a single line of code.

For example, if at first I have:

<OL CLASS="query_db">
    <LI CLASS="fetch_next">
        <SPAN CLASS="text::name">John Smith</SPAN>
        &lt;<SPAN CLASS="text::email">[EMAIL PROTECTED]</SPAN>&gt;
</OL>

that queries a database and lists people and their e-mail
addresses, I can then change the HTML to use a table instead:

<TABLE CLASS="query_db">
    <TR CLASS="fetch_next">
        <TD CLASS="text::name">John Smith</TD>
        <TD CLASS="text::email">[EMAIL PROTECTED]</TD>
</TABLE>

and not have to touch a single line of the underlying Perl
code.

In theory, if the people who design the web pages using
DreamWeaver and the Perl programmers agree on conventions for
CLASS names in advance, the pages generated by the web
designers should "just work."

The "win" is being able to separate the "getting the data" part
in the Perl code and its "presentation" part in the HTML.

 Am I correct in thinking that if you want to put a piece of text pulled from
 a database into a page you have to know exactly where it should end up in the
 HTML parse tree?

Yes; but you control where that is by editing the HTML file.
And you can reedit it and move it around again and again; and
you never have to touch the underlying Perl code.  See above.

- Paul



Re: splitting mod_perl and sql over machines

2000-01-28 Thread Marko van der Puil

Hello,

In response to Stas's question about improving performance by splitting your
SQL and Apache over different machines... Please read Stas's original posting
for this discussion.

There has been a discussion on the mod_perl mailing list about whether you
would profit from splitting your mod_perl-enabled Apache server and an SQL
database like MySQL over multiple machines. To give this discussion some
technical and scientific background, I've run some benchmarks.

They're available in HTML on
http://cztb.zeelandnet.nl/rzweb/benchmarks/splitservers.html

--
Conclusions: (for the benchmarks see the web page above)
By this testing I'd say the difference in CPU performance (47%) is the key
here. Where does the other 16% go? Probably into the 0.30 load of the MySQL
machine, and maybe into the slightly older version of MySQL. I don't really
see any significant decrease in performance here from using a separate MySQL
server for your database.

But you should be asking yourself what 3-tier actually means.

This has been figured out by all sorts of university buffs, and some major
software and hardware firms as well. Why would they invent 3-tier? Because
it's faster, easier to administer, and even more secure.

Now to prove that, I did the same test again, but this time with ab doing a
thousand requests over 20 simultaneous connections. The MySQL daemon on the
mod_perl machine was hogging over 95% of the CPU time, in addition to
taking up over 20 MB of memory. The mod_perl httpd's take up to 10
MB each! The httpd's and mysqld fight for memory and CPU time and
push the combined server to its limits. By splitting the tasks over two
servers, both do what they're good at. The MySQL server has all its
resources available for doing queries; the mod_perl server has all the
memory it needs to serve the requests.
So what happens when your site is (very) busy? You actually get a *MAJOR*
speed increase! Even when the MySQL server is a MUCH SLOWER machine than the
mod_perl Apache server!

So you can definitely benefit from splitting your mod_perl and MySQL servers
over different machines, even slower ones...

With the price of computer hardware these days, state of the art being more
than double the price of a system with slightly lower specs: if you
anticipate a lot of traffic, go for two cheap machines instead of one
state-of-the-art one... UW2 SCSI in the SQL machine, loads of RAM in the
mod_perl server.

What would work great would be these specs.

Mod_Perl:
400 Mhz machine
256 MB Ram or better
Small IDE disk

MySQL:
500 Mhz machine
128 MB Ram
Large UW2 SCSI disk
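
With the database moved to its own box, the only change on the mod_perl side is pointing DBI at the remote host in the DSN. A sketch (hostnames and credentials are placeholders, not from the benchmarks):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Build a DSN that sends queries to a dedicated MySQL machine instead of
# localhost; everything else in the application stays the same.
sub remote_dsn {
    my (%arg) = @_;
    return "DBI:mysql:database=$arg{db};host=$arg{host};port=$arg{port}";
}

my $dsn = remote_dsn(db => 'webdb', host => 'sql.example.com', port => 3306);
print $dsn, "\n";

# With DBI and DBD::mysql installed, you would then connect as usual:
#   my $dbh = DBI->connect($dsn, 'www_user', 'secret', { RaiseError => 1 });
```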


I hope this has some added value to this discussion...
--

Yours sincerely,
Met vriendelijke groeten,


Marko van der Puil http://www.renesse.com
   [EMAIL PROTECTED]




ANNOUNCE: Updated Hello World Web Application Benchmarks

2000-01-28 Thread Joshua Chamas

Hey,

I have updated the Hello World Web Application Benchmarks, 
now at http://www.chamas.com/bench/

The old page hello_world.html points here now; if anyone
could update the link at http://perl.apache.org/, that would
be grand.

New in the fastest benchmarks are:

 + the fastest mod_perl results yet, at 1042 hits per sec;
   thanks Chip Turner, apparently his 100Mbps network was
   the bottleneck ;) !!!

 + Velocigen Perl on Linux/Apache thanks to 
   Shahin Askari of Velocigen

 + 1st benchmarks for JSP Java and JSP JavaScript for 
   Caucho's Resin, and best benchmark for Java Servlet, 
   thanks to Scott Ferguson of Caucho

 + 1st benchmarks for RXML / Roxen on WinNT

New cool hardware benchmarks listed at 
http://www.chamas.com/bench/hello_bysystem.html
 
 + Linux/RH 2.2.14 - PIII-500 x 2
 + Linux/RH 2.2.14 - Athlon-600 (ooh an Athlon)
   thanks again Chip!

Thanks for all of your contributions, and keep them coming!

-- Joshua
_
Joshua Chamas   Chamas Enterprises Inc.
NodeWorks  free web link monitoring   Huntington Beach, CA  USA 
http://www.nodeworks.com    1-714-625-4051
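

For context, the scripts these benchmarks time are tiny. A minimal
Apache::Registry hello-world (a generic sketch, not the actual benchmark code
from the page, and assuming PerlSendHeader On) looks roughly like:

```perl
# hello.pl -- run under Apache::Registry; emits a one-line HTML page.
use strict;

print "Content-type: text/html\r\n\r\n";
print "<html><body>Hello World</body></html>\n";
```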



Re: Novel technique for dynamic web page generation

2000-01-28 Thread Paul J. Lucas

On Fri, 28 Jan 2000, Jason Bodnar wrote:

The resultant file, no longer pure HTML, is something that can not be
read back into DreamWeaver should the page need a tweak.

 Hmmm ... I thought one of the big pluses of Dreamweaver is that it guaranteed
 roundtrip HTML. I'm guessing it doesn't?

I suppose it depends on what one does to the HTML.  Even if it
does now, it still doesn't allow mock-up content to be easily
replaced.

- Paul



Re: Novel technique for dynamic web page generation

2000-01-28 Thread brian moseley

On Fri, 28 Jan 2000, Perrin Harkins wrote:

 Looks almost exactly like XMLC from
 http://www.enhydra.org/.  It's an interesting idea to

sounds a lot more like the approach webobjects takes. except
that webobjects defines a special tag representing a dynamic
element, and then uses a map to type each instance of that
tag in the html page and to bind attributes for each tag to
variables and methods in the class.

altho i havent actually looked at the code, im a big fan of
the approach. of course a requirement of using it is that
your application's interface is strictly html based. but
thats the case for a very large percentage of web
applications. i will definitely be trying this stuff out.



ApacheDBI question

2000-01-28 Thread Deepak Gupta

How does connection pooling determine how many connections to keep open?

The reason I ask is that I am afraid my non-modperl scripts are getting
rejected by the db server b/c all (or most) connections are being
dedicated to Apache activity. 

Thanks,
Deepak



Re: splitting mod_perl and sql over machines

2000-01-28 Thread Pascal Eeftinck

At 11:16 28-1-2000 +0100, Marko van der Puil wrote:
Hello,

In response to Stas's question about improving performance by splitting your
SQL and Apache over different machines... Please read Stas's original posting
for this discussion.

There has been a discussion on the mod_perl mailing list about whether you
would profit from splitting your mod_perl enabled Apache server and an SQL
database like MySQL over multiple machines. To give this discussion some
technical and scientific background I've run some benchmarks.

They're available in HTML on
http://cztb.zeelandnet.nl/rzweb/benchmarks/splitservers.html

--
Conclusions: (for the benchmarks see the web page above)
By this testing I'd say the difference in CPU performance (47%) is the key
here. Where does the other 16% go? Probably into the 0.30 load of the MySQL
machine. And maybe into the slightly older version of MySQL. I don't really see
any significant decrease in performance here from using a separate MySQL server
for your database.

But you should be asking yourself what 3-tier actually means.

This has been figured out by all sorts of university buffs, and some major
software and hardware firms as well. Why would they invent 3-tier? Because it's
faster, easier to administer, and even more secure.

Now to prove that, I did the same test again, but then with ab doing a
thousand requests over 20 simultaneous connections. The MySQL daemon on the
mod_perl machine was hogging over 95% of the CPU time, in addition to
taking up over 20 MB of memory. The mod_perl httpd's are taking up to 10
MB each! The httpd's and mysqld are fighting for memory and CPU time and
take the combined server to its limits. By splitting the tasks over two
servers, both do what they're good at. The MySQL server has all its resources
available for running queries, and the mod_perl server has all the memory it needs
to serve the requests.
So what happens when your site is (very) busy? You actually get a *MAJOR*
speed increase! Even when the MySQL server is a MUCH SLOWER machine than the
mod_perl Apache server!

So you can definitely benefit from splitting your mod_perl and MySQL servers
over different machines, even slower ones...

With the price of computer hardware these days, state of the art costs more
than double the price of a system with slightly lower specs. If you
anticipate a lot of traffic, go for two cheap machines instead of one state
of the art one... UW2 SCSI in the SQL machine, loads of RAM in the mod_perl
server.

Specs like these would work great:

Mod_Perl:
400 MHz machine
256 MB RAM or better
Small IDE disk

MySQL:
500 MHz machine
128 MB RAM
Large UW2 SCSI disk

Some interesting points, Marko.

I haven't done any actual testing in this area myself, although I do
have a couple of observations to make. BTW: 'my' mod_perl enabled
server is currently a Sun Ultra Enterprise II (2 x 300MHz) with 512MB,
and it handles some 200K purely dynamic pageviews a day all by itself.
It has no static content to serve. Most of its (Dutch only, sorry :)
dynamic content is here: http://cgi5.planet.nl/aanbod/planet/

Initially we had this mod_perl system running on a slower [but
with more memory] Sun server, with a lot more different scripts
and databases, doing about 5-7 MySQL queries per page. That
ended up at about 20K to 50K SQL queries per 5 minutes during peak
hours (!). With the MySQL server running on the *same* machine
as the Apache server you'll get a lot of resource contention.
The system wasn't unresponsive at all (fortunately) but still
doing 100% CPU on both CPUs for 75% of the day, i.e. HIGHLY loaded.
Swap usage (without much swap activity btw) could run up to 800MB
or more ... and MySQL was constantly showing some 65% cpu usage
in the top overview. At no time did the machine have room for
breathing.

I rewrote the lot (moving from Apache::Registry to HTML::Mason) to
benefit from some caching and with that I managed to dramatically
drop the number of SQL queries needed per request. We've moved from
20-50K queries per 5 minutes down to 3K-5K queries per 5 minutes.
A significant improvement I'd say. :) When I moved the lot to
HTML::Mason I also switched servers. Now this whole dynamic content
system is doing about 50% usage on both CPU's during peak hours, so
I have lots of room to spare still. But nowadays MySQL barely shows
up in CPU usage overviews - and it's still running together with
Apache on one machine.

With any system there's a certain point at which it will 'tip over',
i.e. resource contention will cause a sudden dramatic drop in
performance at that point through some snowball effect. There could
be several reasons for the cause of a tremendous drop of performance
at that specific point, but be sure it's there. (For example, your
system gets to be so busy that it can't serve requests in 'real'
time anymore and they queue up and thus worsen your already bad
situation ... Marko mentions this above too). You've got to ensure
that your parameters are 

Re: ApacheDBI question

2000-01-28 Thread Perrin Harkins

On Fri, 28 Jan 2000, Deepak Gupta wrote:

 How does connection pooling determine how many connections to keep open?
 
 The reason I ask is that I am afraid my non-modperl scripts are getting
 rejected by the db server b/c all (or most) connections are being
 dedicated to Apache activity. 

Please RTFM.  perldoc Apache::DBI.
- Perrin
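
A note on the mechanics, since the question comes up often (this summarizes
how Apache::DBI generally behaves -- check perldoc Apache::DBI for your
version): Apache::DBI does not do true pooling. Each httpd child caches one
persistent handle per unique connect string, so the total number of open
connections can approach MaxClients times the number of distinct DSNs. A
typical startup.pl fragment looks like this (the DSN and credentials are
placeholders, not from this thread):

```perl
# startup.pl (config fragment) -- load Apache::DBI before DBI so that
# subsequent DBI->connect calls in scripts are transparently cached,
# one handle per child process per unique dsn/user/password triple.
use Apache::DBI ();
use DBI ();

# Optional: open the handle at child startup instead of on first request.
Apache::DBI->connect_on_init(
    "dbi:mysql:database=test;host=localhost", "user", "password"
);

1;
```

So to keep headroom for non-mod_perl clients, either lower MaxClients or
raise the database server's connection limit.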



perlhandler - CGI.pm - no request object?

2000-01-28 Thread Brian Reichert

Sorry about the confusing subject line.

I'm witnessing a symptom:

Using apache_1.3.9 and mod_perl-1.21 and CGI.pm-2.56 under 3.2-STABLE.

I'm writing a cookie-based access handler.  (Based very directly
on the Apache::TicketMaster example in the Eagle book.)

My perl handler uses:

  use CGI qw(:standard);

then eventually calls

  header()

Deep down, my error log says

  [Fri Jan 28 17:59:24 2000] [error] Can't call method "send_cgi_header" on
  an undefined value at (eval 11) line 50.

Much digging shows that at around line 1278:

my $header = join($CRLF,@header)."${CRLF}${CRLF}";
if ($MOD_PERL and not $nph) {
    my $r = Apache->request;
    $r->send_cgi_header($header);
    return '';
}
return $header;

the call to Apache->request returns an undef.

The pod for Apache asserts:

   Apache->request([$r])

   The Apache->request method will return a reference to
   the request object.

but fails to disclose under what circumstances the call fails.  I
most certainly have an Apache request object at my disposal, my
handler otherwise copes with it.

Does anyone have any opinions or pointers?

Thanks...

-- 
Brian 'you Bastard' Reichert[EMAIL PROTECTED]
37 Crystal Ave. #303Daytime number: (781) 899-7484 x704
Derry NH 03038-1713 USA Intel architecture: the left-hand path
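
[One likely explanation, based on general mod_perl behavior rather than
anything stated in this thread: Apache->request is populated automatically
only for the response phase (PerlHandler / Apache::Registry); in earlier
phases such as a PerlAccessHandler you must set it yourself before CGI.pm
can find it. A sketch, with a hypothetical package name:]

```perl
# Hypothetical access handler: register the request object via
# Apache->request($r) so that CGI.pm's header() can find it later.
package My::TicketAccess;
use strict;
use Apache::Constants qw(OK);
use CGI qw(:standard);

sub handler {
    my $r = shift;
    Apache->request($r);   # without this, Apache->request returns undef here
    # ... cookie / ticket checks ...
    print header();        # CGI.pm can now call $r->send_cgi_header
    return OK;
}

1;
```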



RegistryLoader

2000-01-28 Thread William Deegan

Greetings,

If I user RegistryLoader to preload a script, should it
show up in /perl-status?rgysubs   (Apache::Status)??

-Bill

-- 
***
*  Never ascribe to malice that which is  *
*  adequately explained by incompetence   *
***
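
[For reference, preloading with Apache::RegistryLoader is typically done in a
startup file along these lines -- a generic sketch, with placeholder URI and
path. Scripts preloaded this way should then show up under
/perl-status?rgysubs, provided Apache::Status is loaded.]

```perl
# startup.pl (config fragment) -- precompile a registry script at server
# startup so every child shares the compiled code.
use Apache::RegistryLoader ();

my $rl = Apache::RegistryLoader->new;
$rl->handler("/perl/test.pl", "/home/httpd/perl/test.pl");

1;
```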




Re: splitting mod_perl and sql over machines

2000-01-28 Thread Perrin Harkins

On Fri, 28 Jan 2000, Marko van der Puil wrote:
 There has been a discussion on the mod_perl mailing list about whether you
 would profit from splitting your mod_perl enabled Apache server and an SQL
 database like MySQL over multiple machines. To give this discussion some
 technical and scientific background I've run some benchmarks.
 
 They're available in HTML on
 http://cztb.zeelandnet.nl/rzweb/benchmarks/splitservers.html

While I agree with your conclusion (splitting the database onto another
machine is a given in a clustered environment like the one I work with), I
think your benchmark is kind of misleading.  You test performance with
everything on one machine and then you test it with things split between
two machines.  To be fair, both tests should use both machines.
- Perrin



Re: Novel technique for dynamic web page generation

2000-01-28 Thread Perrin Harkins

On Fri, 28 Jan 2000, Paul J. Lucas wrote:
  but it seems like it does tie your program closely to the structure of the
  documents.
 
   It does somewhat, but much less so than existing techniques:
 
   1. Conventional CGI (a print-statement-laden Perl script): this
  tightly intertwines code and markup.
 
   2. Embedded Perl (e.g., ePerl): variables are placed at
  specific points in the markup.
 
   3. Non-standard tags: placed at specific points in the markup.
  (Another downside: DreamWeaver doesn't understand them.)

Now that I've seen your example, it seems to me that you are doing almost
exactly the same as #3.  The only difference is that you're using HTML
extensions ("CLASS=foo") that are legal in authoring tools.  Otherwise,
this is really the same effect as using HTML::Template or Template
Toolkit. 

  Am I correct in thinking that if you want to put a piece of text pulled from
  a database into a page you have to know exactly where it should end up in the
  HTML parse tree?
 
   Yes; but you control where that is by editing the HTML file.
   And you can reedit it and move it around again and again; and
   you never have to touch the underlying Perl code.  See above.

This is different from XMLC, which requires the program using it to
specify which node to replace the contents of.  I think your approach is
better.

- Perrin



Re: Novel technique for dynamic web page generation

2000-01-28 Thread Paul J. Lucas

On Fri, 28 Jan 2000, Perrin Harkins wrote:

I wrote:
  3. Non-standard tags: placed at specific points in the markup.
 (Another downside: DreamWeaver doesn't understand them.)

 Now that I've seen your example, it seems to me that you are doing almost
 exactly the same as #3.  The only difference is that you're using HTML
 extensions ("CLASS=foo") that are legal in authoring tools.

Not quite.  If you use a package like Meta-HTML, the non-
standard tags are used to do ALL the work including (for
example) database queries and iterations.  Hence, the "logic"
is very much embedded within a given point in the HTML file as
well.

My technique completely separates the logic from the dynamic
content substitution.

 This is different from XMLC, which requires the program using it to specify
 which node to replace the contents of.

Right: my technique is not tied to a specific node; it's tied to
ANY node that has the right CLASS names.

 I think your approach is better.

Thanks!  :-)

- Paul
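
[To make the CLASS-based substitution concrete, here is a rough sketch of the
idea using HTML::TreeBuilder -- an editor's illustration, not Paul's actual
code; the class name and content are invented:]

```perl
# Sketch: find any node whose CLASS attribute is "headline" and replace
# its mock-up content with real data, leaving the markup itself untouched.
use strict;
use HTML::TreeBuilder ();

my $tree = HTML::TreeBuilder->new_from_content(
    '<html><body><h1 class="headline">Dummy headline</h1></body></html>'
);

for my $node ( $tree->look_down( class => 'headline' ) ) {
    $node->delete_content;             # drop the designer's placeholder
    $node->push_content('Real headline from the database');
}

print $tree->as_HTML, "\n";
$tree->delete;   # explicitly free the tree; it is not garbage collected
```

The designer can move the h1 anywhere in the page, or restyle it, and the
code above still finds it by class -- which is the separation being claimed.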



Re: ANNOUNCE: Updated Hello World Web Application Benchmarks

2000-01-28 Thread Joshua Chamas

Perrin Harkins wrote:
 
 On Fri, 28 Jan 2000, Joshua Chamas wrote:
  I have updated the Hello World Web Application Benchmarks,
  now at http://www.chamas.com/bench/
 
 The end result of all this is that you have benchmark numbers which, while
 sort of entertaining, should not be used to make any kind of decision.  I
 would hate to think that someone used these numbers as the basis for
 making decisions and didn't benchmark the options himself.  I almost fell
 into that trap when I saw the low scores for certain things and wrote them
 off without doing my own testing.  (Yes, I know a good developer should
 never believe other people's benchmarks.)
 
 Although there is a disclaimer on the page, I wish it said more.  In some
 ways, it's worse to have these misleading benchmarks than no benchmarks at
 all.  I shudder to think what a naive manager might do with these.
 
 Okay, I'm done complaining for now.  Sorry Joshua, I know you put effort
 into this, and I do appreciate it.
 

I hear you Perrin.  It's better to have something than nothing 
though.  There is no way that people are going to benchmark 
10+ different environments themselves, so this merely offers 
a quick fix to get people going with their own comparisons.
Do you have any idea how much time it takes to do these? 
Run a few on different operating systems, and the time involved 
is humbling to say the least.

In order to improve the benchmarks, like the Resin & Velocigen 
ones that you cited where we have a very small sample, we simply 
need more numbers from more people.  That these benchmarks are 
open to anyone submitting means we all can scrutinize those
benchmarks submitted by those with agendas and submit our own.

If you can find any way to make these benchmarks better than
simply more data, please feel free to suggest improvements.
Also, any disclaimer modifications might be good if you feel 
there can be more work done there.

-- Joshua
_
Joshua Chamas   Chamas Enterprises Inc.
NodeWorks  free web link monitoring   Huntington Beach, CA  USA 
http://www.nodeworks.com    1-714-625-4051