Kee Hinckley wrote:
>
> >
> > At 17:18 28.04.2002, Ernest Lergon wrote:
> > >Now I'm scared about the memory consumption:
> > >
> > >The CSV file has 14,000 records with 18 fields and a size of 2 MB
> > >(approx. 150 Bytes per record).
> >
> > Now a question I would like to ask: do you *need* to
Perrin Harkins wrote:
>
> > $foo->{$i} = [ @record ];
>
> You're creating 14000 arrays, and references to them (refs take up space
> too!). That's where the memory is going.
>
> See if you can use a more efficient data structure. For example, it
> takes less space to make 4 arrays with 14000
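To make the suggestion concrete, here is a minimal sketch of that column-wise
layout (field names and sample rows are invented; only the shape matters):

use strict;
use warnings;

# Invented field names and sample records -- the real file has 18 fields.
my @fields = qw(id name price stock);
my @lines  = ( "1\tapple\t1.99\t500", "2\tbanana\t0.59\t1200" );

my (%rows, %cols);
my $i = 0;
for my $line (@lines) {
    my @record = split /\t/, $line;

    # Row-wise: one anonymous array plus one reference per record.
    $rows{ $i } = [ @record ];

    # Column-wise: one array per field, shared by all records.
    push @{ $cols{ $fields[$_] } }, $record[$_] for 0 .. $#fields;
    $i++;
}

# A record can still be reassembled on demand from the columns:
my @second = map { $cols{$_}[1] } @fields;
print "@second\n";   # 2 banana 0.59 1200

With 14000 records the row-wise version pays for 14000 small arrays plus
14000 references; the column-wise version only ever allocates as many
arrays as there are fields.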
Hi,
thank you all for your hints, BUT (with capital letters ;-)
I think it's a question of speed: if I hold my data in a hash in
memory, access should be faster than using any kind of external
database.
What makes me wonder is the extremely blown-up size (mod_)perl uses for
its data structures.
Ernest
Have you tried DBD::AnyData? It's pure Perl, so it might not be as fast,
but you never know.
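In case it helps, a rough sketch of wiring a CSV file up through DBD::AnyData
(the path, table name, and column are invented for the example; check the
DBD::AnyData docs for how it treats headers and flags):

use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:AnyData(RaiseError=>1):');

# Register the file as an in-process table -- no external server involved.
$dbh->func('records', 'CSV', '/path/to/records.csv', 'ad_catalog');

my $sth = $dbh->prepare('SELECT * FROM records WHERE id = ?');
$sth->execute(42);
while (my $row = $sth->fetchrow_hashref) {
    print $row->{id}, "\n";
}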
--
Simon Oliver
> Has anyone implemented a bandwidth limiting mechanism in mod_perl?
Have you looked at mod_throttle?
http://www.snert.com/Software/mod_throttle. There was a thread on this
last week so if you want more information you might read through that.
--Ade.
On Mon, Apr 29, 2002 at 07:49:33AM -0500, Ade Olonoh wrote:
> > Has anyone implemented a bandwidth limiting mechanism in mod_perl?
>
> Have you looked at mod_throttle?
I have. It does not work under load. At least, three months ago it didn't
work at all.
Alex.
We currently use Apache::AuthCookie for authentication/authorization, and it
works great. However, we want to make a change to how the login works.
In addition to having Apache::AuthCookie intercept requests for URLs that
require auth/authz, we would like to provide a signon area on the main page
At 07:15 29.04.2002, Martin Haase-Thomas wrote:
>Hi Andrew,
>
>thanx for the idea to have a look at Apache::ASP. I took that look in the
>meantime, and to me it seems like overhead. Maybe I'm naive, because
>it wasn't much more than a glance, but the code copes with things a server
>page *never
Ernest Lergon wrote:
> So I turned it around:
>
> $col now holds 18 arrays with 14000 entries each and prints the correct
> results:
...
> and gives:
>
> SIZE RSS SHARE
> 12364 12M 1044
>
> Wow, 2 MB saved ;-))
That's pretty good, but obviously not what you were after.
I tried using the
Could you have that proactive signin area forward to a page behind
Apache::AuthCookie protection and then have that page forward them right
back to where they were? If you don't have frames, that would be pretty
easy.
-Fran
Ken Miller wrote:
> We currently use Apache::AuthCookie for authentication/authorization
Hi there,
On Mon, 29 Apr 2002, Ernest Lergon wrote:
> I think it's a question of speed: If I hold my data in a hash in
> memory, access should be faster than using any kind of external
> database.
Test it. You may be surprised what the OS will do for you by caching.
> What makes me wonder is
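One quick way to act on the "test it" advice is Benchmark's cmpthese; the
DSN, table, and data below are invented, so substitute whatever external
database you would actually compare against:

use strict;
use warnings;
use Benchmark qw(cmpthese);
use DBI;

# Hypothetical database holding the same 14000 records.
my $dbh = DBI->connect('dbi:SQLite:dbname=records.db', '', '',
                       { RaiseError => 1 });
my $sth = $dbh->prepare('SELECT * FROM records WHERE id = ?');

# The in-memory version of the same data.
my %records = map { $_ => "record $_" } 1 .. 14_000;

cmpthese(-2, {     # run each variant for about 2 CPU seconds
    memory => sub { my $r = $records{ 1 + int rand 14_000 } },
    dbi    => sub {
        $sth->execute(1 + int rand 14_000);
        my $r = $sth->fetchrow_arrayref;
    },
});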
I have a problem I can't seem to track down, showing up in our logs is:
Out of memory!
Callback called exit.
Typically there are two or three of these right after one another.
Depending on server load, they show up every 15 minutes to an hour.
I followed the guidelines for allocating an emergency mem
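For reference, the emergency-pool idiom looks roughly like the snippet below;
it only does anything if your Perl was built with -DPERL_EMERGENCY_SBRK and
Perl's own malloc (see perlvar on $^M), and the pool size here is just an
example:

# In startup.pl:
# Preallocate a 64KB buffer Perl can fall back on when malloc fails,
# so the "Out of memory!" error can at least be reported cleanly.
$^M = 'a' x (1 << 16);

# Optional: turn fatal errors into full stack traces in the error_log,
# which helps pin down which handler was running when memory ran out.
use Carp ();
$SIG{__DIE__} = \&Carp::confess;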
> Is there any way to have the parent apache process log all
> creations/exits of the children? This way I could set up an access log
> with the PID of each child and then trace back all requests served after
> its death.
Recipe 17.5 in the Cookbook describes how to do this. Basically you can h
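One hedged way to do the logging itself under mod_perl 1.x (the module name
and log format are made up for the example) is a pair of child-init/child-exit
handlers that write the PID to the error_log:

# httpd.conf:
#   PerlChildInitHandler My::ChildLog::child_init
#   PerlChildExitHandler My::ChildLog::child_exit
package My::ChildLog;
use strict;
use Apache::Constants qw(OK);

sub child_init {
    warn "[child $$] started at " . scalar(localtime) . "\n";
    return OK;
}

sub child_exit {
    warn "[child $$] exiting at " . scalar(localtime) . "\n";
    return OK;
}

1;

Adding %P to your LogFormat then puts the serving child's PID in the access
log, so requests can be matched against those start/exit lines.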
At 01:05 29.04.2002, Mike Melillo wrote:
>[Sun Apr 28 18:05:42 2002] [error] [client 192.168.1.100] File does not
>exist: /moc/ticketLogin
>
>
> SetHandler perl-script
>
> PerlAccessHandler Apache::TicketMaster
You've misconfigured it. Look at the Eagle again; it says "PerlHandler".
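A hedged guess at what the corrected configuration should look like (the
path comes from the error message above; the Eagle book uses the ticket
module as a content handler for the login URI):

<Location /moc/ticketLogin>
    SetHandler  perl-script
    PerlHandler Apache::TicketMaster
</Location>

The pages you are protecting then get the PerlAccessHandler, using the
ticket-checking module (Apache::TicketAccess in the book), not
Apache::TicketMaster itself.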
Perrin Harkins wrote:
>
> [snip]
>
> Incidentally, that map statement in your script isn't doing
> anything that I can see.
>
It simulates different values for each record - e.g.:
$line = "\t\t1000\t10.99";
@record = split "\t", $line;
for ( $i = 0; $i < 14000; $i++ )
{
map { $
At 18:10 29.04.2002, Paul Dlug wrote:
>I have a problem I can't seem to track down, showing up in our logs is:
>Out of memory!
>Callback called exit.
I don't know if it'll be of any help, but you might want to look in the
guide:
http://perl.apache.org/preview/modperl-docs/dst_html/docs/1.0/guid
On Mon, 29 Apr 2002, F. Xavier Noria wrote:
> 3. Could one set up things in a way that allows the database to see
>the timestamps and program a trigger to delete old sessions? Or
>is there a standard idiom for doing this in a different way?
>
That's what I usually do ..
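For example, a small cron-driven cleanup is usually enough; the table and
column names here are invented, so adapt them to however your sessions are
stored:

use strict;
use warnings;
use DBI;

# Hypothetical sessions table with a last_accessed timestamp column.
my $dbh = DBI->connect('dbi:mysql:database=app', 'user', 'password',
                       { RaiseError => 1 });

# Throw away sessions that have been idle for more than 24 hours.
$dbh->do(q{
    DELETE FROM sessions
    WHERE last_accessed < NOW() - INTERVAL 24 HOUR
});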
Ernest Lergon wrote:
> Hi,
>
> thank you all for your hints, BUT (with capital letters ;-)
>
> I think it's a question of speed: If I hold my data in a hash in
> memory, access should be faster than using any kind of external
> database.
>
> What makes me wonder is the extremely blown up size
> Interesting ... not sure if implementing this in this fashion would be
> worth the overhead. If such a need exists I would imagine they would have
> chosen a more appropriate OS-level solution. Think OpenAFS.
It is always nice to use stuff that has IBM backing and likely has at least
> I would think it could be useful in non-mod_perl applications as well
> - you give an example of a user's mailbox. With scp it might be even
> more fun to have around :) (/me is thinking of config files and
> such)
mod_perl works very well with the system for keeping track of what boxes are d
This is OT for mod_perl, sorry...
* Cahill, Earl <[EMAIL PROTECTED]> [2002-04-29 13:55]:
> > Our NIS maps are on the order
> > of 3 GB per file (>64k users).
>
> Man, that is one big file. Guess dropping a note to this list sorta
> lets you know what you have to really scale to. Sounds like di
unsubscribe [EMAIL PROTECTED]
unsubscribe [EMAIL PROTECTED]
> But I will need a thread that processes the backend stuff, such as
> maintaining the database and message queue (more like a cron). Is
> this configuration possible?
You can do this now. We rely on cron to kick off the job, but all
the business logic is in Apache/mod_perl. The advantage of us
>
> You can do this now. We rely on cron to kick off the job, but all
> the business logic is in Apache/mod_perl.
How do you use cron to do scheduling, yet "call" Apache/mod_perl to
do the processing?
Considering cron does not exist in Win32, maybe an all-Apache solution
will be simpler and more elegant!?
Lihn, Steve wrote:
> How do you use cron to do scheduling, yet "call" Apache/mod_perl to
> do the processing?
Your cron script just uses LWP to call a module running in mod_perl.
> Considering cron does not exist in Win32, maybe an all-Apache solution
> will be simpler and more elegant!?
Cron doe
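In case a concrete picture helps, here is a bare-bones sketch of the
cron-plus-LWP pattern Perrin describes; the URL, timeout, and crontab entry
are invented for the example:

#!/usr/bin/perl
# Kicked off by cron, e.g.:  0 2 * * * /usr/local/bin/kick_nightly_job.pl
# The real business logic lives in a mod_perl handler behind the URL.
use strict;
use warnings;
use LWP::UserAgent;

my $ua  = LWP::UserAgent->new(timeout => 600);
my $res = $ua->get('http://localhost/jobs/nightly');

die "nightly job failed: " . $res->status_line unless $res->is_success;

On Win32 the same client script can be driven by the Task Scheduler or the
"at" command instead of cron.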
Hi,
The Apache 2 Connection handler opens up the possibility of
using it for all kinds of protocol servers.
However, I have a wild question: is it possible to use Apache mod_perl
for a schedule server, i.e. a server that runs on its own?
For example, I can use Apache 2 for Telnet, FTP, SMTP, o
I'm really lost with this...
I'm trying to set a session cookie from PerlAccessHandler. I'm basically
doing (simplified code):
my $cookie_jar = Apache::Cookie->fetch;
my $session_id = $cookie_jar->{ session }->value;
if( !session_active( $session_id ) ) {
my $session_obj = create_ses
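For comparison, here is a fuller (hypothetical) version of that access
handler; session_active() and create_session() stand in for the poster's own
session code, and the rest is the usual Apache::Cookie calls:

package My::SessionAccess;
use strict;
use Apache::Constants qw(OK);
use Apache::Cookie ();

sub handler {
    my $r = shift;

    my $cookie_jar = Apache::Cookie->fetch;
    my $session_id = $cookie_jar->{session}
                   ? $cookie_jar->{session}->value
                   : undef;

    # session_active() and create_session() come from the poster's own code.
    unless ($session_id && session_active($session_id)) {
        $session_id = create_session();
        Apache::Cookie->new($r,
            -name  => 'session',
            -value => $session_id,
            -path  => '/',
        )->bake;
    }

    return OK;
}

1;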
Hmm, I still can't get it to work, but it somehow works under LWP. The
following code actually gets the cookie correctly, and no bogus sessions
are created on my server. Any ideas?
use strict;
use LWP::UserAgent;
my $ua = LWP::UserAgent->new();
$ua->cookie_jar({ file => "$ENV{ HOME }/cook
Hi Perrin,
first of all, please excuse my late answer - lots of things on my mind to
take care of, as I'm hopefully close to releasing the 0.2 version of the
serverpage implementation (and besides, I urgently need a new job, too).
But thank you for your precise statement; that is exactly what I needed.
Hi All,
I am trying to port a mod_cgi script to a mod_perl script because
the mod_cgi script doesn't run correctly under mod_perl.
When running under Apache::Registry, the script shows wrong results.
When running under Apache::PerlRun, the script sometimes crashes.
The error_log shows:
"can't locate obj