On 8/23/06, Stephen Carville <[EMAIL PROTECTED]> wrote:

joe bayer wrote:
> Group,
>
> I am trying to write a load testing script.
>
> The script goes like this:
> ++++++++++++++++++++++++++++++++++
> my $j = 0;
> while ($j < 300) {
>     $dbh[$j] = DBI->connect (
>         "dbi:Oracle:$instance[$i]", "$username[$i]", "$passwd[$i]",
>         { PrintError => 1, RaiseError => 1, AutoCommit => 1 }
>     ) || die "Database Connection not made $DBI::errstr";
>     # do some random, endless select statement here.
>     $j++;
> }
> ++++++++++++++++++++++++++++++++++++++++++
>
> What I want is 300 sessions doing the select statement simultaneously,
> but this script runs one session after another.
>
> Do I REALLY have to start 300 Perl scripts to do this testing, or is
> there some way in Perl that one script can start up 300 sessions and
> run their individual selects simultaneously?

Check out Parallel::ForkManager.
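For example, here is a minimal sketch of the original loop rewritten with
Parallel::ForkManager. The $instance, $username, and $passwd values are
dummy placeholders standing in for the arrays in the original post, and
the DSN format is assumed to match what the OP already uses:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
use Parallel::ForkManager;

# Dummy placeholder credentials; substitute the OP's real values.
my ($instance, $username, $passwd) = ('mydb', 'scott', 'tiger');

my $pm = Parallel::ForkManager->new(300);   # allow up to 300 children at once

for my $j (0 .. 299) {
    $pm->start and next;    # parent keeps looping; child falls through

    # Each child makes its own connection -- DBI handles should not
    # be shared across a fork.
    my $dbh = DBI->connect(
        "dbi:Oracle:$instance", $username, $passwd,
        { PrintError => 1, RaiseError => 1, AutoCommit => 1 },
    ) or die "Database Connection not made $DBI::errstr";

    # ...do some random, endless select statement here...

    $dbh->disconnect;
    $pm->finish;            # child exits
}

$pm->wait_all_children;     # parent waits for all 300 sessions
```

The forking means all 300 connections are opened and queried by
concurrent child processes rather than one after another.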

> Thanks for your help.
>
> Joe
>



Hi Stephen,

Unless I'm missing something (I'm no expert in this arena), it seems like
a script will execute one statement at a time anyway, so how about
cranking up 300 separate Perl scripts that synchronize (i.e., soak up all
available system resources simultaneously) via a named semaphore?

(Win)
   $sem = Win32::Semaphore->new($initial, $maximum, $name);

(Unix)
   $sem = IPC::Semaphore->new(IPC_PRIVATE, 10, S_IRWXU | IPC_CREAT);

I would envision you building a 300-line launcher script to start up each
individual DB-connect process, and a single Perl script to "lower the
flag" - causing the 300 Perl scripts to pounce at once.
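A rough sketch of that flag pattern with SysV semaphores on the Unix
side (the 0xBEEF key is a made-up value every process must agree on,
and the worker body is the OP's own connect-and-select):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use IPC::SysV qw(IPC_CREAT S_IRWXU);
use IPC::Semaphore;

my $key = 0xBEEF;    # made-up shared key; all processes must use the same one

if (@ARGV && $ARGV[0] eq 'controller') {
    # Controller: create the semaphore with value 0 (flag raised).
    my $sem = IPC::Semaphore->new($key, 1, S_IRWXU | IPC_CREAT)
        or die "semget: $!";
    $sem->setval(0, 0);
    print "Press Enter once all 300 workers are waiting...\n";
    <STDIN>;
    $sem->op(0, 300, 0);    # lower the flag: wake all 300 waiters at once
}
else {
    # Worker: attach to the existing semaphore and block on it.
    my $sem = IPC::Semaphore->new($key, 1, S_IRWXU)
        or die "semget: $!";
    $sem->op(0, -1, 0);     # blocks until the controller adds 300
    # ...connect with DBI and run the select here, as in the original post...
}
```

Each worker blocks on the decrement until the controller bumps the
semaphore by 300, so all 300 scripts hit the database at the same moment.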

It seems like this is a much better test anyway, because I very much
doubt a *single* Perl script would hold 300 separate DB connections...
but then again, I don't know what your environment needs.

HTH

KC
