Jason,

   I am using Parallel::ForkManager for a similar project.  It works great
except that you cannot call fork() from within a child process through the
module; I had to use fork() directly for that.  But other than that it
works great.
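
A minimal sketch of the pattern I mean (assumes Parallel::ForkManager is
installed; the item names are made up):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Parallel::ForkManager;   # assumes the CPAN module is installed

# Cap the number of concurrent children at 5.
my $pm = Parallel::ForkManager->new(5);

for my $item ('host1', 'host2', 'host3') {
    # In the parent, start() returns the child's pid (true), so the
    # parent skips the child code and moves on to the next item.
    $pm->start and next;

    # --- child code: do NOT call $pm->start (or fork) again in here ---
    print "checking $item in pid $$\n";

    $pm->finish;             # child exits here
}

$pm->wait_all_children;      # parent waits for all children to finish
```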

chad  

On 19 Sep 2002 10:48:47 -0400
Jason Frisvold <[EMAIL PROTECTED]> wrote:

> Hrm ...  So, I could create an array of parameters to send, plus the
> hash for the data and the forked process would have this information
> available?  Now what happens if the main program modifies these
> variables?  Is the fork in a different memory space?
> 
> Do the forked processes start from the beginning or continue on from
> that point?
> 
> Yes, I'm headed over to look up fork now ..  :)
> 
> perldoc -f fork .. right?
> 
> Friz
> 
> On Thu, 2002-09-19 at 10:36, Bob Showalter wrote:
> > > -----Original Message-----
> > > From: Jason Frisvold [mailto:[EMAIL PROTECTED]]
> > > Sent: Thursday, September 19, 2002 9:43 AM
> > > To: [EMAIL PROTECTED]
> > > Subject: Forking, Passing Parameters to forks
> > > 
> > > 
> > > Greetings,
> > > 
> > >   I'm in the process of writing a large network monitoring system in
> > > perl.  I want to be sure I'm headed in the right direction, however.
> > > 
> > >   I have a large MySQL database comprised of all the items that need
> > > monitoring.  Each individual row contains exactly one monitoring type
> > > (although I would love to be able to combine these efficiently).
> > > 
> > >   One of the tables will contain the individual monitoring types and
> > > the name of the program that processes them.  I'd like to have a
> > > centralized system that deals with spawning off these processes and
> > > monitoring them to ensure they are running correctly.  I'm looking to
> > > spawn each process with the information it needs to process instead
> > > of having it contact the database and retrieve it on its own.  This
> > > is where I'm stuck.  The data it needs to process can be fairly
> > > large, and I'd rather not drop back to creating an external text file
> > > with all the data.  Is there a way to push a block of memory from one
> > > process to another?  Or some other efficient way to give the new
> > > process the data it needs?  Part of the main program will be a
> > > throttling system that breaks the data down into bite-size chunks
> > > based on processor usage, running time, and memory usage.  So, in
> > > order to properly throttle the processes, I need to be able to pass
> > > some command line parameters in addition to the data chunk...
> > > 
> > >   Has anyone attempted anything like this?  Do I have a snowball's
> > > chance?  :)
> > 
> > fork() creates a copy of a process. The new process is an exact copy
> > of the original process (except for a few items, see your fork(2)
> > manpage), including all the variables.
> > 
> > So you don't need to "pass" anything. If a variable like @data
> > contains the data to be processed, then after the fork, the child
> > will have a copy of @data to work with.
> -- 
> ---------------------------
> Jason 'XenoPhage' Frisvold
> Senior ATM Engineer
> Penteledata Engineering
> [EMAIL PROTECTED]
> RedHat Certified - RHCE # 807302349405893
> ---------------------------
> "Something mysterious is formed, born in the silent void. Waiting
> alone and unmoving, it is at once still and yet in constant motion. It
> is the source of all programs. I do not know its name, so I will call
> it the Tao of Programming."
> 
> 
> -- 
> To unsubscribe, e-mail: [EMAIL PROTECTED]
> For additional commands, e-mail: [EMAIL PROTECTED]
> 
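
A small sketch tying Bob's answer to the questions above: the child does
not restart the program from the top -- it continues from the point where
fork() returns -- and it gets its own copy of any variables such as @data,
so changes in the child do not affect the parent.  The array contents here
are made up for illustration:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my @data = ('router1', 'switch2', 'server3');   # made-up items to monitor

pipe(my $reader, my $writer) or die "pipe: $!";

print "this line runs once, before the fork\n";

my $pid = fork();
die "fork failed: $!" unless defined $pid;

if ($pid == 0) {
    # fork() returned 0: we are the child, continuing from this point,
    # not from the beginning of the program.
    close $reader;
    push @data, 'child-only';            # modifies only the child's copy
    print {$writer} scalar(@data), "\n"; # report the count back via pipe
    exit 0;
}

# fork() returned the child's pid: we are the parent.
close $writer;
chomp(my $child_count = <$reader>);
waitpid($pid, 0);

# The child's push is invisible here: the parent still has 3 items.
print "child saw $child_count items; parent still has ",
      scalar(@data), "\n";
```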
