On Mon, 8 Oct 2012 07:08:09 -0700
Bill Moseley <mose...@hank.org> wrote:

> I'm looking for some design ideas, and I'll try and briefly describe
> it. I suspect this is a Role vs. subclass question.
> 
> I have a job processing system where a set of one or more jobs are
> processed.   At the top I have a small class that has an ArrayRef of
> jobs and a method to "submit" the jobs for processing once they have
> all been gathered.  The jobs are processed independently, but they
> need to be submitted together.  (They are submitted to a pool of
> worker machines.)
> 
> It's the individual job class I have a question about.
> 
> A job has about five distinct stages of processing and uses a
> DBIx::Class row as a storage backend to maintain state.   The first
> stage is before the job has been "submitted", when it doesn't have
> a job_id yet.   A notification message is sent when each job has completed a
> stage.   When I receive this message I need to recreate a job object,
> inspect the results of processing and then call a method to start the
> next stage of processing.
> 
> There are some common attributes -- after being submitted every job
> has a unique job_id, and they all have a DBIx::Class row object.  They
> also have connections to backend processing servers, etc.  But each
> stage should have its own set of methods for dealing with just that
> one stage.
> 
> One approach is to create a base class with common methods and then
> create individual classes for each stage.   So, when I get a message
> that Stage #2 has completed I might inspect the message to decide on
> the stage completed and then:
> 
>    my $class = 'Job::State::' . $message->{stage};  # stage1, stage2
>    my $job = $class->new;
> 
> Another approach is to have just a single class ( my $job = Job->new
> ) and then based upon state indicated by the message apply a role to
> that job instance that has the methods needed for handling the
> transition from one state to the next.
> 
> I think what I'd like best to do is:
> 
>    my $job = Job->new( $message );
> 
> and have the $job essentially "know" what state it is in and only
> support methods for that state.
> 
> 
> Anyone have any guidance on how to structure classes like these?

Having built a similar system in the past, I did it with a
role acting as an abstract base class. Each type of job consumed
that role. As for stages, I opted for a simpler approach than a
subclass for each. Instead, I just had a queue of coderefs to
execute, with reporting on which step failed and why. The
responsibility of appending to the queue was left to the consuming
class in init_job() (which was executed on the other side of the
process boundary to avoid trying to transport db handles, etc.).
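To make that concrete, here's a minimal plain-Perl sketch of the
coderef-queue idea (the class and method names -- init_job(),
add_stage(), TranscodeJob -- are illustrative, not from any real
module):

```perl
use strict;
use warnings;

package Job;

sub new {
    my ($class, %args) = @_;
    my $self = bless { %args, queue => [] }, $class;
    $self->init_job;    # concrete class appends its stage coderefs here
    return $self;
}

sub init_job { }        # overridden by each job type

sub add_stage {
    my ($self, $name, $code) = @_;
    push @{ $self->{queue} }, [ $name, $code ];
}

# Run each queued stage in order; report which one failed and why.
sub run {
    my $self = shift;
    for my $stage (@{ $self->{queue} }) {
        my ($name, $code) = @$stage;
        my $ok = eval { $code->($self); 1 };
        return { ok => 0, failed_stage => $name, error => $@ }
            unless $ok;
    }
    return { ok => 1 };
}

package TranscodeJob;    # hypothetical job type
our @ISA = ('Job');

sub init_job {
    my $self = shift;
    $self->add_stage( fetch  => sub { $_[0]{fetched} = 1 } );
    $self->add_stage( encode => sub { die "no input\n" unless $_[0]{fetched} } );
}

package main;

my $result = TranscodeJob->new->run;    # { ok => 1 }
```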

Looking at your options, having a class per stage and instantiating
them by building the class name from the message just seems hacky.
Applying roles to instances also seems rather ugly.
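For concreteness, the class-per-stage dispatch under discussion looks
roughly like this in plain Perl (the Job::Stage::* package and
handle_completion() are illustrative):

```perl
use strict;
use warnings;

# One package per stage; the message's stage name selects the class.
package Job::Stage::stage2;

sub new {
    my ($class, %args) = @_;
    return bless { %args }, $class;
}

sub handle_completion { return 'start stage3' }

package main;

my $message = { stage => 'stage2', job_id => 42 };
my $class   = 'Job::Stage::' . $message->{stage};
my $job     = $class->new( job_id => $message->{job_id} );
my $next    = $job->handle_completion;    # 'start stage3'
```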

Overall, I think you need to have strict state machines in each of your
jobs. Your jobs should have predefined stages that you know ahead of
time and use Bread::Board to wire everything together (e.g. Job1 depends
on Stage1, Stage2, Stage3). So, your Job class aggregates Stage objects,
and your Job Collection aggregates Job objects with each level of
aggregate able to report on its constituents. Then you just need a
simple set of roles to allow loose coupling between each level for
execution and reporting.
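A bare-bones sketch of that aggregation, with plain constructor
injection standing in for the Bread::Board wiring (all class names
here are illustrative):

```perl
use strict;
use warnings;

package Stage;
sub new     { my ($class, %a) = @_; bless { %a, status => 'pending' }, $class }
sub execute { my $self = shift; $self->{status} = 'done' }
sub report  { my $self = shift; return +{ name => $self->{name}, status => $self->{status} } }

package Job;
sub new     { my ($class, %a) = @_; bless {%a}, $class }
sub execute { $_->execute for @{ $_[0]{stages} } }
sub report  {
    my $self = shift;
    return +{
        name   => $self->{name},
        stages => [ map $_->report, @{ $self->{stages} } ],
    };
}

package JobCollection;
sub new     { my ($class, %a) = @_; bless {%a}, $class }
sub execute { $_->execute for @{ $_[0]{jobs} } }
sub report  { return [ map $_->report, @{ $_[0]{jobs} } ] }

package main;

# Each level aggregates the one below and can report on its constituents.
my $job = Job->new(
    name   => 'job1',
    stages => [ map Stage->new( name => "stage$_" ), 1 .. 3 ],
);
my $collection = JobCollection->new( jobs => [$job] );
$collection->execute;
my $report = $collection->report;    # per-job, per-stage statuses
```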



-- 

Nicholas Perez
XMPP/Email: n...@nickandperla.net
https://metacpan.org/author/NPEREZ
http://github.com/nperez
