OK, I've got the standard answer.
Suppose the first k jobs have been dispatched, and machine A has run for
t_A minutes. Denote by T_B = T(k, t_A) the minimal time that machine B has
run; then we have:

T(k, t_A) = min( T(k-1, t_A) + b_k , T(k-1, t_A - a_k) )

where the first term, T(k-1, t_A) + b_k, corresponds to dispatching the
k-th job to machine B, and the second, T(k-1, t_A - a_k), to dispatching
it to machine A. The minimal finishing time over all jobs is then
min over t_A of max( t_A , T(n, t_A) ).
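As a sketch, the recurrence can be implemented with a rolling table indexed by t_A (the function name and test values below are mine, not from the thread):

```python
def min_makespan(a, b):
    """a[i]: time of job i on machine A; b[i]: time on machine B.
    Returns the minimal time until both machines are done."""
    INF = float('inf')
    total_a = sum(a)
    # T[t] = minimal running time of machine B, given machine A runs t minutes
    T = [INF] * (total_a + 1)
    T[0] = 0
    for k in range(len(a)):
        new = [INF] * (total_a + 1)
        for t in range(total_a + 1):
            if T[t] == INF:
                continue
            # dispatch job k to machine B: B's time grows by b[k]
            new[t] = min(new[t], T[t] + b[k])
            # dispatch job k to machine A: A's time grows by a[k]
            if t + a[k] <= total_a:
                new[t + a[k]] = min(new[t + a[k]], T[t])
        T = new
    # makespan = the later of the two machines, minimized over t_A
    return min(max(t, T[t]) for t in range(total_a + 1) if T[t] < INF)
```

For example, with a = [2, 3] and b = [2, 3], sending one job to each machine gives makespan 3. The table has O(n * sum(a)) entries, so this is pseudo-polynomial; the optimal schedule itself can be recovered by remembering which choice produced each entry.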

On Oct 26, 8:21 pm, ziyuang <[email protected]> wrote:
> 2 machines, called A and B, and n jobs. The i-th job costs machine A
> a_i minutes, or costs machine B b_i minutes. Some jobs are sent to
> machine A, and the others to machine B. These 2 machines work in
> parallel. They start at the same time, and process jobs one by one.
> Now how do we determine the minimal time for all jobs and the optimal
> schedule using dynamic programming?
>
> Thanks all.
