Jk−1 and k must contain a k-optimal set. If set Jk contains the entire set Jk−1
plus job k, then it must clearly be k-optimal since Jk−1 is (k − 1)-optimal. If
set Jk−1 combined with job k is not feasible, then the k-optimal set must be a
smaller set within the set that contains Jk−1 and k; however, it must contain
at least as many jobs as set Jk−1. Set Jk clearly satisfies this condition. □
Example 3.3.3 (Minimizing Number of Tardy Jobs)
Consider the following 5 jobs.
jobs  1   2   3   4   5
pj    7   8   4   6   6
dj    9  17  18  19  21
Jobs 1 and 2 can be positioned first and second in the sequence with both
jobs being completed on time. Putting job 3 into the third position causes
problems. Its completion time would be 19 while its due date is 18.
Algorithm 3.3.1 prescribes the deletion of the job with the longest processing
time among the first three jobs. Job 2 is therefore deleted and jobs 1 and 3
remain in the first two positions. If now job 4 follows job 3, it is completed on time at
17; however, if job 5 follows job 4, it is completed late. The algorithm then
prescribes deleting the job with the longest processing time among those
already scheduled, which is job 1. So the optimal schedule is 3, 4, 5, 1, 2 with
∑ Uj = 2. ||
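To make the forward procedure concrete, here is a minimal Python sketch of the idea of Algorithm 3.3.1 as walked through in Example 3.3.3: take the jobs in EDD order and, whenever the job just added is completed late, remove the job with the longest processing time among those currently scheduled and append it at the end. The function name and data layout are illustrative choices, not taken from the book.

import heapq

def min_tardy_jobs(jobs):
    """Forward procedure for 1 || sum U_j.

    jobs: list of (job_id, p_j, d_j).
    Returns (sequence, number_of_tardy_jobs).
    """
    # Process jobs in EDD (earliest due date) order.
    edd = sorted(jobs, key=lambda j: j[2])
    scheduled = []   # max-heap on p_j (stored negated) of currently on-time jobs
    rejected = []    # jobs that will be appended, late, at the end
    t = 0            # completion time of the current partial schedule
    for job_id, p, d in edd:
        heapq.heappush(scheduled, (-p, job_id, p, d))
        t += p
        if t > d:    # the job just added finishes late: drop the longest job so far
            _, rid, rp, rd = heapq.heappop(scheduled)
            rejected.append((rid, rp, rd))
            t -= rp
    on_time = sorted(((jid, p, d) for _, jid, p, d in scheduled), key=lambda j: j[2])
    sequence = [j[0] for j in on_time] + [j[0] for j in rejected]
    return sequence, len(rejected)

# Data of Example 3.3.3:
print(min_tardy_jobs([(1, 7, 9), (2, 8, 17), (3, 4, 18), (4, 6, 19), (5, 6, 21)]))

On the data of Example 3.3.3 this returns ([3, 4, 5, 2, 1], 2); since the rejected jobs may be appended at the end in any order, this is equivalent to the schedule 3, 4, 5, 1, 2 found above.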
Note that Algorithm 3.3.1 is an algorithm that goes forward in time. For this
problem there is not any algorithm that goes backward in time. Note also that
there may be many optimal schedules; characterizing the class of all optimal
schedules seems to be a very difficult problem.
The generalization of this problem with weights, i.e., 1 || ∑ wjUj, is known to
be NP-hard (see Appendix D). The special case with all due dates being equal
is equivalent to the so-called knapsack problem. The due date is equivalent to
the size of the knapsack, the processing times of the jobs are equivalent to the
sizes of the items and the weights are equivalent to the benefits obtained by
putting the items into the knapsack. A popular heuristic for this problem is the
WSPT rule, which sequences the jobs in decreasing order of wj/pj. A worst case
analysis shows that this heuristic may perform arbitrarily badly, i.e., that the ratio

∑ wjUj(WSPT) / ∑ wjUj(OPT)

may be arbitrarily large.
Example 3.3.4 (The WSPT Rule and a Knapsack)
Consider the following three jobs.
jobs    1    2    3
pj     11    9   90
wj     12    9   89
dj    100  100  100
Scheduling the jobs according to WSPT results in the schedule 1, 2, 3. The
third job is completed late and ∑ wjUj(WSPT) is 89. Scheduling the jobs
according to 2, 3, 1 results in ∑ wjUj(OPT) being equal to 12. ||
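The numbers in Example 3.3.4 are easy to verify with a short sketch, which also shows the knapsack view of the equal-due-date case: the on-time jobs are the items packed into a knapsack whose size is the common due date. The function names and data layout below are illustrative assumptions.

def weighted_tardy(jobs, order):
    """Sum of w_j U_j for a given sequence; jobs maps id -> (p_j, w_j, d_j)."""
    t, total = 0, 0
    for j in order:
        p, w, d = jobs[j]
        t += p
        if t > d:
            total += w
    return total

def max_on_time_weight(jobs, due_date):
    """Equal-due-date case viewed as a 0-1 knapsack: capacity = common due date,
    item sizes = p_j, benefits = w_j.  Returns the maximum total on-time weight."""
    dp = [0] * (due_date + 1)
    for p, w, _ in jobs.values():
        for c in range(due_date, p - 1, -1):
            dp[c] = max(dp[c], dp[c - p] + w)
    return dp[due_date]

jobs = {1: (11, 12, 100), 2: (9, 9, 100), 3: (90, 89, 100)}
# WSPT: decreasing w_j / p_j gives the sequence 1, 2, 3
wspt = sorted(jobs, key=lambda j: jobs[j][1] / jobs[j][0], reverse=True)
print(wspt, weighted_tardy(jobs, wspt))      # [1, 2, 3] 89
print(weighted_tardy(jobs, [2, 3, 1]))       # 12
print(sum(w for _, w, _ in jobs.values()) - max_on_time_weight(jobs, 100))  # 12

The third print confirms via the knapsack formulation that 12 is indeed the optimum for this instance, not just the value of one alternative sequence.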
3.4 The Total Tardiness - Dynamic Programming
The objective ∑ Tj is one that is important in practice as well. Minimizing the
number of tardy jobs, ∑ Uj, in practice cannot be the only objective to measure
how due dates are being met. Some jobs may have to wait for an unacceptably
long time if the number of late jobs is minimized. If instead the sum of the
tardinesses is minimized, it is less likely that the wait of any given job will be
unacceptably long.
The model 1 || ∑ Tj has received an enormous amount of attention in the
literature. For many years its computational complexity remained open, until
its NP-hardness was established in 1990. As 1 || ∑ Tj is NP-hard in the
ordinary sense, it allows for a pseudo-polynomial time algorithm based on dynamic
programming (see Appendix D). The algorithm is based on two preliminary
results.
Lemma 3.4.1. If pj ≤ pk and dj ≤ dk, then there exists an optimal
sequence in which job j is scheduled before job k.
Proof. The proof of this result is left as an exercise. □
This type of result is useful when an algorithm has to be developed for
a problem that is NP-hard. Such a result, often referred to as a Dominance
Result or Elimination Criterion, allows one to disregard a fairly large
number of sequences. Such a dominance result may also be thought of as a set
of precedence constraints on the jobs. The more precedence constraints created
through such dominance results, the easier the problem becomes.
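As an illustration of how such a dominance result translates into precedence constraints, the sketch below lists, for a given set of jobs, every pair (j, k) for which Lemma 3.4.1 guarantees that some optimal sequence places j before k; ties in both pj and dj are broken by job index so that no cyclic constraints are generated. The function name and data layout are illustrative, and the sample data simply reuses the jobs of Example 3.3.3.

def dominance_precedences(jobs):
    """Precedence pairs implied by Lemma 3.4.1: if p_j <= p_k and d_j <= d_k,
    some optimal sequence schedules j before k.  jobs maps id -> (p_j, d_j)."""
    prec = []
    for j, (pj, dj) in jobs.items():
        for k, (pk, dk) in jobs.items():
            if j == k:
                continue
            if (pj, dj) == (pk, dk):
                if j < k:           # tie: fix one of the two orders
                    prec.append((j, k))
            elif pj <= pk and dj <= dk:
                prec.append((j, k))
    return prec

# Jobs of Example 3.3.3: yields [(1, 2), (3, 4), (3, 5), (4, 5)], i.e. one may
# restrict attention to sequences with 1 before 2, 3 before 4 and 5, 4 before 5.
print(dominance_precedences({1: (7, 9), 2: (8, 17), 3: (4, 18), 4: (6, 19), 5: (6, 21)}))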
In the following lemma the sensitivity of an optimal sequence to the due
dates is considered. Two problem instances are considered, both of which
have n jobs with processing times p1, . . . , pn. The first instance has due dates
d1, . . . , dn. Let C′k be the latest possible completion time of job k in any
optimal sequence, say S′, for this instance. The second instance has due dates
d1, . . . , dk−1, max(dk, C′k), dk+1, . . . , dn. Let S′′ denote the optimal sequence
with respect to this second set of due dates and C′′j the completion time of job j
under this second sequence.
Lemma 3.4.2. Any sequence that is optimal for the second instance is also
optimal for the first instance.
Proof. Let V′(S) denote the total tardiness under an arbitrary sequence S with
respect to the first set of due dates and let V′′(S) denote the total tardiness
under sequence S with respect to the second set of due dates. Now

V′(S′) = V′′(S′) + Ak

and

V′(S′′) = V′′(S′′) + Bk,

where, if C′k ≤ dk, the two sets of due dates are the same and the sequence
that is optimal for the second set is therefore also optimal for the first set. If
C′k ≥ dk, then

Ak = C′k − dk

and

Bk = max(0, min(C′′k, C′k) − dk).

It is clear that Ak ≥ Bk. As S′′ is optimal for the second instance, V′′(S′) ≥
V′′(S′′). Therefore V′(S′) ≥ V′(S′′), which completes the proof. □
In the remainder of this section it is assumed for purposes of exposition
that (without loss of generality) all processing times are different, if
necessary after an infinitesimal perturbation. Assume that d1 ≤ · · · ≤ dn and
pk = max(p1, . . . , pn). That is, the job with the kth smallest due date has
the longest processing time. From Lemma 3.4.1 it follows that there exists an
optimal sequence in which jobs {1, . . . , k − 1} all appear, in some order, before
job k. Of the remaining n − k jobs, i.e., jobs {k + 1, . . . , n}, some may appear
before job k and some may appear after job k. The subsequent lemma focuses
on these n − k jobs.
Lemma 3.4.3. There exists an integer δ, 0 ≤ δ ≤ n − k, such that there
is an optimal sequence S in which job k is preceded by all jobs j with j ≤ k + δ
and followed by all jobs j with j > k + δ.
Proof. Let C′k denote the latest possible completion time of job k in any
sequence that is optimal with respect to the given due dates d1, . . . , dn. Let S′′ be a
sequence that is optimal with respect to the due dates d1, . . . , dk−1, max(C′k, dk),
dk+1, . . . , dn and that satisfies the condition stated in Lemma 3.4.1. Let C′′k
denote the completion time of job k under this sequence. By Lemma 3.4.2
sequence S′′ is also optimal with respect to the original due dates. This implies
that C′′k ≤ max(C′k, dk). One can assume that job k is not preceded in S′′ by
a job with a due date later than max(C′k, dk) (if this were the case, this job
would be on time and repositioning it by inserting it immediately after job k
would not increase the objective function). Also, job k has to be preceded by
all jobs with a due date earlier than max(C′k, dk) (otherwise Lemma 3.4.1
would be violated). So δ can be chosen to be the largest integer such that
dk+δ ≤ max(C′k, dk). This completes the proof. □
In the dynamic programming algorithm a subroutine is required that generates
an optimal schedule for the set of jobs 1, . . . , l starting with the processing
of this set at time t. Let k be the job with the longest processing time among
these l jobs. From Lemma 3.4.3 it follows that for some δ (0 ≤ δ ≤ l − k) there
exists an optimal sequence starting at t which may be regarded as a
concatenation of three subsets of jobs, namely
(i) jobs 1, 2, . . . , k − 1, k + 1, . . . , k + δ in some order, followed by
(ii) job k, followed by
(iii) jobs k + δ + 1, k + δ + 2, . . . , l in some order.
The completion time of job k, Ck(δ), is given by

Ck(δ) = t + p1 + p2 + · · · + pk+δ.
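A small sketch of this decomposition, assuming (as above) that the l jobs are indexed in EDD order; the function name is illustrative and the sample data simply reuses the processing times of Example 3.3.3.

def decomposition(p, t, l, delta):
    """For jobs 1, ..., l (indexed in EDD order) started at time t, let k be
    the job with the longest processing time among them.  Lemma 3.4.3 states
    that for some delta there is an optimal sequence consisting of the jobs
    1, ..., k+delta other than k in some order, then job k, then the jobs
    k+delta+1, ..., l in some order.  Returns those two subsets together with
    C_k(delta) = t + p_1 + ... + p_{k+delta}.  p maps job index -> processing time."""
    k = max(range(1, l + 1), key=lambda j: p[j])
    before = [j for j in range(1, k + delta + 1) if j != k]
    after = list(range(k + delta + 1, l + 1))
    c_k = t + sum(p[j] for j in range(1, k + delta + 1))
    return before, after, c_k

# Processing times of Example 3.3.3, start time t = 0, all l = 5 jobs, delta = 1:
# job 2 has the longest processing time, so the split is [1, 3] / job 2 / [4, 5]
# and C_2(1) = 0 + 7 + 8 + 4 = 19.
print(decomposition({1: 7, 2: 8, 3: 4, 4: 6, 5: 6}, 0, 5, 1))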