Greedy algorithm to minimize the maximum lateness when scheduling jobs on a single processor
- The processor can process one job at a time
- Job j requires t_j units of processing time and is due at time d_j
- If job j starts at time s(j), it finishes at time f(j) = s(j) + t_j
- Lateness of job j = max{0, f(j) − d_j}
- Goal: schedule all jobs to minimize the maximum lateness
Greedy rule: consider jobs in order of Earliest Deadline First (EDF)
Running time: sorting O(n log(n)) + a single for-loop Θ(n) = O(n log(n))
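The EDF greedy above can be sketched in Python as follows. This is a minimal illustration, not a definitive implementation; the function name and job representation (a list of (t_j, d_j) pairs) are assumptions for the example.

```python
def min_max_lateness(jobs):
    """Greedy EDF schedule: process jobs in order of increasing deadline.

    jobs: list of (t_j, d_j) pairs (processing time, deadline).
    Returns (schedule, max_lateness), where schedule is a list of
    (job_index, start, finish) tuples.
    """
    # Sort job indices by deadline d_j: O(n log n)
    order = sorted(range(len(jobs)), key=lambda j: jobs[j][1])

    schedule = []
    t = 0          # current time on the processor
    max_late = 0   # maximum lateness seen so far
    for j in order:                 # single pass: Theta(n)
        t_j, d_j = jobs[j]
        s, f = t, t + t_j           # s(j) = t, f(j) = s(j) + t_j
        schedule.append((j, s, f))
        max_late = max(max_late, f - d_j)  # lateness = max{0, f(j) - d_j}
        t = f                       # next job starts when this one finishes
    return schedule, max_late
```

For example, on the six jobs `[(3, 6), (2, 8), (1, 9), (4, 9), (3, 14), (2, 15)]` the greedy schedule runs the jobs back-to-back in deadline order and achieves a maximum lateness of 1 (only the job with t = 4, d = 9 finishes late, at time 10).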