APPROXIMATION ALGORITHMS FOR SCHEDULING SINGLE BATCH MACHINE WITH INCOMPATIBLE DETERIORATING JOBS

Motivated by the soaking process under the separate heating mode in iron and steel enterprises, we study the parallel batch machine scheduling problem with incompatible deteriorating jobs. The objective is to minimize the makespan. A soaking furnace can be seen as a parallel batch processing machine. To avoid the thermal stress caused by an excessive temperature difference, the ingot must reach an initial temperature before processing. As the waiting time increases, the ingot temperature decreases and the soaking time increases; this property is called deterioration. A setup time is needed between incompatible jobs. We show that if jobs have identical sizes, an optimal solution can be found in O(n log n) time. If jobs have identical processing times, the problem is proved to be NP-hard in the strong sense, and we propose an approximation algorithm whose absolute and asymptotic worst-case ratios are less than 2 and 11/9, respectively. When the jobs have arbitrary sizes and arbitrary processing times, the model is also NP-hard in the strong sense, and we propose an approximation algorithm with absolute and asymptotic worst-case ratios less than 2. All proposed algorithms run in O(n log n) time.


Introduction
Soaking is a typical batch process in iron and steel enterprises. It consumes a large amount of heat, usually accounting for two-thirds of the total energy consumption in the primary rolling zone. Low equipment utilization increases energy loss; it is therefore important to improve the efficiency of the soaking process. Figure 1 shows the soaking process. A soaking furnace generally includes three soaking pits, each of which can process multiple ingots at the same time. The furnace has a separate heating mode and a centralized heating mode. Under the separate heating mode, ingots processed simultaneously in the same soaking pit are treated as a batch. When the temperature of an ingot reaches the rolling temperature, it is taken out and supplied to the rolling mill. To reduce energy consumption, the furnace is filled with as many ingots as possible without exceeding its capacity. To ensure rolling quality, the maximum rolling temperature over all ingots in a soaking batch is usually taken as the discharge temperature of that batch. Ingots in the same batch have the same entry and exit times; the soaking time of every ingot in a batch therefore equals the maximum soaking time of the ingots in that batch.
In the soaking process, which has hot-chain characteristics, when the initial temperature of the soaking pit is much higher than that of the ingot, thermal stress is generated by the large difference between the internal and external temperatures, resulting in surface and internal cracking. An ingot therefore needs a sufficient initial temperature before soaking. The initial temperature is related to the waiting time: a longer wait lowers the ingot temperature and lengthens the soaking time. The property that processing time increases as job waiting time increases is called deterioration. Steel ingots of different materials and types cannot be processed in the same furnace. Jobs that belong to different families and cannot be processed in the same batch are called incompatible jobs. In addition, a setup time is required between incompatible ingots. As the number of changeovers increases, workers tire and need more time to prepare for the next changeover, so the actual setup time of a changeover varies with its position in the schedule.
Based on the soaking process under separate heating mode in iron and steel enterprises, this paper studies the parallel batch machine scheduling problem with incompatible deteriorating job families so as to minimize the makespan.

Literature review
The deterioration effect was first presented in 1988. Gupta and Gupta [1] studied a single machine scheduling problem with deterioration, where the job processing time is a monotonically increasing function of its starting time. Since then, models with time-dependent processing times have been widely studied from various perspectives. Ji et al. [2] considered parallel-machine scheduling with deteriorating jobs and proved that the total completion time minimization problem is polynomially solvable. Gao et al. [3] presented more efficient algorithms for the two-agent scheduling problem on a parallel-batch machine, where jobs have release dates and linearly deteriorating processing times. Tang et al. [4], Yin et al. [5] and Zhang et al. [6] studied the linear deteriorating job scheduling problem in different environments. Liu et al. [7] investigated a specialized two-stage hybrid flow shop scheduling problem with a job-dependent deteriorating effect, in which the actual processing time of a job is an increasing function of its starting time. Pei et al. [8], Li et al. [9] and Ding et al. [10] studied sequence-dependent deteriorating effects under different models.
In most research, incompatibility mainly occurs in planned production, assembly line balancing and batch scheduling. Dauzère-Pérès and Mönch [11] and Li and Chen [12] studied the problem of minimizing the number of tardy jobs with incompatible job families under different constraints. Geng and Yuan [13] presented an algorithm for scheduling family jobs on an unbounded parallel-batching machine. Cheng et al. [14] considered the scheduling of multiple job families on a batching machine and proposed two polynomial-time heuristics. In recent years, more complicated incompatible job family scheduling problems have been studied. Sun et al. [15] gave polynomial-time algorithms for the group scheduling job-dependent due date assignment problem with learning effect and resource allocation. Kramer et al. [16] explored the parallel machine scheduling problem with family setup times and introduced five novel mixed integer linear programs to solve it. Li et al. [17] investigated the scheduling of non-identical jobs from incompatible job families on a batch processing machine, proposed a lower bound and designed heuristics for this NP-hard problem. Alizadeh and Kashan [18] explored the scheduling of a single batch processing machine, where jobs are of different sizes and conflict with each other. Molaee et al. [19] dealt with single machine scheduling with family setup times and random machine breakdowns. Mönch and Roob [20] discussed the parallel batch processing machine scheduling problem with incompatible jobs under an arbitrary regular sum objective and proposed a matheuristic framework to exploit this structure. Abu-Marrul et al. [21] developed an ILS and a GRASP algorithm for a batch scheduling problem with identical parallel machines and non-anticipatory family setup times.
Some researchers considered incompatibility and deterioration simultaneously. Wu and Lee [22] investigated two single-machine group scheduling problems where the group setup time and the job processing time are both increasing functions of their starting times, and proved that the makespan minimization problem remains polynomially solvable. Lee and Lu [23] considered single machine scheduling with deteriorating jobs and setup times. Xu et al. [24] proposed a heuristic algorithm for a single machine group scheduling problem with deterioration. Zhang et al. [25] proposed a position-dependent processing time model for the single-machine group scheduling problem and presented polynomial-time algorithms to solve it.
However, few studies address batch scheduling problems with both deterioration and incompatible job families. Batch machine scheduling problems with these effects are more complex than traditional scheduling problems, yet such problems are widespread in practice. In this paper, the optimization problem of a single batch machine with deterioration and incompatible job families is explored. Specific models are given for the different problems, and effective optimization algorithms are provided for each.
The remainder of this paper is organized as follows. Section 3 describes the notation and our problem. In Section 4, we propose an optimal algorithm for the first model, where s_j = 1. In Section 5, we propose an approximation algorithm for the second model, where p_j = 1, and calculate the absolute and asymptotic worst-case ratios. In Section 6, we consider the general model, present an approximation algorithm and prove that its absolute and asymptotic worst-case ratios are strictly less than 2. In Section 7, we provide managerial insights for decision makers. Finally, in Section 8, we conclude the paper and give directions for future research.

Notations and problem description
The problem under investigation can be described as follows. A set of jobs J = {1, 2, ..., n} needs to be processed, where each job j has a size s_j and a processing time p_j. Jobs are divided into m incompatible families F = {F_1, F_2, ..., F_m}, which means that jobs from different families cannot be processed together. Each family F_i contains n_i jobs, and n_1 + n_2 + ... + n_m = n. The capacity of the single batch processing machine is C, so the total size of the jobs in a batch cannot exceed C. Suppose the jobs within family F_i are formed into k_i batches, and let B_i = {b_i1, b_i2, ..., b_ik_i} denote the batch set of family F_i. Let p_min = min{p_j | j in J} and p_max = max{p_j | j in J}. We define λ = p_min / p_max, which measures the processing time difference among the jobs. Obviously 0 < λ ≤ 1, where λ = 1 means that all jobs have the same processing time and λ approaching 0 means there is a large difference between the processing times of the jobs.
The normal processing time of batch b_ij (i = 1, 2, ..., m; j = 1, 2, ..., k_i) is the largest processing time among the jobs in b_ij. Let p_ij denote the normal processing time of batch b_ij, so p_ij = max{p_u | u in b_ij}. The actual processing time of b_ij is a linear function of its starting time t. Let p^A_ij denote the actual processing time of b_ij; we have

p^A_ij = p_ij + αt,

where α is the deteriorating rate of the batch processing time and 0 < α < 1. All jobs are available at time zero and preemption is not allowed. A setup time s_i is required when the machine switches to processing family F_i. Jobs in the same family are processed consecutively and need no setup time between them. As the number of changeovers increases, workers tire and need more time to prepare for the next changeover, so the actual setup time of a changeover varies with its position in the schedule. The actual setup time s^A_i for a changeover to family F_i is therefore sequence-dependent: it is an increasing function of x with deteriorating rate β, where 0 < β < 1 and x is the processing position of family F_i. The objective is to minimize the makespan C_max. Using the three-field notation of Lai and Lee [26], the three models studied below are denoted ψ1, ψ2 and ψ3.
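The timing recursion implied by this model can be sketched in Python. The linear form p + αt for batch processing follows the description above; since the paper's exact expression for the position-dependent setup is not reproduced in this excerpt, the multiplicative factor (1 + β)^(x−1) used below is a hypothetical placeholder for an increasing growth function.

```python
def batch_completion(start, batch_times, alpha):
    """Process batches consecutively from `start`; a batch with normal
    time p started at time t takes p + alpha * t, so its completion is
    (1 + alpha) * t + p."""
    t = start
    for p in batch_times:
        t = t + p + alpha * t
    return t

def makespan(families, alpha, beta):
    """families: list of (setup_time, [batch normal times]) in processing
    order. The setup growth (1 + beta) ** (position - 1) is a hypothetical
    stand-in for the paper's position-dependent deterioration."""
    t = 0.0
    for position, (setup, batches) in enumerate(families, start=1):
        t += setup * (1 + beta) ** (position - 1)  # position-amplified setup
        t = batch_completion(t, batches, alpha)    # deteriorating batches
    return t
```

With α = β = 0 the recursion reduces to the ordinary sum of setup and processing times, which is a quick sanity check of the model.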
In the above three models, p-batch means that the processing time of a batch equals the longest processing time of the jobs in the batch. One batch machine with capacity C is used to process the jobs, and the objective is to minimize the makespan C_max. In ψ1, jobs have identical sizes and arbitrary processing times; in ψ2, jobs have arbitrary sizes and identical processing times; ψ3 is the general model.
We introduce the definitions of the absolute worst-case ratio and the asymptotic worst-case ratio. For a given instance I and an approximation algorithm A, denote by A(I) and OPT(I) the objective values of the solution obtained by algorithm A and by an optimal algorithm, respectively. Let R_A(I) = A(I) / OPT(I). The absolute worst-case ratio of algorithm A is defined as

R_A = inf{r ≥ 1 | R_A(I) ≤ r for all instances I},

and the asymptotic worst-case ratio as

R^∞_A = lim sup_{OPT(I) → ∞} R_A(I).

In the following, we use π to represent solutions obtained by our algorithms and, for simplicity, an asterisk to mark optimal quantities; for example, π* represents an optimal solution and k* the number of batches in an optimal solution.

Solving problem 𝜓 1
In this section, we study problem ψ1, in which all jobs have arbitrary processing times but the same size s_j = 1. In this case, C jobs can be placed in one batch. We propose Algorithm A1 to solve ψ1.

Algorithm 𝐴 1
Step 1. Sort the jobs within each family in non-increasing order of their processing time.
Step 2. Assign the jobs into batches using the following rule. Put the first C jobs of family F_i into the first batch b_i1, put the next C jobs into the second batch b_i2, and continue until all jobs of the family are assigned, obtaining k_i batches for each family.
Step 3. For each family F_i, order the batches in non-decreasing order of their processing times and then process the batches consecutively.
Step 4. Sort the families in non-increasing order of their setup times s_i, starting with the family with the largest s_i.

Proposition 1. For problem ψ1, the optimal number of batches for family F_i is k*_i = ⌈n_i / C⌉, where ⌈x⌉ denotes the smallest integer greater than or equal to x.
Proposition 1 is easy to obtain and thus the proof is omitted.
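Algorithm A1 can be sketched compactly; the sketch below follows the four steps above under the paper's assumptions (unit sizes, batch time equal to its longest job, larger setups scheduled earlier). The input format, a dict from family id to (setup time, job processing times), is chosen for illustration only.

```python
from math import ceil

def algorithm_a1(families, capacity):
    """Sketch of Algorithm A1 for identical job sizes (s_j = 1).
    families: dict family_id -> (setup_time, [job processing times]).
    Returns (family_id, setup_time, batch times shortest-first) in the
    order the families are processed."""
    schedule = []
    # Step 4: families with larger setup times are processed earlier
    for fid, (setup, jobs) in sorted(families.items(), key=lambda kv: -kv[1][0]):
        jobs = sorted(jobs, reverse=True)           # Step 1: longest first
        n_batches = ceil(len(jobs) / capacity)      # Proposition 1
        # Step 2: consecutive groups of `capacity` jobs; batch time = max job
        batches = [max(jobs[b * capacity:(b + 1) * capacity])
                   for b in range(n_batches)]
        batches.sort()                              # Step 3: non-decreasing
        schedule.append((fid, setup, batches))
    return schedule
```

For example, two families with setup times 5 and 2 and capacity 2 yield the larger-setup family first, its jobs grouped into ⌈3/2⌉ = 2 batches ordered shortest-first.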

Lemma 1.
Step 3 of A1 generates the optimal order of batches for each family.
Proof. By contradiction. First, we prove that the batches of a family should be processed consecutively. Consider a schedule σ_k in which the batches of every family are processed consecutively, and let F_u and F_{u+1} be two adjacent families in σ_k, with F_{u+1} processed after F_u. Let t denote the completion time of the last batch processed before family F_u, and let t_uj denote the starting time of batch b_uj. Now construct a schedule σ'_k that is identical to σ_k except that the batches of family F_{u+1} are inserted directly after batch b_uj for some j < k_u, so that the remaining batches {b_u(j+1), ..., b_uk_u} of family F_u are processed after family F_{u+1}. Suppose, for contradiction, that σ'_k is better than σ_k. Under σ'_k, the batches {b_u(j+1), ..., b_uk_u} have a longer waiting time than under σ_k, and since the actual processing time p^A = p + αt grows with the starting time, their actual processing times become strictly longer; an additional setup is also incurred for returning to family F_u. Hence the makespan of σ'_k exceeds that of σ_k, contradicting the assumption that σ'_k is better. Interrupting a family therefore only lengthens the schedule, so the batches of a family should be processed consecutively to minimize the makespan.

Next we prove that it is optimal to arrange the batches of every family in non-decreasing order of their processing times. For an arbitrary family F_i, suppose an optimal schedule σ_k contains two adjacent batches b_u and b_v, scheduled in the rth and (r+1)th positions respectively, such that p_u > p_v. Let t be the starting time of the rth batch in σ_k. Interchange b_u and b_v, leaving the remaining batches in their original positions, to form a new schedule σ'_k. Let C(r+1) and C'(r+1) denote the completion time of the (r+1)th batch under σ_k and σ'_k, respectively. Figure 3 shows the structure of σ_k and σ'_k. Under σ_k we have

C(r+1) = (1+α)^2 t + (1+α) p_u + p_v,

whereas under σ'_k we obtain

C'(r+1) = (1+α)^2 t + (1+α) p_v + p_u.

Thus C(r+1) − C'(r+1) = α(p_u − p_v) > 0 since p_u > p_v. The (r+2)th batch therefore starts later under σ_k than under σ'_k, and by the deterioration effect every subsequent batch also finishes later. This contradicts the optimality of σ_k and proves that batches should be ordered according to Step 3 of Algorithm A1. □
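The adjacent-swap step of this exchange argument is easy to verify numerically under the linear deterioration form p + αt suggested by the text: swapping a longer batch before a shorter one costs exactly α times the difference of their normal processing times.

```python
def two_batch_completion(t, p1, p2, alpha):
    """Completion time of two consecutive batches started at time t,
    where a batch with normal time p started at time u takes p + alpha*u."""
    c1 = t + p1 + alpha * t        # = (1 + alpha) * t + p1
    return c1 + p2 + alpha * c1    # = (1 + alpha) * c1 + p2

t, pu, pv, alpha = 5.0, 4.0, 2.0, 0.1
longer_first = two_batch_completion(t, pu, pv, alpha)
shorter_first = two_batch_completion(t, pv, pu, alpha)
# the gap equals alpha * (pu - pv), matching the exchange argument
assert abs((longer_first - shorter_first) - alpha * (pu - pv)) < 1e-9
```

Since pu > pv, scheduling the shorter batch first always yields the earlier completion, which is why Step 3 orders batches in non-decreasing processing time.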

Lemma 2.
Step 4 of A1 generates the optimal order of families.
Proof. By contradiction. Suppose Lemma 2 is false; then there is an optimal schedule σ = (F_1, ..., F_u, F_v, ..., F_m) containing two adjacent families F_u and F_v, scheduled in the xth and (x+1)th positions respectively, with s_u ≤ s_v. Let σ' = (F_1, ..., F_v, F_u, ..., F_m), in which F_u and F_v are interchanged and the remaining families keep their original positions. Let S(σ) and S(σ') denote the total setup time consumption of σ and σ'. Because the actual setup time of a changeover increases with its position, moving the larger setup s_v to the earlier position and the smaller setup s_u to the later position cannot increase the total: S(σ) − S(σ') = (s_v − s_u)(g(x+1) − g(x)) ≥ 0, where g(x) denotes the increasing position-dependent growth factor of the setup time. The total setup time consumption of the supposedly optimal schedule σ is thus greater than or equal to that of σ', so by the deterioration effect the subsequent batches in σ' start no later than in σ. This contradicts the choice of σ and proves Lemma 2. □
Theorem 1. Algorithm A1 finds an optimal solution for problem ψ1 in O(n log n) time.
Proof. In Algorithm A1, batches are generated as follows. Since s_j = 1 and the machine capacity is C, the first C jobs of family F_i are assigned to the first batch b_i1, the next C jobs to the second batch b_i2, and so on until all jobs of F_i are allocated; hence k_i = k*_i = ⌈n_i / C⌉. Lemmas 1 and 2 establish an optimal batch processing sequence and an optimal family sequence, respectively. Therefore Algorithm A1 finds an optimal solution for ψ1.
Steps 1, 3 and 4 of Algorithm A1 cost O(n log n) time, and Step 2 costs O(n) time. Thus, the overall running time of Algorithm A1 is O(n log n). □

Solving problem 𝜓 2
Proposition 2. Problem ψ2 is NP-hard in the strong sense.
Proof. Consider a relaxed problem in which the setup time and the deterioration are ignored and every job has an arbitrary size but an identical processing time, p_j = 1. In this case, minimizing the makespan is equivalent to minimizing the number of batches, i.e., to the Bin Packing Problem (BPP). Since BPP is NP-hard in the strong sense, ψ2 is NP-hard in the strong sense. □

We now propose an approximation algorithm, A2, to solve it.

Algorithm 𝐴 2
Step 1. Sort the jobs of each family in non-increasing order of their sizes.
Step 2. Assign the jobs into batches by the First Fit Decreasing rule and obtain k_i batches for each family.
Step 3. Sort the families in non-increasing order of their setup times s_i; then process each family's batches consecutively.
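The batching rule behind Algorithm A2 is the classic First Fit Decreasing heuristic from bin packing, whose asymptotic ratio of 11/9 the analysis below invokes. A minimal sketch:

```python
def first_fit_decreasing(sizes, capacity):
    """First Fit Decreasing: consider jobs from largest size to smallest
    and place each into the first batch with enough remaining capacity,
    opening a new batch only when none fits."""
    batches = []  # each batch is the list of job sizes it contains
    for s in sorted(sizes, reverse=True):
        for batch in batches:
            if sum(batch) + s <= capacity:
                batch.append(s)
                break
        else:
            batches.append([s])
    return batches
```

For instance, sizes [5, 4, 3, 3, 2, 2] with capacity 10 pack into two batches, while three jobs of size 6 each need their own batch.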

The values of k*_i and the worst case of k_i are shown in Table 1. We now examine the asymptotic worst-case ratio. As n approaches infinity, the number of jobs within each family F_i approaches infinity, and the First Fit Decreasing rule guarantees that k_i / k*_i approaches at most 11/9. Hence, the asymptotic worst-case ratio of Algorithm A2 is less than 11/9.

Solving problem 𝜓 3
In this section, we consider the general case ψ3, where the jobs have arbitrary sizes and arbitrary processing times. Since ψ3 generalizes ψ2, it is also NP-hard in the strong sense, and we have the following proposition.

Algorithm 𝐴 3
Step 1. Sort the jobs of each family in non-increasing order of their processing times.
Step 2. Assign the jobs into batches by the First Fit Decreasing rule and obtain k_i batches for each family.
Step 3. For each family F_i, order the batches in non-decreasing order of their processing times and then process the batches consecutively.
Step 4. Sort the families in non-increasing order of their setup times s_i, starting with the family with the largest s_i.
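Steps 1 to 3 for a single family can be sketched as follows. Here First Fit is applied after sorting by processing time (as Step 1 prescribes), so a batch's processing time is the longest job it contains; the (size, time) tuple input is an illustrative choice.

```python
def algorithm_a3_family(jobs, capacity):
    """One family under Algorithm A3. jobs: list of (size, proc_time).
    Step 1: sort by processing time, longest first; Step 2: First Fit
    by size; Step 3: return batch times in non-decreasing order."""
    jobs = sorted(jobs, key=lambda j: -j[1])
    batches = []  # each entry is [remaining capacity, batch proc time]
    for size, p in jobs:
        for b in batches:
            if b[0] >= size:          # first batch the job fits into
                b[0] -= size
                b[1] = max(b[1], p)   # batch time = longest job in it
                break
        else:
            batches.append([capacity - size, p])
    return sorted(b[1] for b in batches)  # Step 3: shortest batch first
```

With jobs [(6, 5), (5, 3), (4, 4), (3, 1)] and capacity 10, the rule forms two batches with processing times 5 and 3, processed shortest-first.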
After the execution of Steps 1 and 2 of Algorithm A3, the batches of a family are sorted in non-increasing order of their processing times; for simplicity, we denote these batches and their processing times by B_g and p_g, respectively. In Step 3 the batches are put in the reverse order, and there we denote the batches and their processing times by Q_g and q_g, respectively.

Lemma 4. For each family F_i, p_{2g−1} ≤ p*_g, where p*_g is the processing time of the gth batch of an optimal schedule, the optimal batches being indexed in non-increasing order of processing time.

Proof. Consider a family F_i containing the jobs {1, 2, ..., n_i}, indexed in non-increasing order of processing time, and assume the batches of an optimal schedule are indexed so that p*_1 ≥ p*_2 ≥ ... ≥ p*_{k*_i}. Consider the job f that is the first job assigned to batch B_g by Algorithm A3, so that p_g = p_f and the jobs {1, 2, ..., f} cannot all be packed into g − 1 batches. Then, in the optimal solution, the jobs in {1, 2, ..., f} cannot all be assigned to the first g − 1 batches either. If job f is assigned to b*_g, we have p*_g = p_f. If job f is assigned to a batch later than b*_g, then some job of {1, ..., f − 1} occupies b*_g and, since the batches are in non-increasing order of processing time, p*_g ≥ p_f. In both cases, we conclude that

p*_g ≥ p_f.    (21)

Now consider the worst case for Algorithm A3, in which only one job can be put into each batch, so that job f is assigned to batch B_f. The batching result is then the same as when every job has the same size s_0 with C/2 < s_0 ≤ C. We have (g−1)C/2 < (g−1)s_0 ≤ (g−1)C, so f < 2g − 1, which indicates that job f is assigned to a batch before B_{2g−1}. Since the batches are in non-increasing order of their processing times,

p_{2g−1} ≤ p_f.    (22)

By (21) and (22), we obtain p_{2g−1} ≤ p*_g. In Step 3 of Algorithm A3, the batches are arranged in non-decreasing order of their processing times; with the notation Q_g and q_g above, we obviously have q_g = p_{k_i − g + 1}. □

Lemma 5. For each family F_i, the total processing time under Algorithm A3 is less than twice that under an optimal schedule.
Proof. Lemma 2 has established that Step 4 generates the optimal family sequence. We now consider the ratio of the total processing time of family F_i under Algorithm A3 to that under an optimal schedule.
Case 1: when k*_i = 2, consider the worst case k_i = 3; when k*_i = 3, consider the worst case k_i = 4. The resulting ratios for k_i even and k_i odd are shown in Figures 6 and 7, respectively. In all cases, the ratio of the family processing time under Algorithm A3 to that under the optimal schedule is less than 2, which proves the lemma. □

Discussion
Based on the theoretical analysis of the parallel batch machine scheduling problem with incompatible deteriorating job families, we offer the following managerial insights to decision-makers in manufacturing enterprises.
First, a balance should be found between product categories and production costs. Comparing the three models, we find that an optimal schedule can be obtained in polynomial time when the jobs have identical sizes; when the jobs have arbitrary sizes, the problem becomes NP-hard, which shows that job size diversity makes the problem complex. In addition, the worst-case ratio is a decreasing function of λ: when λ approaches 1, the result of Algorithm A3 is closer to the optimal solution. Accordingly, the diversity of products should be carefully considered when optimizing operations, and decision-makers should pay attention to the balance between product categories and cost. Measures should also be taken to reduce the impact of job diversity on scheduling complexity, such as standardizing product sizes and adopting a delay strategy. Second, it is important to improve the collaborative efficiency between processes. To minimize the impact of deterioration, decision-makers should improve the coordination between processes and reduce the waiting time of jobs: optimize the layout of workshops and processing equipment to reduce transportation time within the workshop, and develop a detailed scheduling plan to maintain workshop efficiency and smooth operations between processes.
Third, measures should be taken to reduce the deteriorating rate. Our analysis shows that as the deteriorating rate α increases, the worst-case ratio increases, indicating that deterioration not only reduces production efficiency but also makes the scheduling problem harder. Decision-makers should therefore act to reduce the deterioration rate; for example, in the soaking process, the initial temperature of the ingot can be maintained by thermal insulation packaging and by raising the ambient temperature.

Conclusions
In this paper, we study the single parallel batch machine scheduling problem with deteriorating incompatible jobs, with the objective of minimizing the makespan. Three models are considered and algorithms are proposed. For the first model, where the jobs have identical sizes, we propose a polynomial-time algorithm and prove its optimality. For the second model, where the jobs have identical processing times, we propose an approximation algorithm. For the third, general model, where jobs have arbitrary sizes and arbitrary processing times, another approximation algorithm is proposed. The latter two problems are proved to be NP-hard in the strong sense, and we establish the absolute and asymptotic worst-case ratios of the two algorithms. All of the proposed algorithms run in O(n log n) time.
There are several interesting directions for future work. First, only a single batch machine is considered in this paper. Facility configurations are complex in practice, and problems with other machine configurations, such as flow shops or parallel batch machines, are worth researching. Since the single machine problem studied here is already NP-hard in the strong sense, problems with more complex facility configurations are also NP-hard in the strong sense, so approximation algorithms and intelligent algorithms can be considered. Second, we only consider minimizing the makespan, while multi-objective problems deserve study, for example, the coordinated scheduling of purchasing, inventory and distribution. Other objective functions, such as minimum service span or minimum total cost, are also interesting directions. Third, how to coordinate the scheduling of incompatible jobs and weaken the impact of deterioration is a direction worthy of research.

Proposition 4. Problem ψ3 is NP-hard in the strong sense.

Table 1. The values of k*_i and the worst case of k_i.

Theorem 2. The running time of Algorithm A2 is O(n log n). The absolute worst-case ratio satisfies R_{A2} < 2, and the asymptotic worst-case ratio satisfies R^∞_{A2} < 11/9.

When the scale of ψ3 approaches infinity, the asymptotic worst-case ratio is as follows: whether k_i is even or odd, we have lim R_{A3} < 2. (38)