Different Transfer Functions for Binary Particle Swarm Optimization with a New Encoding Scheme for Discounted {0-1} Knapsack Problem

The discounted {0-1} knapsack problem (DKP01) is a variant of the knapsack problem with a group structure and discount relationships among items. It is more challenging than the classical 0-1 knapsack problem. In this paper, we study binary particle swarm optimization (PSO) algorithms with different transfer functions and a new encoding scheme for the DKP01. An effective binary vector of shorter length is used to represent a solution in the new binary PSO algorithms. Eight transfer functions are used to design binary PSO algorithms for the DKP01. A new repair operator is developed to handle infeasible solutions while improving their quality. Finally, we conducted extensive experiments on four groups of 40 instances using our proposed approaches. The experimental results show that the proposed algorithms outperform the previous algorithms named FirEGA and SecEGA. Overall, the proposed algorithms with the new encoding scheme represent a promising approach for solving the DKP01.


Introduction
The discounted 0-1 knapsack problem (DKP01) is a kind of knapsack problem first proposed by Guldan [1]. This problem plays an important role in real-world business processes and appears as a component of key problems such as investment decision-making, mission selection, and budget control. An exact algorithm based on dynamic programming for the DKP01 was first proposed in [1]. An approach combining dynamic programming with the core of the DKP01 was studied in [2]. Two algorithms based on the genetic algorithm, named FirEGA and SecEGA, were proposed for the DKP01 in [3].
Assume that there are n groups, and each group contains three items, giving a set of 3n items in total; each item j has an integer weight w_j and an integer profit p_j. The problem is to select a subset of the 3n items such that the overall profit is maximized without exceeding a given weight capacity C, while at most one item is chosen from each group. The DKP01 is NP-hard, and hence it admits no polynomial-time algorithm unless P = NP. The problem may be mathematically modelled as follows:

maximize Σ_{i=0}^{n−1} (p_{3i} x_{3i} + p_{3i+1} x_{3i+1} + p_{3i+2} x_{3i+2}) (1)

subject to x_{3i} + x_{3i+1} + x_{3i+2} ≤ 1, ∀i ∈ {0, 1, ..., n − 1} (2)

Σ_{i=0}^{n−1} (w_{3i} x_{3i} + w_{3i+1} x_{3i+1} + w_{3i+2} x_{3i+2}) ≤ C (3)

x_{3i}, x_{3i+1}, x_{3i+2} ∈ {0, 1}, ∀i ∈ {0, 1, ..., n − 1} (4)

where x_{3i}, x_{3i+1}, and x_{3i+2} represent whether items 3i, 3i + 1, and 3i + 2 are put into the knapsack: x_j = 0 indicates that item j (j = 0, 1, ..., 3n − 1) is not in the knapsack, while x_j = 1 indicates that item j is in the knapsack. It is worth noting that a binary vector X = (x_0, x_1, ..., x_{3n−1}) ∈ {0, 1}^{3n} is a potential solution of the DKP01; it is a feasible solution only if it satisfies both constraints (2) and (3).
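For illustration, the feasibility check and objective evaluation above can be sketched in a few lines of Python (a minimal sketch for exposition, not the paper's code; the function and variable names are ours):

```python
def evaluate_dkp01(x, p, w, C):
    """Evaluate a candidate DKP01 solution.

    x: binary list of length 3n; x[3i], x[3i+1], x[3i+2] belong to group i.
    p, w: profit and weight lists of length 3n.
    C: knapsack capacity.
    Returns (feasible, total_profit, total_weight).
    """
    n = len(x) // 3
    # Constraint (2): at most one item may be chosen from each group.
    for i in range(n):
        if x[3 * i] + x[3 * i + 1] + x[3 * i + 2] > 1:
            return False, 0, 0
    total_profit = sum(pj * xj for pj, xj in zip(p, x))
    total_weight = sum(wj * xj for wj, xj in zip(w, x))
    # Constraint (3): total weight must not exceed the capacity.
    return total_weight <= C, total_profit, total_weight

# Two groups (n = 2): one item chosen from each group, weight 2 + 5 = 7 <= C.
feasible, profit, weight = evaluate_dkp01(
    [1, 0, 0, 0, 0, 1], p=[3, 4, 6, 2, 5, 6], w=[2, 3, 4, 1, 4, 5], C=7)
```
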
Recently, the algorithms for the DKP01 were also studied in detail, and brand new deterministic and approximation algorithms were proposed: a new exact algorithm and two approximation algorithms with a greedy repair operator were introduced in [4]. A discrete particle swarm optimization algorithm named GBPSO was proposed in [5]. An evolutionary algorithm combined with ring theory was applied to the DKP01 in [6]. A multistrategy monarch butterfly optimization algorithm, a binary moth search algorithm [7], and a hybrid teaching-learning-based optimization algorithm [8] have also been proposed for the DKP01. Binary PSO has likewise been applied to many other optimization problems, such as scheduling of appliances in smart homes [9], fault diagnosis of bearings [10], operation cost reduction in the unit commitment problem [11], channel selection in EEG signals with application to speller systems [12], wireless sensor networks [13], transmission expansion planning considering the n − 1 security criterion [14], environmental/economic dispatch via a bare-bones multiobjective PSO [15], feature selection with fuzzy cost via multiobjective PSO [16], cost-based feature selection in classification via multiobjective PSO [17], and feature selection on high-dimensional data via variable-size cooperative coevolutionary PSO [18].
Many algorithms have been proposed to solve the DKP01, and each of them has its advantages and disadvantages; further study of this problem is necessary. In this paper, we study binary particle swarm optimization (PSO) algorithms with different transfer functions and a new encoding scheme for the DKP01. An effective binary vector of 2n-dimensional length is used to represent an individual in the proposed binary PSO strategies. Eight types of transfer functions are used to design binary PSO algorithms for the DKP01. Finally, we conducted extensive experiments on four groups of 40 instances using our proposed approaches. The experimental results demonstrate that the proposed algorithms outperform the genetic algorithm and the original binary PSO on the 40 DKP01 instances. The main contributions of this work are as follows: (i) Binary particle swarm optimization algorithms with different binary transfer functions and a new solution representation are proposed to solve the discounted {0-1} knapsack problem.
(ii) The new encoding scheme uses a shorter binary vector (of length 2n compared to 3n) and automatically satisfies the constraint that at most one item is chosen from each group.
(iii) A new repair operator is developed to handle infeasible solutions while improving their quality. The rest of this paper is organized as follows: Section 2 presents background on particle swarm optimization. Section 3 presents the proposed binary particle swarm optimization for the DKP01. The simulation results of the proposed algorithms are presented in Section 4. We conclude this paper and suggest potential future work in Section 5.

Particle Swarm Optimization.
The PSO algorithm maintains a population of particles, which is created randomly at initialization [19,20]. The standard particle swarm optimizer keeps a swarm of particles that represent potential solutions to the problem at hand. Suppose that the search space is D-dimensional; then the position of the ith particle of the swarm can be described by a D-dimensional vector x_i = (x_i1, ..., x_id, ..., x_iD), and its velocity by v_i = (v_i1, ..., v_id, ..., v_iD). The best position found so far by the ith particle is denoted p_i = (p_i1, ..., p_id, ..., p_iD). In essence, the trajectory of each particle is updated according to its own flying experience as well as to that of the best particle in the swarm. The basic PSO update can be described as

v^{k+1}_{i,d} = w v^k_{i,d} + c_1 r_1 (p^k_{i,d} − x^k_{i,d}) + c_2 r_2 (p^k_{g,d} − x^k_{i,d}) (5)

x^{k+1}_{i,d} = x^k_{i,d} + v^{k+1}_{i,d} (6)

where v^k_{i,d} is the dth dimension velocity of particle i in iteration k; x^k_{i,d} is the dth dimension position of particle i in iteration k; p^k_{i,d} is the dth dimension of the personal best (pbest) of particle i in iteration k; p^k_{g,d} is the dth dimension of the global best particle (gbest) in iteration k; w is the inertia weight; c_1 is the cognitive weight and c_2 is the social weight; and r_1 and r_2 are two random values uniformly distributed in the range [0, 1] [21]. The pseudocode of the PSO is given in Algorithm 1.
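The update in equations (5) and (6) can be sketched as follows (an illustrative sketch only; the function name and default parameter values are ours, not the paper's):

```python
import random

def pso_step(x, v, pbest, gbest, w=0.7, c1=2.0, c2=2.0):
    """One PSO update of a particle: eq. (5) for velocity, eq. (6) for position."""
    new_v, new_x = [], []
    for d in range(len(x)):
        r1, r2 = random.random(), random.random()
        # Eq. (5): inertia + cognitive attraction + social attraction.
        vd = w * v[d] + c1 * r1 * (pbest[d] - x[d]) + c2 * r2 * (gbest[d] - x[d])
        new_v.append(vd)
        # Eq. (6): the position moves by the new velocity.
        new_x.append(x[d] + vd)
    return new_x, new_v

# A particle at the origin is pulled toward pbest = gbest = (1, 1).
x, v = pso_step([0.0, 0.0], [0.0, 0.0], pbest=[1.0, 1.0], gbest=[1.0, 1.0])
```
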

Binary Particle Swarm
Optimization. The binary particle swarm optimization (BPSO) algorithm was introduced by Bansal and Deep to allow the PSO algorithm to operate in binary problem spaces [21][22][23]. It uses the velocity as the probability that a bit (position) takes the value one or zero. In BPSO, equation (5) for updating the velocity remains unchanged, but equation (6) for updating the position is redefined using one of the two following rules:

x^{k+1}_{i,d} = 1 if rand < S(v^{k+1}_{i,d}), else 0 (7)

x^{k+1}_{i,d} = 1 − x^k_{i,d} if rand < S(v^{k+1}_{i,d}), else x^k_{i,d} (8)

where S(·) is a transfer function that transforms the velocity into a probability, for example the sigmoid function S(v) = 1/(1 + e^{−v}), and rand is a uniform random number in [0, 1].

Mathematical Problems in Engineering

In this section, we propose 8 binary algorithms based on BPSO, named BPSO1 to BPSO8. The algorithm BPSOx uses transfer function Sx (where x is an integer in [1, 8]); BPSO1-BPSO4 use formula (7), while BPSO5-BPSO8 use formula (8) to compute the binary vector X.
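For illustration, the two families of transfer functions and the position rules (7) and (8) can be sketched as follows. This is a sketch under the common assumption from the BPSO literature that four functions are S-shaped and four are V-shaped; the exact eight functions S1-S8 used in the paper may differ:

```python
import math

# Four commonly used S-shaped transfer functions (assumed, for illustration).
S_SHAPED = [
    lambda v: 1.0 / (1.0 + math.exp(-2.0 * v)),
    lambda v: 1.0 / (1.0 + math.exp(-v)),        # classic sigmoid
    lambda v: 1.0 / (1.0 + math.exp(-v / 2.0)),
    lambda v: 1.0 / (1.0 + math.exp(-v / 3.0)),
]

# Four commonly used V-shaped transfer functions (assumed, for illustration).
V_SHAPED = [
    lambda v: abs(math.erf(math.sqrt(math.pi) / 2.0 * v)),
    lambda v: abs(math.tanh(v)),
    lambda v: abs(v) / math.sqrt(1.0 + v * v),
    lambda v: abs(2.0 / math.pi * math.atan(math.pi / 2.0 * v)),
]

def update_bit_s(x_bit, v, transfer, rand):
    """Formula (7): set the bit directly from the transfer probability."""
    return 1 if rand < transfer(v) else 0

def update_bit_v(x_bit, v, transfer, rand):
    """Formula (8): flip the current bit with the transfer probability."""
    return 1 - x_bit if rand < transfer(v) else x_bit
```

The design difference is that rule (7) ignores the current bit, while rule (8) only changes the bit when the velocity magnitude makes a flip probable, which tends to preserve good bits late in the search.
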

Proposed Binary Particle Swarm Optimization for DKP01
3.1. Solution Representation. At present, there are two methods to encode a solution: one uses a binary vector whose length equals the 3n dimensions of the problem [3,7,24,25], and the other uses an integer vector whose length equals the number of groups n [8]. Each encoding scheme has its advantages and disadvantages. The binary scheme has the advantage that many metaheuristics with binary operators can be applied to the solution directly.
In this paper, a new binary encoding scheme of length 2n is used to represent the solution. The advantages of this scheme are its shorter length and that it automatically satisfies constraint (2). The new binary encoding scheme is presented in Table 1.
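Since Table 1 is not reproduced here, the following sketch assumes one natural 2-bit-per-group mapping (00 → no item, 01 → first item, 10 → second item, 11 → third item); the paper's actual table may order the codes differently:

```python
def decode(y):
    """Decode a 2n-bit vector y into a 3n-bit DKP01 solution x.

    Each group i is encoded by the bit pair (y[2i], y[2i+1]).
    Assumed mapping: 00 -> no item, 01 -> item 3i, 10 -> item 3i+1,
    11 -> item 3i+2. Every pair selects at most one item, so
    constraint (2) holds by construction.
    """
    n = len(y) // 2
    x = [0] * (3 * n)
    for i in range(n):
        code = 2 * y[2 * i] + y[2 * i + 1]
        if code > 0:
            x[3 * i + code - 1] = 1
    return x

# Two groups: pair (0, 1) selects item 0, pair (1, 1) selects item 5.
solution = decode([0, 1, 1, 1])
```
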

Repair Function.
The new encoding scheme automatically satisfies constraint (2). To handle constraint (3) and improve the quality of the solution, a new repair operator based on the idea in [3] is proposed. The advantage of this repair procedure is its balance between CPU time cost and avoiding getting stuck in local optima. The items are sorted according to the profit-to-weight ratio p_j/w_j (j = 0, 1, ..., 3n − 1) so that the ratios are non-increasing. The repair operator consists of two phases. The first phase (called the repair phase) examines each variable in increasing order of p_j/w_j and drops items from the knapsack while feasibility is violated. The second phase (called the optimization phase) examines each variable in decreasing order of p_j/w_j and adds items to the knapsack as long as feasibility is not violated. The aim of the repair phase is to obtain a feasible solution from an infeasible one, while the optimization phase seeks to improve the fitness of a feasible solution. The pseudocode for the repair operator is given in Algorithm 2. The time complexity of the repair operator is O(n). The overall pseudocode of the BPSO algorithms for the DKP01 is given in Algorithm 3.
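The two phases can be sketched as follows. This is an illustrative greedy drop/add repair on the 3n-bit representation, not the paper's Algorithm 2 (which operates on the 2n-bit encoding); the function name is ours:

```python
def repair(x, p, w, C):
    """Greedy two-phase repair of a 3n-bit DKP01 solution.

    Repair phase: drop selected items in increasing p/w order until the
    capacity C is respected. Optimization phase: add unselected items in
    decreasing p/w order while the capacity and the at-most-one-item-per-
    group constraint still hold.
    """
    by_ratio = sorted(range(len(x)), key=lambda j: p[j] / w[j])
    weight = sum(w[j] for j in range(len(x)) if x[j])
    # Repair phase: discard worst-ratio items first.
    for j in by_ratio:
        if weight <= C:
            break
        if x[j]:
            x[j] = 0
            weight -= w[j]
    # Optimization phase: add best-ratio items first.
    for j in reversed(by_ratio):
        g = j // 3
        group_empty = not (x[3 * g] or x[3 * g + 1] or x[3 * g + 2])
        if not x[j] and group_empty and weight + w[j] <= C:
            x[j] = 1
            weight += w[j]
    return x
```

Both passes over the pre-sorted index list are linear, matching the stated O(n) cost once the sort order is computed.
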

Simulation Results
In this section, the experimental results of the eight BPSO algorithms are compared to find the best among them for solving the DKP01. The best proposed BPSO is then compared with the two algorithms taken from [6], named FirEGA and SecEGA. The 40 DKP01 test instances are 10 uncorrelated instances (denoted as UDKP1-UDKP10), 10 weakly correlated instances (denoted as WDKP1-WDKP10), 10 inverse strongly correlated instances (denoted as

Input: Initial parameters
Output: Optimal solution
(1) for each particle do
(2)   Initialize particle
(3) while stop condition is not met do
(4)   for each particle do
(5)     Evaluate objective function
(6)     if the objective function value is better than pBest then
(7)       pBest is replaced by the current value
(8)   Calculate the gBest (the global best value)
(9)   for each particle do
(10)    Calculate particle velocity by equation (5)
(11)    Update particle position by equation (6)

ALGORITHM 1: PSO algorithm.
IDKP1-IDKP10), and 10 strongly correlated instances (denoted as SDKP1-SDKP10) [3]. All experiments with the proposed algorithms are performed on a Dell Vostro 5471 VTI5207W laptop with an Intel(R) Core(TM) i5-8250U CPU at 1.6 GHz and 8 GB DDR3 memory. The operating system is Microsoft Windows 10. All the algorithms are implemented in MATLAB R2018a. The parameters of FirEGA and SecEGA are as given in [6]: their population sizes are set to 50, and the number of iterations is set equal to the dimension of the DKP01 instance. For a fair comparison, the parameters of the BPSO algorithms are set as follows: the number of particles is 50, c_1 and c_2 are set to 2, w is linearly decreased from 0.9 to 0.4, and the maximum number of iterations is set to be equal

ALGORITHM 2: Repair operator for the DKP01 (the repair phase sets the bit pair x(2k + 1), x(2k + 2) of a dropped group k to 0, and is followed by the optimization phase).

Input: Initial parameters
Output: Optimal solution
(1) for each particle do
(2)   Initialize particle
(3) while stop condition is not met do
(4)   for each particle do
(5)     Evaluate objective function
(6)     if the objective function value is better than pBest then
(7)       pBest is replaced by the current value
(8)   Calculate the gBest (the global best value)
(9)   for each particle do
(10)    Calculate particle velocity by equation (5)
(11)    Calculate S(.) using a transfer function
(12)    Update particle position by equation (7) or (8)
(13)    Apply the repair operator to the current particle position
ALGORITHM 3: Overall pseudocode of the BPSO algorithms for the DKP01.

to the dimension of the DKP01, and the stopping criterion is satisfied when the maximum number of iterations is reached. For all algorithms, the numbers of objective function evaluations are similar. Tables 2-5 summarize the comparison among the 8 BPSO algorithms based on five performance measures, that is, the best result (BEST), the average result (AVE), the worst result (Worst), the standard deviation (Std. dev), and the gap between AVE and OPT, where OPT is the optimal value of the instance. The results are averaged over 30 independent runs, and the best results are highlighted in bold font. The gap is computed as follows:

Gap = (OPT − AVE) / OPT × 100%.

The results show that BPSO7 and BPSO8 have better performance than the other six algorithms. Table 6 summarizes the average ranks of the eight BPSO algorithms on the 40 instances. The results show that BPSO8 achieved the best average rank in all three factors, that is, average best rank (rank based on BEST), average mean rank (rank based on AVE), and average worst rank (rank based on Worst). Tables 7-10 summarize the comparison among FirEGA, SecEGA, and BPSO8 based on the five performance criteria over 30 independent runs: BEST, AVE, Worst, Std. dev, and Gap. BPSO8 is better than FirEGA and SecEGA in BEST, AVE, and Worst for the SDKP, UDKP, and WDKP instances, but not for the IDKP instances.
The results show that BPSO8 has better performance than the FirEGA and SecEGA algorithms. Table 11 summarizes the average ranks of BPSO8, FirEGA, and SecEGA on the 40 instances. The results show that BPSO8 achieved the best average rank in all three factors, that is, average best rank (rank based on BEST), average mean rank (rank based on AVE), and average worst rank (rank based on Worst). Regarding stability, the Std. dev and Gap values in Tables 2-10 demonstrate the stability of the proposed algorithms. Figure 1 shows box plots for eight instances; the results indicate that the group of algorithms BPSO5-BPSO8 is better than the group BPSO1-BPSO4. Figure 2 shows the convergence curves for eight instances; the results indicate that BPSO5-BPSO8 converge faster than BPSO1-BPSO4. Therefore, the performance of BPSO8 is excellent compared with that of the other BPSOs for the DKP01. From the above comparison, it is not difficult to see that, for the DKP01, the BPSO5-BPSO8 group has the best performance, led by BPSO8, and these algorithms are far better than FirEGA and SecEGA.
This shows that the proposed binary particle swarm optimization with the new binary encoding scheme is not only feasible but also effective.

Conclusion
In this paper, eight new algorithms based on binary particle swarm optimization with a new repair operator have been proposed to solve the discounted 0-1 knapsack problem efficiently. An effective binary encoding scheme is proposed to represent the solution to the problem. The new encoding scheme has two advantages: it reduces the computing effort by using a shorter binary vector, and it automatically satisfies the constraint that at most one item is chosen from each group. The simulation results on forty DKP01 instances showed that the proposed algorithms are better than the two genetic-algorithm-based algorithms.
In future work, the effect of transfer functions combined with the PSO algorithm will be studied on other optimization problems. Other optimization algorithms will also be considered for solving the DKP01.

Data Availability
The data used to support the findings of this study are included within the article or are publicly available to the research community at https://www.researchgate.net/publication/336126537_Four_kinds_of_D0-1KP_instances.

Additional Points
(i) Particle swarm optimization algorithms with different binary transfer functions and a new solution representation are proposed to solve the discounted 0-1 knapsack problem. (ii) The new encoding scheme uses a shorter binary vector and automatically satisfies the constraint that at most one item is chosen from each group. (iii) A new repair operator is developed to handle infeasible solutions while improving their quality. (iv) Experimental results on 40 instances of the discounted 0-1 knapsack problem showed that the proposed approaches are efficient.

Conflicts of Interest
The author declares that there are no conflicts of interest regarding the publication of this paper.