Microservice combination optimisation based on an improved grey wolf algorithm

Microservices architecture is a new paradigm for application development. Optimising the performance of a microservice architecture from a non-functional perspective is a typical Nondeterministic Polynomial (NP)-hard problem. To quantify the non-functional requirements of microservice systems while reducing the latency of computing the service combination with the maximum QoS objective function value, this paper proposes a microservice combination approach based on a QoS model, together with a CGWO algorithm that performs the optimisation computation over this model. The experimental results verify that the error rate of the method on the non-functional combination optimisation problem is only 0.528%, that the computational efficiency of the algorithm improves by 97.29% as the complexity of the problem search space increases, and that CGWO improves optimisation accuracy by 65.97% and 81.25%, respectively, over its prototype (GWO) while maintaining stable optimisation performance. This demonstrates that the proposed approach has a strong advantage in automatically searching for the best QoS in the microservice combination problem.


Introduction
Microservices are small services that together constitute a single application (Cerny et al., 2018). To accomplish more complex tasks, multiple interoperable microservices need to be combined, making such combinations a hot research topic. Improving efficiency is a necessity, since a set of microservices executes within a single task while the environment and requirements change dynamically (Ai et al., 2021; Barkat et al., 2021). In fields such as cloud services and industrial manufacturing, manually scheduling microservices to generate microservice combinations is gradually becoming impossible due to the rapid increase in the number of microservices. Many existing studies contribute methods for automated scheduling and combination of microservices based on functional attributes (Mendonça et al., 2019; Potdar et al., 2020), but they neglect the quality of the non-functional attributes (Brady et al., 2020). The approach provided in this paper compensates for this gap and enables microservice systems to complete user request tasks with high performance (Brasser et al., 2022). In order to reduce the performance overhead, the Quality of Service (QoS) model of microservices needs to be optimised.
Intelligent optimisation algorithms are good at performing a large number of efficient computations in a short time (Boussaïd et al., 2013; Zhang, You, et al., 2019). Research on intelligent optimisation algorithms is relatively mature and extensive, and such algorithms have been applied in various fields (Liang et al., 2018; Chen & Huang, 2021; Zhang, Cui, et al., 2022; Hu et al., 2021; Naseri & Jafari Navimipour, 2019), especially in combination with neural networks and reinforcement learning (Li et al., 2021; Wang et al., 2020). They are widely used in security and computer vision, for instance in adversarial attacks (Huang, Zhang, et al., 2020; Mo et al., 2022; Mo et al., 2020) and attack detection (Kuang et al., 2019; Li et al., 2022; Zhang, Xue, et al., 2021), where the results are either exciting improvements over previous work or feasible solutions to open problems in both research areas (Zhang, Xue, et al., 2021; Zhang, Zhu, et al., 2022; Ren et al., 2021).
In recent years, breakthroughs in cloud computing technology have brought new application scenarios for intelligent algorithms, such as the microservice combination problem studied in this paper. The QoS-based microservice combination optimisation problem requires computing the QoS objective function values for all possible service combinations and selecting the combination with the largest value; this task is therefore clearly a Nondeterministic Polynomial (NP)-hard problem. NP-hard problems are common in the field of scheduling computation, and metaheuristic algorithms have so far proven to be the most effective methods for such problems; they include genetic algorithms (Katoch et al., 2021), particle swarm algorithms (Sengupta et al., 2018), grey wolf algorithms and so on. However, existing algorithms have certain limitations: they are prone to fall into local optima, their execution efficiency is low, and they are not suitable for large-scale service combinations (Han et al., 2022; Wang et al., 2019; Wu & Li, 2021; Yu et al., 2021).
This paper makes the following contributions. Firstly, a QoS model is introduced to make the non-functional requirements of the microservice combination problem explicit, transforming the multi-objective optimisation problem over multiple QoS metrics into a single-objective optimisation problem over the QoS objective function value. Then, an intelligent optimisation algorithm for microservice combination, named the Crossover Grey Wolf Optimiser (CGWO), is proposed by combining three common approaches from the swarm intelligence domain: the Grey Wolf Optimiser (GWO), the elite reverse learning strategy and the vertical and horizontal crossover strategy. This algorithm performs the optimisation search for the microservice combination problem based on the QoS model. Finally, experiments demonstrate the effectiveness and stability of the CGWO algorithm in solving the non-functional microservice combination optimisation problem. The paper is organised as follows: Section 2 presents the algorithmic foundations, with particular attention to the Grey Wolf Optimiser. Section 3 describes the QoS combination optimisation based on the improved grey wolf algorithm. Section 4 presents the experimental results on case studies, and Section 5 concludes this work.

Microservice portfolio optimisation model
The overall QoS for each microservice combination is calculated using the QoS objective function for microservice combinations, which reduces a multi-objective optimisation problem to a single-objective optimisation problem (Zhang, You, et al., 2019; Yan et al., 2020; Yan et al., 2021). The microservice combination with the highest objective function value is then chosen as the optimisation result. The microservice combination optimisation objective function is defined as shown in Eq. (1) (Naseri & Jafari Navimipour, 2019):

QoS_sol = w_t · Q_t + w_c · Q_c + w_re · Q_re + w_av · Q_av  (1)

where QoS_sol denotes the overall QoS value of a microservice portfolio; Q_t is the overall response time of the portfolio; Q_c is its execution cost; and Q_re and Q_av are its reliability and availability metric values, respectively. Note that Q_t, Q_c, Q_re, and Q_av represent the aggregations of the normalised q_t, q_c, q_re, and q_av values of the individual microservices in the combination (Sefati & Navimipour, 2021), which depend on the actual workflow of the microservice. The values w_t, w_c, w_re, and w_av are the weight coefficients of the response time, cost, reliability, and availability metrics, respectively. All four weights lie in the interval [0, 1] and their sum equals 1.
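The weighted aggregation of Eq. (1) can be sketched as follows. This is a minimal illustration in Python (the paper's implementation is in Java); the equal default weights are hypothetical, and all four metrics are assumed to be normalised to [0, 1] so that larger values are better.

```python
def qos_objective(q_t, q_c, q_re, q_av,
                  w_t=0.25, w_c=0.25, w_re=0.25, w_av=0.25):
    """Weighted-sum QoS objective of Eq. (1).

    All q_* values are assumed normalised to [0, 1] (larger is better)
    and the four weights must sum to 1, as stated in the text.
    """
    assert abs(w_t + w_c + w_re + w_av - 1.0) < 1e-9
    return w_t * q_t + w_c * q_c + w_re * q_re + w_av * q_av
```

With equal weights, a combination whose four normalised metrics are all 1 scores exactly 1.0, the best possible value.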

Grey wolf algorithm
Grey Wolf Optimiser (GWO) is an intelligent optimisation algorithm proposed by Mirjalili et al. in 2014. It divides the social hierarchy of grey wolves into four classes (α, β, δ, and ω, from high to low) and assumes that the α, β, and δ wolves have the best knowledge of the prey's location (Mirjalili et al., 2014). During each iteration, these three wolves are used to estimate the location of the prey, and the remaining ω wolves update their distances to the prey around the α, β, and δ wolves to determine their own positions. At the end of each iteration, the α, β, and δ wolves and their positions in the current pack are updated. Finally, the position of the α wolf is taken as the location of the prey (Xiaofeng & Xiuying, 2019). In Eqs. (2) to (5), D_α, D_β, and D_δ indicate the distances between an individual grey wolf and the prey locations marked by the α, β, and δ wolf positions, respectively. Thus, one can obtain:

D_α = |C_1 · X_α(t) − X(t)|  (2)
D_β = |C_2 · X_β(t) − X(t)|  (3)
D_δ = |C_3 · X_δ(t) − X(t)|  (4)
C_j = 2 · r_j,  j = 1, 2, 3  (5)

where t denotes the number of iterations performed so far; X_α(t), X_β(t), and X_δ(t) are the current positions of the α, β, and δ wolves, respectively; X(t) is the position of the individual grey wolf; C_1, C_2, and C_3 are random vectors; and r_j is a random vector in the range [0, 1].
Eqs. (6) to (8) define the step length and direction of the individual grey wolf towards the prey around the α, β, and δ wolves, respectively, and Eq. (9) gives the updated position of the individual grey wolf:

X_1 = X_α(t) − A_1 · D_α  (6)
X_2 = X_β(t) − A_2 · D_β  (7)
X_3 = X_δ(t) − A_3 · D_δ  (8)
X(t + 1) = (X_1 + X_2 + X_3) / 3  (9)
A_i is the coefficient vector and is calculated as shown in Eq. (10):

A_i = 2a · r_i − a  (10)

where a decreases linearly from 2 to 0 during the iterations and r_i is a random vector in the range [0, 1]. Thus, A is a random value in the interval [−a, a]. When |A| ≥ 1, the individual grey wolf moves away from the prey and the algorithm performs a global search; when |A| < 1, the individual approaches the prey and the algorithm performs a local search.
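Taken together, Eqs. (2) to (10) define one position update for a single wolf. The sketch below (Python, for brevity) implements that update directly; the function name and signature are our own.

```python
import random

def gwo_step(wolf, alpha, beta, delta, a):
    """One GWO position update (Eqs. (2)-(10)) for a single wolf.

    For each leader L in {alpha, beta, delta} and each dimension d:
        A = 2*a*r1 - a, C = 2*r2, D = |C*L[d] - wolf[d]|, X_L = L[d] - A*D
    The new position is the mean of the three leader-guided estimates (Eq. (9)).
    """
    dim = len(wolf)
    new_pos = []
    for d in range(dim):
        estimates = []
        for leader in (alpha, beta, delta):
            r1, r2 = random.random(), random.random()
            A = 2 * a * r1 - a   # |A| >= 1 -> global search, |A| < 1 -> local search
            C = 2 * r2
            D = abs(C * leader[d] - wolf[d])
            estimates.append(leader[d] - A * D)
        new_pos.append(sum(estimates) / 3.0)
    return new_pos
```

Note that when a has decayed to 0, A is 0 and the wolf lands exactly on the leaders' consensus position, which is the local-search limit of the update.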
The GWO algorithm has only two adjustable parameters, A and C, and is characterised by its simple structure and easy implementation [33]. At the same time, its two convergence factors, a and A, can be adjusted adaptively, and its information feedback mechanism can achieve a balance between local and global search; thus, using the grey wolf algorithm as an intelligent optimisation algorithm for microservice combinations yields good performance in terms of solution accuracy and convergence speed. However, the grey wolf algorithm also suffers from some drawbacks (Wang et al., 2020; Yang et al., 2020): (1) Poor population diversity: since the initial population of GWO is randomly generated, good population diversity cannot be guaranteed; (2) Easy to fall into local optima: each iteration of the grey wolf algorithm only shares the information of the α, β, and δ wolves with the other individuals (the ω wolves). The ω wolves continuously approach the α, β, and δ wolves, but the search led by these three wolves does not necessarily find the globally optimal and sub-optimal individuals; thus, the GWO algorithm easily falls into a local optimum during the solution process.

QoS combination optimisation based on the improved grey wolf algorithm
In order to improve the response speed of the algorithm and to avoid falling into local optima, this paper combines the ideas of elite reverse learning and the vertical and horizontal crossover strategy with GWO, and proposes an improved grey wolf algorithm, the Crossover Grey Wolf Optimiser (CGWO), to better exploit the advantages of the grey wolf algorithm while overcoming its limitations.

Elite reverse learning strategy
The elite reverse learning strategy is used to enhance the diversity of the initial grey wolf population and to ensure its quality. The basic idea of reverse learning is that, given a feasible solution to a problem, its opposite solution is first computed; the feasible solutions and their opposites are then mixed, and the better solutions are selected from the mixed set as the next generation of individuals using the evaluation method (Meng et al., 2014). The opposite of an element x is computed as shown in Eq. (11):

x′ = a + b − x  (11)

where a and b are the search space boundaries in the corresponding dimension. If an element x of X_j is out of bounds, it is replaced by a random value within the search space boundaries [a, b] of that dimension.
The steps to initialise the grey wolf population using the elite reverse learning strategy are as follows: Step 1: Randomly initialise the position vectors of N individual grey wolves to form the population pop1. Step 2: For each of the N position vectors, compute the opposite vector of each individual to form the population pop2. Step 3: Combine pop1 and pop2, calculate the objective function value of each individual in the combined population, and select the top N individuals by objective function value to form the initial population of the algorithm.
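The three initialisation steps above can be sketched as follows (Python, for illustration; the function name and signature are our own).

```python
import random

def opposition_init(n, dim, lo, hi, fitness):
    """Elite reverse learning initialisation (Steps 1-3).

    Generates n random wolves (pop1), forms their opposites
    x' = lo + hi - x per dimension (Eq. (11)) to obtain pop2,
    then keeps the n fittest individuals of the merged population.
    """
    pop1 = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    pop2 = [[lo + hi - x for x in ind] for ind in pop1]
    merged = pop1 + pop2
    merged.sort(key=fitness, reverse=True)   # larger objective value is better
    return merged[:n]
```

Because every opposite of a point in [lo, hi] also lies in [lo, hi], no boundary repair is needed at this stage.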

Vertical and horizontal crossover strategy
The vertical and horizontal crossover strategy (Meng et al., 2014) perturbs the individual and global optimal solutions of the population, which can improve population diversity and prevent the algorithm from falling into a local optimum. This strategy consists of two steps: horizontal crossover and vertical crossover.

Horizontal crossover
Horizontal crossover divides the population into two subpopulations of equal size, as shown in Figure 1. Individuals from the two subpopulations are randomly and non-repeatedly paired, and each pair is crossed in the same dimension. Suppose X(i) and X(j) are two parent individuals from the two subpopulations into which a dim-dimensional population is divided. The i-th parent individual X(i) from the first subpopulation and the j-th parent individual X(j) from the second subpopulation perform the horizontal crossover operation in the d-th (1 ≤ d ≤ dim) dimension, and their offspring X_hc(i, d) and X_hc(j, d) in the d-th dimension are generated via Eqs. (12) and (13), respectively:

X_hc(i, d) = r_1 · X(i, d) + (1 − r_1) · X(j, d) + c_1 · (X(i, d) − X(j, d))  (12)
X_hc(j, d) = r_2 · X(j, d) + (1 − r_2) · X(i, d) + c_2 · (X(j, d) − X(i, d))  (13)
where r_1, r_2, c_1 and c_2 are uniformly distributed random values in the range [0, 1], and X_hc(i, d) and X_hc(j, d) are the offspring of the parent individuals X(i) and X(j) in the d-th (1 ≤ d ≤ dim) dimension, respectively. The individuals generated by the horizontal crossover must compete with their original parents: their objective function values are compared and the individual with the higher value is retained as the offspring.
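The horizontal crossover of Eqs. (12) and (13) can be sketched as below (Python; the random coefficients are drawn from U(0, 1) as the text states, and the function name is our own). The greedy parent-vs-child comparison happens outside this function.

```python
import random

def horizontal_crossover(xi, xj):
    """Horizontal crossover (Eqs. (12)-(13)) between two parents
    xi and xj taken from the two different subpopulations.

    Per dimension d: r1, r2, c1, c2 ~ U(0, 1);
      child_i[d] = r1*xi[d] + (1-r1)*xj[d] + c1*(xi[d] - xj[d])
      child_j[d] = r2*xj[d] + (1-r2)*xi[d] + c2*(xj[d] - xi[d])
    """
    child_i, child_j = [], []
    for d in range(len(xi)):
        r1, r2 = random.random(), random.random()
        c1, c2 = random.random(), random.random()
        child_i.append(r1 * xi[d] + (1 - r1) * xj[d] + c1 * (xi[d] - xj[d]))
        child_j.append(r2 * xj[d] + (1 - r2) * xi[d] + c2 * (xj[d] - xi[d]))
    return child_i, child_j
```

When the two parents are identical, the difference terms vanish and both children reduce to the common parent, which is a quick sanity check on the formulas.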

Vertical crossover
Vertical crossover is an arithmetic crossover of all individuals in the population between two different dimensions. Two dimensions, d_1 and d_2, are randomly selected for the vertical crossover operation, as shown in Figure 2. The d_1-th and d_2-th dimension elements of all individuals in the population are extracted for the vertical crossover, and the offspring X_vc(i, d_1) in the d_1-th dimension of the i-th individual is generated by Eq. (14):

X_vc(i, d_1) = r · X(i, d_1) + (1 − r) · X(i, d_2)  (14)

where r is a uniformly distributed random value in [0, 1], and X_vc(i, d_1) is the offspring of X(i, d_1) and X(i, d_2). The offspring obtained from the vertical crossover also compete with their parents, keeping the individuals with the higher objective function values.
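Eq. (14) applied to one individual can be sketched as follows (Python; function name and signature are our own). Only dimension d1 is replaced; d2 is left unchanged, matching the text's description that the d1-th offspring element is generated from the two dimensions.

```python
import random

def vertical_crossover(x, d1, d2):
    """Vertical crossover (Eq. (14)): arithmetic crossover between two
    dimensions d1 and d2 of the same individual, with r ~ U(0, 1).

    Returns a copy of x in which only element d1 is replaced by
    r*x[d1] + (1-r)*x[d2].
    """
    r = random.random()
    child = list(x)
    child[d1] = r * x[d1] + (1 - r) * x[d2]
    return child
```

Because the new element is a convex combination of two existing elements, a dimension whose search has stalled inherits information from another dimension without leaving the feasible range.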

Improved grey wolf algorithm - Crossover Grey Wolf Optimiser (CGWO)
The execution process of the CGWO algorithm is as follows: Step 1: Initialise the position vectors X_i (i = 1, 2, . . ., n) of n individual grey wolves using the elite reverse learning strategy. Also initialise the parameters A and C, the maximum number of iterations maxIter, and the probability p_1 with which the horizontal and vertical crossover operations are performed on the population.
Step 2: Calculate the objective function value of each individual grey wolf, taking the three individuals with the largest objective function values as the α, β, and δ wolves.
Step 3: Update the parameters A and C and the position vectors of the individual grey wolves in the population using Eqs. (5), (9) and (10).
Step 4: Randomly generate a probability p ∈ [0, 1]. When p > p_1, neither the horizontal nor the vertical crossover operation is performed, and the process moves to Step 5. When p ≤ p_1, horizontal crossover is first performed on the individuals of the population; the objective function values of children and parents are then compared, and the individuals with the larger values are inserted into the population. Next, vertical crossover is performed on the individuals of the population and, similarly, the individuals with the higher objective function values among offspring and parents are added to the population.
Step 5: Use Eq. (1) to calculate the objective function values of all individual grey wolves and update the objective function values and position vectors X_α(t), X_β(t), and X_δ(t) of the α, β, and δ wolves.
Step 6: When the maximum number of iterations is reached, the algorithm ends and outputs the position vector X_α(t) and the objective function value of the α wolf. Otherwise, the algorithm returns to Step 3.
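The six steps can be sketched end-to-end as follows. This is a minimal Python sketch, not the paper's Java implementation: parameter defaults are hypothetical, boundary clipping is our addition, and the pairing scheme for the horizontal crossover is simplified to adjacent pairs.

```python
import random

def cgwo(fitness, dim, lo, hi, pop_num=20, max_iter=50, p1=0.8):
    """Minimal CGWO sketch (Steps 1-6). `fitness` maps a position
    vector to the objective value to be maximised."""
    clip = lambda v: min(max(v, lo), hi)
    # Step 1: elite reverse learning initialisation
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_num)]
    pop += [[lo + hi - x for x in ind] for ind in pop]
    pop = sorted(pop, key=fitness, reverse=True)[:pop_num]
    for t in range(max_iter):
        a = 2.0 - 2.0 * t / max_iter            # a decays linearly from 2 to 0
        # Step 2: the three fittest individuals act as alpha, beta, delta
        alpha, beta, delta = sorted(pop, key=fitness, reverse=True)[:3]
        # Step 3: GWO position update around the three leaders (Eqs. (2)-(10))
        for i, wolf in enumerate(pop):
            new = []
            for d in range(dim):
                est = []
                for leader in (alpha, beta, delta):
                    A = 2.0 * a * random.random() - a
                    C = 2.0 * random.random()
                    est.append(leader[d] - A * abs(C * leader[d] - wolf[d]))
                new.append(clip(sum(est) / 3.0))
            pop[i] = new
        # Step 4: crisscross operations with probability p1, greedy selection
        if random.random() <= p1:
            for i in range(0, pop_num - 1, 2):   # horizontal crossover
                xi, xj = pop[i], pop[i + 1]
                r, c = random.random(), random.random()
                child = [clip(r * xi[d] + (1 - r) * xj[d] + c * (xi[d] - xj[d]))
                         for d in range(dim)]
                if fitness(child) > fitness(xi):
                    pop[i] = child
            for i in range(pop_num):             # vertical crossover
                if dim < 2:
                    break
                d1, d2 = random.sample(range(dim), 2)
                r = random.random()
                child = list(pop[i])
                child[d1] = r * child[d1] + (1 - r) * child[d2]
                if fitness(child) > fitness(pop[i]):
                    pop[i] = child
    # Steps 5-6: return the best (alpha) wolf after maxIter iterations
    return max(pop, key=fitness)
```

For example, maximising -x² over [-1, 1]² drives the population towards the origin while every position stays inside the search bounds.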

CGWO-based microservice combination optimisation
Each microservice in the portfolio corresponds to a set of candidate microservice instances with the same function but different QoS metric values. Each candidate microservice instance is represented as a pair, i.e. ms_i,j = <id, qos>, where id is the number of the microservice instance within its set of candidate instances (encoded as a decimal integer) and qos = <q_t, q_c, q_av, q_re> denotes the normalised QoS metric values of the instance. Optimisation of the microservice combinations is performed using the CGWO algorithm. The position vector X_i of a grey wolf individual in CGWO represents a microservice instance combination solution, as shown in Eq. (15):

X_i = (x_i,1, x_i,2, . . ., x_i,dim)  (15)

Each element x_i,j of X_i corresponds to the number ms_i,j.id of a microservice instance ms_i,j, as expressed in Eq. (16); x_i,j indicates that, for the j-th microservice of the combination, the instance numbered x_i,j in its candidate set is selected to join the service combination.
x_i,j = ms_i,j.id  (16)

Algorithm 1 gives the procedure for solving the microservice combination optimisation problem using CGWO. The number of abstract microservices in the combination is dim, meaning that the dimension of the grey wolf position vector X_i is dim; the maximum number of iterations is maxIter; the crossover probability is p_1, meaning that the horizontal and vertical crossover operations are performed on the individuals of the current population with probability p_1 during each iteration; and the population size is popNum, with the vectors X_i (1 ≤ i ≤ popNum) being initialised before the optimisation process starts. The position vector X_i of each individual grey wolf is taken as a feasible solution in the solution space; the feasible solution with the largest objective function value (considered the most suitable solution) is noted X_α, and the next best solutions are X_β and X_δ, where: • initialise(X) denotes the initialisation of n solutions using the elite reverse learning strategy; • t is the current number of iterations; • calculateFitness(X) denotes the calculation of the objective function value of each grey wolf individual in the population; the three individuals with the highest current objective function values are selected and their position vectors are assigned to X_α, X_β, and X_δ.
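The encoding of Eqs. (15) and (16) can be illustrated with a small decoder. The paper updates integer instance numbers directly; rounding and clamping continuous positions to valid ids, as done below, is our assumption for handling the real-valued GWO updates, and the function name is hypothetical.

```python
def decode(position, candidate_sets):
    """Decode a grey-wolf position vector into a microservice combination.

    Element j of `position` selects, for the j-th microservice, the
    candidate instance whose index equals x_{i,j} (Eq. (16)); continuous
    values are rounded and clamped to the valid id range (our assumption).
    """
    combo = []
    for x, instances in zip(position, candidate_sets):
        idx = min(max(int(round(x)), 0), len(instances) - 1)
        combo.append(instances[idx])
    return combo
```

For instance, with two microservices whose candidate sets have two instances each, the position (0.2, 1.7) selects instance 0 of the first microservice and instance 1 (after clamping) of the second.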
During each iteration, CGWO selects the three individuals with the highest objective function values in the population and uses them to guide the other individuals in updating their positions, while performing the horizontal and vertical crossover operations on the position vectors of the population with probability p_1 (in Algorithm 1, the individuals are paired for the horizontal crossover by randomly permuting the integers from 1 to popNum). The optimisation process stops when the number of iterations reaches the maximum value maxIter; the position vector X_α with the highest objective function value in the population is then returned as the result of the microservice combination optimisation. Each element of X_α is therefore the code of a microservice instance selected to form the combination.

Experimental analysis
The experiments evaluate the performance of the microservice combination optimisation approach using CGWO against two evaluation objectives: (1) assess the performance of the microservice instance portfolio obtained using the CGWO optimisation algorithm in terms of optimality and execution time; (2) assess the degree of improvement of the CGWO algorithm over the grey wolf algorithm in the solution selection process and in the selection results.
The experiments were executed on an Intel(R) Core(TM) i5-6500 CPU @ 3.20 GHz with 16 GB RAM, running the 64-bit Windows 10 operating system, and the programs were developed in the IntelliJ IDEA environment using the Java language.

Performance of the CGWO algorithm in finding the optimal solution
For the first objective - the performance of the CGWO algorithm in finding the optimal solution - two experiments are conducted. In the first experiment, the solution found by the CGWO algorithm is compared with the optimal solution found by the exhaustive method in terms of objective function value and execution time, to verify that CGWO greatly improves computational efficiency with little loss of accuracy. As the convergence of CGWO to the optimal solution is influenced by its tunable parameter p_1, the steps are as follows: • Step 1: perform an exhaustive search over the instance sets of each microservice in the portfolio to obtain the objective function value and execution time of the actual best combination of that portfolio; • Step 2: iteratively tune the adjustable parameter p_1 ∈ [0, 1] in CGWO and record the optimisation result of the CGWO algorithm after 400 iterations for different p_1 values; the optimisation result includes the maximum objective function value and the execution time; • Step 3: compare the maximum objective function value and execution time returned by CGWO and by the exhaustive search, and calculate the deviation between the objective function values of the two methods.
The second experiment compares the difference between the average and best fitness values of the CGWO algorithm over different numbers of iterations to verify the stability of the algorithm. The specific steps are: • Step 1: record the best configuration of parameter p_1 obtained from Experiment 1; • Step 2: optimise the microservice combination using the CGWO algorithm under this p_1 configuration and record the maximum fitness value obtained at different iterations; • Step 3: repeat Step 2 50 times and calculate the maximum fitness value and its average at the different iterations over the 50 runs; • Step 4: compare the average and maximum fitness values of the CGWO algorithm at the different iterations.

Experiment 1: verification of algorithm accuracy and computational efficiency
In order to illustrate the accuracy and computational efficiency of the CGWO algorithm at different problem complexities, this paper sets up two search spaces of different sizes and compares the experimental results with the exhaustive method, which has the highest accuracy. Table 0 shows the experimental results of the exhaustive enumeration method for microservice combinations A and B, which represent two different configurations in terms of the number of microservices in the combination and the number of instances corresponding to each microservice. For example, microservice combination A consists of six microservices with 20, 15, 10, 10, 5 and 5 instances, respectively. The search space complexity of a microservice combination is given by the number of all executable combination solutions. Similarly, microservice combination B consists of eight microservices with 15, 15, 5, 5, 5, 5, 5 and 5 instances. The microservice instance QoS metrics include service response time q_t, service cost q_c, service availability q_av and service reliability q_re, where q_t takes values in [0, 300], q_c in [0, 30], q_av in [0.7, 1] and q_re in [0.5, 1].
Tables 1 and 2 show the experimental results obtained when varying the tunable parameter p_1 for microservice combinations A and B, respectively; p_1 was set to 0.4, 0.6, 0.8 and 1. The comparison results include the average best fitness, the average execution time and the average deviation. Each row of the tables represents the average of the optimisation results obtained when the CGWO algorithm was run 50 times with the same p_1 configuration, with an initial population size of 200 and 400 iterations per optimisation. The rows highlighted in bold are the best trade-offs between fitness and time, indicating the best configuration of the adjustable parameter p_1.
By comparing the experimental results of the CGWO algorithm and the exhaustive method, two conclusions can be drawn: • The lowest deviation between the maximum objective function value found by the CGWO algorithm and the actual objective function value derived by the exhaustive method reached 0.0033, an error rate of 0.528%, across the different search space sizes; moreover, the higher the complexity of the problem search space, the smaller the deviation of the results and the higher the accuracy. • When the problem search space complexity is increased (microservice combination B), the minimum execution time of the CGWO algorithm is 1139.36 ms, close to the minimum time of 1136.74 ms for the low search space complexity and much smaller than the 42009 ms execution time of the exhaustive method, a 97.29% improvement in computational efficiency.
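The reported figures are related by simple ratios; the helpers below make this explicit. The true-optimum value of about 0.625 used in the example is our inference (the reported average optimum 0.6217 plus the deviation 0.0033), not a number stated in the text.

```python
def error_rate(avg_deviation, true_optimum):
    """Error rate as a percentage of the true objective value
    found by the exhaustive method (our interpretation)."""
    return avg_deviation / true_optimum * 100.0

def efficiency_gain(t_exhaustive_ms, t_cgwo_ms):
    """Relative reduction in execution time versus the exhaustive
    method, as a percentage."""
    return (t_exhaustive_ms - t_cgwo_ms) / t_exhaustive_ms * 100.0
```

With the reported values, a deviation of 0.0033 against an inferred optimum of 0.625 reproduces the 0.528% error rate, and 1139.36 ms against 42009 ms reproduces the 97.29% efficiency improvement.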
Based on these two conclusions, it is clear that the CGWO algorithm can accurately explore the solution space in a relatively short time. In addition, by varying the adjustable parameter p_1 and considering the best fitness value and execution time of the microservice combination, the best parameter configuration is found to be p_1 = 0.8.

Experiment 2: algorithm stability evaluation
In order to verify the stability of the CGWO optimisation performance, 50 optimisation experiments were conducted with the CGWO algorithm for each of microservice combination A (low search space complexity) and microservice combination B (high search space complexity), recording the average and best objective function values over the iterations of these 50 experiments; the results are shown in Figures 3 and 4. This experiment sets the adjustable parameter p_1 = 0.8, an initial population size of 200 and a maximum number of iterations of 500. The CGWO algorithm was run 50 times for each of microservice combinations A and B, and the difference between the average and maximum values of the objective function at different iterations was recorded and compared across the 50 experiments. Figures 3 and 4 show that, as the number of iterations changes, the difference between the average and maximum values of the objective function remains small and the two curves follow consistent trends; as the number of iterations increases, the average objective function value over the repeated experiments approaches the best value. This indicates that the performance of the algorithm stabilises after convergence and that the CGWO optimisation algorithm proposed in this paper has high stability.

Degree of improvement of the CGWO algorithm
Experiment 3: In order to examine the degree of improvement of the CGWO algorithm relative to the grey wolf algorithm (the second objective), the two algorithms were compared using the average optimal objective function value and the average deviation as evaluation metrics. The CGWO and GWO algorithms were used to optimise microservice combinations A and B. The same initial population was used for each optimisation, with an initial population size of 200, 500 iterations, and the parameter p_1 of the CGWO algorithm set to 0.8; a total of 50 experiments were performed. The average optimal objective function value and the average deviation of the two algorithms over the 50 experiments were computed and compared. The results are shown in Tables 3 and 4 for microservice combinations A and B, respectively. Meanwhile, the average fitness values of the two algorithms for combinations A and B at different numbers of iterations are calculated, and their optimisation effects at different numbers of iterations are compared in Figures 5 and 6.
The average optimal objective function values obtained by the CGWO algorithm for microservice combinations A and B are 0.6175 and 0.6217, respectively, higher than those of the GWO algorithm (0.6049 and 0.6074, respectively), indicating that CGWO computes better solutions. Meanwhile, the average deviations between the optimal objective function values found by the CGWO algorithm and the actual optimal values are 0.0065 and 0.0033, respectively, smaller than those of the GWO algorithm (0.0191 and 0.0176, respectively), meaning that CGWO improves the accuracy of finding the optimal value by 65.97% and 81.25%, respectively. This indicates that the improvement strategies of the CGWO algorithm help it escape the current local optimum and find the global optimal solution during iterative optimisation. Also, as can be seen in Figures 5 and 6, the CGWO algorithm is consistently better than the GWO algorithm at all iteration counts. In summary, the CGWO algorithm is more effective than the GWO algorithm in finding the optimal solution for the microservice combination.
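The 65.97% and 81.25% accuracy improvements follow from the relative reduction in average deviation; this reading of the metric is ours, but it reproduces both reported figures exactly from the tabulated deviations.

```python
def accuracy_improvement(dev_gwo, dev_cgwo):
    """Relative reduction in the average deviation from the true
    optimum, as a percentage: how much closer CGWO gets than GWO."""
    return (dev_gwo - dev_cgwo) / dev_gwo * 100.0
```

For combination A, (0.0191 - 0.0065) / 0.0191 gives about 65.97%; for combination B, (0.0176 - 0.0033) / 0.0176 gives 81.25%.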

Conclusion
An improved grey wolf optimisation algorithm, CGWO, is proposed to find the combination of microservice instances with the largest objective function value. CGWO is based on the standard grey wolf algorithm (Yang et al., 2019) and uses the elite reverse learning strategy to initialise the grey wolf population, ensuring population diversity for the initial iteration; the vertical and horizontal crossover strategies are then employed during the iterative optimisation process. The two crossover operations avoid premature convergence of the optimisation process and prevent the final solution from being a local optimum: the vertical crossover can make dimensions whose optimisation has stalled jump out of a local maximum, and the horizontal crossover expands the exploration space of the optimal solution. Finally, an experimental validation of the proposed CGWO algorithm applied to microservice combinatorial optimisation is conducted for two different problem sizes. Firstly, the CGWO algorithm is compared with the exhaustive method, which has the highest accuracy, to verify the accuracy and computational efficiency of CGWO in finding the optimal solution; then the CGWO algorithm is compared with the GWO algorithm to verify its superiority and stability in searching for the optimal solution. In future work, the increased time complexity associated with the algorithmic improvements will be further optimised. Some research has suggested that incorporating neural networks may bring better results (Huang, Chen, et al., 2020), but this has not yet been proven effective. Other impacts of the actual deployment and operating environment of microservices on the non-functional properties of the microservice portfolio will also be considered, such as the reliability and security of containers and network transmission latency (Attaoui et al., 2022; Zhang, Zhu, et al., 2022; Zhang, Zhu, et al., 2021; Zhang, Wang, et al., 2019).

Figure 3. Diagram of the microservice portfolio A objective function.

Figure 4. Diagram of the microservice portfolio B objective function.

Figure 5. Fitness values of the optimised microservice combination A for different iterations of the two algorithms.

Figure 6. Fitness values of the optimised microservice combination B for different iterations of the two algorithms.

Table 0. Experimental results of the exhaustive enumeration method for microservice combinations A and B.

Table 1. Experimental results of CGWO optimisation for microservice portfolio A.

Table 2. Experimental results of CGWO optimisation for microservice portfolio B.

Table 3. Comparison of experiments for the optimisation of microservice portfolio A.

Table 4. Experimental comparison of microservice portfolio B optimisation.