Wuhan Univ. J. Nat. Sci.
Volume 29, Number 4, August 2024
Page(s): 323-337
DOI: https://doi.org/10.1051/wujns/2024294323
Published online: 04 September 2024
Computer Science
CLC number: TP183
Multi-Strategy Improvement of Sparrow Search Algorithm for Cloud Manufacturing Service Composition
1 Tenth Research Institute, China Electronics Technology Group Corporation, Chengdu 610000, Sichuan, China
2 Department of Computer Science, Nanjing University of Posts and Telecommunications, Nanjing 210023, Jiangsu, China
3 Research Institution of Information Technology, Tsinghua University, Beijing 100084, China
4 School of Computer Science, Wuhan University, Wuhan 430072, Hubei, China
† Corresponding author. Email: gfzhou@whu.edu.cn
Received: 29 February 2024
In existing research, the optimization of algorithms applied to cloud manufacturing service composition based on the quality of service often suffers from decreased convergence rates and solution quality due to single-population searches in fixed spaces and insufficient information exchange. In this paper, we introduce an improved Sparrow Search Algorithm (ISSA) to address these issues. The fixed solution space is divided into multiple subspaces, allowing for parallel searches that expedite the discovery of target solutions. To enhance search efficiency within these subspaces and significantly improve population diversity, we employ multiple group evolution mechanisms and chaotic perturbation strategies. Furthermore, we incorporate adaptive weights and a global capture strategy based on the golden sine to guide individual discoverers more effectively. Finally, differential Cauchy mutation perturbation is utilized during sparrow position updates to strengthen the algorithm's global optimization capabilities. Simulation experiments on benchmark problems and service composition optimization problems show that the ISSA delivers superior optimization accuracy and convergence stability compared to other methods. These results demonstrate that our approach effectively balances global and local search abilities, leading to enhanced performance in cloud manufacturing service composition.
Key words: cloud manufacturing / service composition optimization / quality of service / sparrow search algorithm
Cite this article: ZHOU Liliang, LI Ben, YU Qing, et al. Multi-Strategy Improvement of Sparrow Search Algorithm for Cloud Manufacturing Service Composition[J]. Wuhan Univ J of Nat Sci, 2024, 29(4): 323-337.
Biography: ZHOU Liliang, male, Senior engineer, research direction: avionics information systems and sensor management. Email: zhoull@163.com
Foundation item: Supported by the National Natural Science Foundation of China (62272214)
© Wuhan University 2024
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
0 Introduction
Cloud manufacturing is a new paradigm of service-oriented manufacturing, in which manufacturing resources and capabilities are virtualized and encapsulated into manufacturing services^{[1]}. Service composition and optimization selection (SCOS)^{[2]} is the key technology for realizing resource and capacity sharing in a cloud manufacturing system. The main process is to search the candidate service pool for a service for each subtask in the cloud manufacturing system, then generate composite services and optimize them^{[3]}. The optimization goal of the SCOS problem is therefore to find, within feasible time, the optimal composite service that not only meets the functional requirements of users but also has the optimal global quality of service (QoS)^{[4]}.
In recent years, many scholars have explored cloud manufacturing service composition and achieved significant progress. Zhou et al^{[5]} proposed a mixed artificial bee colony method, combining an Archimedean distribution with an artificial bee colony to guide the population. This method enhances local search performance but risks falling into local optima during late convergence due to its fixed neighborhood modes. Jin et al^{[6]} improved the whale algorithm's local exploration by using a uniform mutation operator and adaptive probability fusion with the Levy flight strategy, but local information feedback can still lead to local optima. Jiang et al^{[7]} introduced a variable-length coding scheme with structural information to enhance the local search ability of genetic algorithms, though the increased crossover and mutation operations caused individuals to aggregate frequently near the current optimal solutions, leading to local convergence. Yang et al^{[8]} enhanced the Grey Wolf Optimizer (GWO) by adjusting the control factor and adaptively sharing information, improving local search performance. Liao et al^{[9]} added adaptive crossover probability and random disturbance to the krill herd (KH) algorithm, improving its local search ability. Zeng et al^{[10]} proposed a hybrid teaching-learning optimization algorithm, integrating crossover optimization in the learning stage to balance local search performance. Liao et al^{[11]} used a variable field-of-view strategy and a genetic-algorithm mutation strategy to enhance the polar bear algorithm's local search performance.
The sparrow search algorithm (SSA)^{[12]} has unique search mechanisms and global optimization capabilities that are particularly effective for handling complex tasks with many different candidate service sets. Unlike other commonly used optimization algorithms such as the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and Ant Colony Optimization (ACO), SSA mimics the foraging behavior of sparrows, allowing it to perform effective global searches and local optimizations. Therefore, in this paper, SSA is first introduced into the QoS-aware cloud manufacturing SCOS problem to search for optimal solutions. However, when SSA is applied to composition optimization scenarios, the sparrow population needs to search the solution space frequently and randomly to find candidate services that meet the task requirements. To address this, a multi-population evolution mechanism is designed, which saves the optimal solutions found in the divided subpopulations in a record layer. To enhance the effectiveness of individual local search strategies in subpopulations, Logistic chaotic mapping is used to improve population diversity in the feasible domain, helping subgroups escape local optima.
To reduce the impact of low population richness and insufficient search traversal on algorithm optimization, and to ensure the quality of the final optimal service composition solution, this paper proposes an improved SSA (ISSA) based composition optimization method for manufacturing cloud services. This method updates the positions of the discoverers in the sparrow population through a golden sine strategy with dynamic adaptive weights, better guiding the group to search for the best food source. A differential Cauchy mutation strategy is then used to further improve the search ability of the sparrow population. Finally, the global optimal sparrow individual position is updated through a greedy strategy and restored to the corresponding optimal service composition solution, ultimately improving the quality of the service composition solution. Extensive experiments demonstrate the effectiveness of the proposed algorithm in solving multiple benchmark functions and cloud manufacturing service composition problems of different scales.
1 Problem Description and Mathematical Model
1.1 Cloud Manufacturing Service Composition Problem Description
Figure 1 illustrates the main process of single-objective cloud manufacturing service composition based on service quality. A complex manufacturing task $T$ is usually decomposed into a series of subtasks $\mathrm{ST}$, arranged in a logical relationship, each corresponding to a candidate manufacturing service set (CMSS). In the SCOS problem, the cloud manufacturing system searches the candidate service pool for the corresponding Manufacturing Cloud Service (MCS) for each subtask, selects a candidate service that meets the needs of each subtask in the multidimensional cloud manufacturing setting, and aggregates them into a Composite Manufacturing Service (CMS) that meets the task requirements. This article focuses on the service composition and optimization selection stages of this process, and evaluates the quality of the final composite service through multidimensional QoS attribute indicators. The optimal composition path is selected with the goal of maximizing QoS. Each cloud service in the cloud system contains conflicting and diverse QoS attributes, that is,
$\mathrm{QoS}\left(\mathrm{MCS}_{i,j}\right)=\left\{{q}_{1}\left(\mathrm{MCS}_{i,j}\right),{q}_{2}\left(\mathrm{MCS}_{i,j}\right),\cdots ,{q}_{r}\left(\mathrm{MCS}_{i,j}\right)\right\}$
Fig.1 Cloud manufacturing services composition main flowchart 
where $\mathrm{MCS}_{i,j}$ represents the jth cloud service in the ith candidate service set (CMSS) selected from the cloud pool, and $r$ represents the number of QoS attribute dimensions, covering manufacturing time, cost, reliability, and transaction quality.
To meet changing task requirements, optimizing service quality attributes is essential. Swarm intelligence algorithms, based on biological population behavior, can be highly effective in this area. However, when faced with complex service composition optimization problems, these algorithms often suffer from reduced convergence rates and solution quality due to their fixed search ranges and localized information transmission. The SSA, with its characteristics of population guidance, following, and early warning, can ensure good local search performance and convergence speed. To address the issues of low population diversity and insufficient search comprehensiveness, this paper proposes a multi-strategy SSA based on chaotic mapping. This approach adapts to complex solution spaces and improves the quality of service composition solutions.
1.2 The Classical QoS Evaluation
Firstly, the attribute indicators of manufacturing time, cost, and reliability are selected, and then user satisfaction and deliverable quality indicators are further integrated. A manufacturing cloud service composition has four common structures, namely sequential, parallel, selective, and cyclic^{[13]}, all of which can be simplified into a sequential structure. Therefore, this article uses the aggregation method under the sequential structure for each attribute, as described below.
Manufacturing time $T$: mainly covers the waiting time for processing ${T}_{\mathrm{wait}}$, customized production time ${T}_{\mathrm{manu}}$, and logistics transportation time ${T}_{\mathrm{logis}}$, namely $T={T}_{\mathrm{wait}}+{T}_{\mathrm{manu}}+{T}_{\mathrm{logis}}$. Accumulation is performed through the aggregate function $\sum_{i=1}^{n}{q}_{T}(\mathrm{MCS}_{i})$.
Manufacturing cost $C$: mainly covers the processing cost ${C}_{\mathrm{manu}}$ and transportation cost ${C}_{\mathrm{logis}}$, namely $C={C}_{\mathrm{manu}}+{C}_{\mathrm{logis}}$. Accumulation is performed through the aggregate function $\sum_{i=1}^{n}{q}_{C}(\mathrm{MCS}_{i})$.
Service reliability $R$: $R={N}_{1}/({N}_{0}+{N}_{1})$, where ${N}_{0}$ and ${N}_{1}$ represent the numbers of failed and successful invocations, respectively. Multiplication is performed through the aggregate function $\prod_{i=1}^{n}{q}_{R}(\mathrm{MCS}_{i})$.
User satisfaction $S$: $S=\sum_{i=1}^{n}{S}_{i}/n$, where ${S}_{i}$ represents the score given by the ith service demander based on historical service results, and $n$ represents the total number of users who have evaluated the composite service solution. Averaging is performed through the aggregate function $\frac{1}{n}\sum_{i=1}^{n}{q}_{S}(\mathrm{MCS}_{i})$.
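As a concrete illustration, the four aggregation rules above can be sketched in Python. The function name and dictionary keys are illustrative assumptions, not an API from the paper.

```python
# Sketch of sequential-structure QoS aggregation for a composite service:
# time and cost are summed, reliability is multiplied, satisfaction averaged.

def aggregate_qos(services):
    """services: list of dicts holding raw per-service attribute values."""
    total_time = sum(s["T_wait"] + s["T_manu"] + s["T_logis"] for s in services)
    total_cost = sum(s["C_manu"] + s["C_logis"] for s in services)
    reliability = 1.0
    for s in services:
        reliability *= s["N1"] / (s["N0"] + s["N1"])  # success ratio R
    satisfaction = sum(s["S"] for s in services) / len(services)
    return {"T": total_time, "C": total_cost, "R": reliability, "S": satisfaction}
```

For example, two chained services with success ratios 0.9 and 1.0 yield a composite reliability of 0.9, while their times and costs simply add.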
Secondly, due to the different forms of QoS expression and quantification units of various manufacturing services, dimensionality elimination and normalization^{[14]} should be carried out in advance for them, and the specific calculation method used is as follows:
$\mathrm{QoS}\left(\mathrm{MCS}_{i,j}\right)=\sum_{{q}_{j}\in {Q}^{+}}\frac{{q}_{j}\left(\mathrm{MCS}_{i}^{j}\right)-\min {q}_{j}}{\max {q}_{j}-\min {q}_{j}}{\omega}_{i}+\sum_{{q}_{j}\in {Q}^{-}}\frac{\max {q}_{j}-{q}_{j}\left(\mathrm{MCS}_{i}^{j}\right)}{\max {q}_{j}-\min {q}_{j}}{\omega}_{i}$(1)
where $\max {q}_{j}$ and $\min {q}_{j}$ represent the maximum and minimum values of the jth attribute among the candidate services of the ith manufacturing cloud service set, respectively, and ${Q}^{+}$ and ${Q}^{-}$ represent the positive and negative attribute sets, respectively.
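The min-max normalization in Eq. (1) can be sketched as a small helper; the function name and the degenerate-set convention are assumptions for illustration.

```python
# Sketch of the per-attribute normalization in Eq. (1): map a raw value
# onto [0, 1] so that larger is always better, flipping the direction
# for negative (cost-type) attributes.

def normalize(value, q_min, q_max, positive=True):
    if q_max == q_min:          # degenerate candidate set: treat as neutral
        return 1.0
    if positive:                # e.g. reliability, satisfaction
        return (value - q_min) / (q_max - q_min)
    return (q_max - value) / (q_max - q_min)   # e.g. time, cost
```

After this step, every attribute contributes on the same dimensionless [0, 1] scale regardless of its original unit.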
1.3 Comprehensive Mathematical Model
According to the analysis in Section 1.1, this paper constructs a problem model with the optimization goal of maximizing QoS, and uses a simple weighting method to obtain the comprehensive utility value (objective function), which can be calculated by formula (2).
$\max \left(\mathrm{QoS}\right)=\max \sum_{k=1}^{r}{\omega}_{k}\times \mathrm{Norm}{Q}_{k},\quad \sum_{k=1}^{r}{\omega}_{k}=1$(2)
where ${\omega}_{k}$ is the weight of the kth evaluation indicator with ${\omega}_{k}\in [0,1]$, $\mathrm{Norm}{Q}_{k}$ is the normalized value of the kth QoS attribute, and $r$ is the number of indicators. The weights can be specified based on user preference or by the system.
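The comprehensive utility of Eq. (2) is then just a weighted sum of the normalized attributes; this minimal sketch assumes the weights are supplied already summing to 1, as the constraint requires.

```python
# Sketch of the comprehensive utility in Eq. (2): a weighted sum of
# normalized QoS values, with weights summing to 1.

def utility(norm_q, weights):
    """norm_q and weights are equal-length sequences; weights sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(w * q for w, q in zip(weights, norm_q))
```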
Considering the specific requirements of the production and manufacturing process and the characteristics of the service composition problem, the following constraints are provided:
$\left[\begin{array}{c}\frac{1}{T\left(\mathrm{MCS}_{i,j}\right)}\\ \frac{1}{C\left(\mathrm{MCS}_{i,j}\right)}\\ R\left(\mathrm{MCS}_{i,j}\right)\\ S\left(\mathrm{MCS}_{i,j}\right)\\ D\left(\mathrm{MCS}_{i,j}\right)\end{array}\right]\ge \left[\begin{array}{c}\frac{{q}_{T}}{{T}_{\mathrm{base}}}\\ \frac{{q}_{C}}{{C}_{\mathrm{base}}}\\ {q}_{R}{R}_{\mathrm{base}}\\ {q}_{S}{S}_{\mathrm{base}}\\ {q}_{D}{D}_{\mathrm{base}}\end{array}\right],\quad \forall j=1,\cdots ,N$(3)
where ${r}_{i,j}$ represents the jth candidate service in the execution path of the ith composite service, ${T}_{\mathrm{base}}$, ${C}_{\mathrm{base}}$, ${R}_{\mathrm{base}}$, ${S}_{\mathrm{base}}$, and ${D}_{\mathrm{base}}$ respectively represent the worst-case levels that users can accept in terms of execution time, cost, reliability, satisfaction, and deliverable quality, and $N$ indicates the total number of cloud services.
2 The Basic Concept of SSA
SCOS is a typical dynamic and uncertain composition optimization problem. Swarm intelligence algorithms are well-suited to solving such problems due to their adaptability and robustness. Recently, a new intelligent algorithm, the Sparrow Search Algorithm (SSA)^{[12]}, has garnered significant attention.
2.1 The Principle of Algorithm and Mathematical Model
SSA establishes a discoverer-joiner mathematical model based on the foraging process of sparrows. The discoverer is the first to gain energy in the search for resources, and at the same time guides the followers to the foraging location. The position update formula is as follows:
${X}_{i,j}^{t+1}=\left\{\begin{array}{ll}{X}_{i,j}^{t}\cdot \exp \left(\dfrac{-i}{\alpha \cdot {T}_{\max}}\right), & {R}_{2}<\mathrm{ST}\\ {X}_{i,j}^{t}+Q\cdot L, & {R}_{2}\ge \mathrm{ST}\end{array}\right.$(4)
where ${R}_{2}\in [0,1]$ and $\mathrm{ST}\in [0.5,1]$ represent the warning value and safety threshold, respectively, with ${R}_{2}$ generated uniformly at random within $(0,1)$; $Q$ is a random number following the normal distribution; and $L$ is a $1\times d$ matrix in which each element is 1.
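A minimal sketch of the discoverer update in Eq. (4) for one individual, following the standard SSA formulation; function and parameter names are illustrative assumptions, with `Q` drawn from N(0, 1) and `L` taken as a vector of ones.

```python
import math, random

# Sketch of Eq. (4): discoverer position update.
# R2 < ST: no predator nearby, contract the position multiplicatively;
# R2 >= ST: danger detected, jump by a normally distributed step.
def update_discoverer(x, i, t_max, alpha, R2, ST):
    """x: position vector of the i-th discoverer (i is 1-indexed)."""
    if R2 < ST:
        factor = math.exp(-i / (alpha * t_max))
        return [xj * factor for xj in x]
    Q = random.gauss(0.0, 1.0)       # one normal step shared across dims
    return [xj + Q for xj in x]      # L is a vector of ones
```

Because `L` is all ones, the fleeing branch shifts every dimension by the same normal step `Q`.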
Followers closely follow the discoverer to complete the food search process, and at the same time try their best to grab the discoverer's resources to improve their fitness value. The mathematical model is shown in formula (5):
${X}_{i,j}^{t+1}=\left\{\begin{array}{ll}Q\cdot \exp \left(\dfrac{{X}_{\mathrm{worst}}^{t}-{X}_{i,j}^{t}}{{i}^{2}}\right), & i>\dfrac{N}{2}\\ {X}_{\mathrm{pbest}}^{t+1}+\left|{X}_{i,j}^{t}-{X}_{\mathrm{pbest}}^{t+1}\right|\cdot {A}^{+}\cdot L, & i\le \dfrac{N}{2}\end{array}\right.$(5)
where ${X}_{\mathrm{worst}}^{t}$ and ${X}_{\mathrm{pbest}}^{t+1}$ correspond to the global worst position in the current iteration and the local best position in the next iteration, respectively, and ${A}^{+}={A}^{\mathrm{T}}(A{A}^{\mathrm{T}}{)}^{-1}$, where $A$ is a $1\times d$ matrix whose elements are randomly assigned a value of 1 or -1.
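The follower update in Eq. (5) can be sketched as follows. As a simplifying assumption, the effect of $A^{+}\cdot L$ is approximated by an independent random ±1 sign per dimension, a common implementation shortcut rather than the paper's exact operator.

```python
import math, random

# Sketch of Eq. (5): follower position update.
# i > N/2: a starving follower flies toward a new region;
# i <= N/2: the follower forages near the best producer's position.
def update_follower(x, i, n, x_worst, x_pbest):
    if i > n / 2:
        Q = random.gauss(0.0, 1.0)
        return [Q * math.exp((w - xj) / (i * i)) for w, xj in zip(x_worst, x)]
    return [pb + abs(xj - pb) * random.choice((-1.0, 1.0))   # A+ . L as +/-1
            for pb, xj in zip(x_pbest, x)]
```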
The sparrows responsible for monitoring and warning account for only 10% to 20% of the whole population. Their task is to spread the warning signal to the entire population, leading the population to quickly move to a safe area or to randomly approach other sparrow individuals. Their position updates are shown in equation (6):
${X}_{i,j}^{t+1}=\left\{\begin{array}{ll}{X}_{\mathrm{gbest}}^{t}+\beta \cdot \left|{X}_{i,j}^{t}-{X}_{\mathrm{gbest}}^{t}\right|, & {f}_{i}\ne {f}_{g}\\ {X}_{i,j}^{t}+K\cdot \left(\dfrac{\left|{X}_{i,j}^{t}-{X}_{\mathrm{worst}}^{t}\right|}{({f}_{i}-{f}_{w})+\epsilon}\right), & {f}_{i}={f}_{g}\end{array}\right.$(6)
where ${X}_{\mathrm{gbest}}^{t}$ represents the position of the globally optimal individual, $\beta$ is a random number following a normal distribution, $K$ is a step-control parameter drawn uniformly from $[-1,1]$, ${f}_{g}$ and ${f}_{w}$ are the optimal and worst fitness values, respectively, and $\epsilon$ is a small constant that regulates the wandering area of the sparrows and avoids division by zero.
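The scout (early-warning) update in Eq. (6) can be sketched in the same style; names and the default `eps` are illustrative assumptions.

```python
import random

# Sketch of Eq. (6): scout position update.
# f_i != f_g: an edge sparrow moves toward the global best;
# f_i == f_g: a centre sparrow walks relative to the worst position.
def update_scout(x, x_gbest, x_worst, f_i, f_g, f_w, eps=1e-50):
    if f_i != f_g:
        beta = random.gauss(0.0, 1.0)    # normal step-size control
        return [gb + beta * abs(xj - gb) for gb, xj in zip(x_gbest, x)]
    K = random.uniform(-1.0, 1.0)        # uniform step in [-1, 1]
    return [xj + K * (abs(xj - w) / ((f_i - f_w) + eps))
            for xj, w in zip(x, x_worst)]
```

Note that a sparrow already sitting at the global best (or at the worst position in the second branch) stays put, since the distance term vanishes.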
2.2 Analysis of Algorithm Advantages and Disadvantages
As a new heuristic algorithm, the SSA features fast convergence speed, easy implementation, few adjustable parameters, and excellent local search performance. It has been successfully applied to various engineering optimization problems in recent years, such as three-dimensional path planning for unmanned aerial vehicles (UAVs)^{[15]} and reactor parameter optimization^{[16]}.
However, there are still some shortcomings when applying SSA to service composition optimization. One issue is decreased population diversity. The random initialization of the sparrow population in the search space results in an uneven distribution, limiting full utilization of environmental information. Unlike PSO and ACO, which move gradually towards the optimal solution, SSA converges directly on the current optimal solution, leading to a rapid decrease in population diversity in the later stages. Additionally, the discoverer location updates provide insufficient search traversal. Discoverers occupy positions with high current fitness values and lead the population based on the best-fitness individuals. This method considers only the current and optimal positions, without incorporating an individual's previous experience, resulting in premature convergence and a limited search scope. These issues contribute to poor convergence accuracy and solution quality, especially for high-dimensional problems.
3 Multi-Strategy ISSA Based on Chaotic Mapping
3.1 Overview of the ISSA
Definition 1 A sparrow individual in the ISSA can be represented by the quintuple $\mathrm{S}=\{t, T, f(X), {X}_{\mathrm{best}}^{*}, f({X}_{\mathrm{best}}^{*})\}$, with $X=\{{x}_{1},{x}_{2},\cdots ,{x}_{d},\cdots ,{x}_{n}\}$, where ${x}_{d}$ represents the position of the individual in the $d$-dimensional solution space, $f(X)$ is the fitness function value, which judges the quality of each individual across all distributed populations of the sparrow algorithm, and ${X}_{\mathrm{best}}^{*}$ and $f({X}_{\mathrm{best}}^{*})$ are the optimal individual obtained after updating and its fitness value. The sparrow population is an $n\times d$ vector composed of $n$ individual sparrows, and each sparrow has the same iteration number, fitness function, and iterative position-update process.
Definition 2 The ith sparrow is represented as ${x}_{i}=({x}_{i,1}, {x}_{i,2},\cdots , {x}_{i,j},\cdots , {x}_{i,N})$, where ${x}_{i,j}$ represents the serial number of the manufacturing cloud service selected from the jth candidate service set in the composite service, with its range limited to $\{{x}_{i,j}:0\le {x}_{i,j}\le {K}_{j}\}$, and ${K}_{j}$ represents the total number of cloud services in the jth subtask's candidate service set.
Specifically, this article visualizes the mapping relationship between candidate cloud service sets and sparrow individuals in Fig. 2. If the set of selected CMSS is composed of $\{\mathrm{MCS}_{1,4}, \mathrm{MCS}_{2,6}, \mathrm{MCS}_{3,5}, \mathrm{MCS}_{4,1}, \mathrm{MCS}_{5,2}, \mathrm{MCS}_{6,3}\}$, the composite solution contains six different types of selected services. The first position indicates that subtask 1 is executed by the fourth manufacturing service instance in its corresponding candidate service set, and similarly for the other positions. The final corresponding individual coordinates can then be represented as the real number array (4, 6, 5, 1, 2, 3).
Fig.2 Mapping relationship between composite services and sparrow individuals 
As a whole, the ISSA includes three major improvement designs: a multi-population evolution mechanism, dynamic global search strategies for discoverers, and mutation perturbations for position updates. Specifically, ① Considering the mapping relationship between sparrow individuals and composite services, the original fixed search space is divided into subspaces; the sparrow population searches for optimal solutions in each subspace in parallel and combines them into the required best solution. To reasonably allocate the population for searching target solutions in the subspaces and improve traversal accuracy, multi-population evolution and chaotic disturbance search mechanisms are used. ② In response to discoverers blindly gathering too early, a global capture strategy with adaptive weights and the golden sine is introduced to better stabilize the exploration performance of the algorithm. ③ To solve the problem of insufficient search scope when individuals update their positions around elite individuals, crossover and mutation based on the differential strategy enrich the diversity of the sparrow population. Meanwhile, the Cauchy operator is used in the mutation process to continuously perturb the optimal solution during evolution, further enhancing the global optimization ability.
3.2 Multi-Group Evolution Mechanism
From the overview of the ISSA in Section 3.1, it can be seen that when selecting suitable cloud services from a large number of candidate service sets, the ISSA abandons the traditional single-search method in a fixed solution space. Instead, it divides the population into different subspaces corresponding to the actual number of candidate service sets and searches in parallel for the current optimal solution. This poses a challenge to the development and exploration process of the sparrow population in the algorithm. To reasonably allocate the population for searching target solutions in subspaces and improve optimization efficiency, this paper designs a mechanism for parallel evolution of multiple populations. The search strategy is described as follows:
1) Subgroup Division: The sparrow population is evenly divided into n subgroups $(\mathrm{s}{\mathrm{b}}_{\mathrm{1}},\mathrm{s}{\mathrm{b}}_{\mathrm{2}},\cdots ,\mathrm{s}{\mathrm{b}}_{n})$, each corresponding to m candidate service sets divided by tasks. This ensures that a large sparrow population can be divided into several independent small populations, improving the parallel search efficiency of individuals in different subgroups.
2) Parallel Search: Each subgroup performs a parallel search for the current optimal solution within its designated subspace. The optimal individual ${P}_{b}$ in each sparrow subgroup guides other individuals in the group by sharing its advantageous experience. This ensures that feasible solutions can be found among the numerous candidate solutions.
3) Optimal Individual Retention: After the evolution of any subgroup, the optimal sparrow individual ${P}_{b}$ in the subgroup is retained, and its position is stored in their respective message blocks $({M}_{\mathrm{1}},\text{}{M}_{\mathrm{2}},\cdots ,\text{}{M}_{n})$. As indicated by step ① in Fig. 3, the positions of optimal individuals from all subgroups are combined into a preferred decision layer.
Fig.3 Schematic diagram of the evolution mechanism of multiple populations 
4) Preferred Decision Layer: This layer, indicated by step ② in Fig. 3, is responsible for recording feasible solutions selected from different subgroups. It contains the optimal individuals of all subgroups, ensuring that the best solutions are retained and refined throughout the iterations.
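Steps 1)-4) above can be condensed into a short sketch: split the population into subgroups, keep each subgroup's best individual in its message block, and gather those bests into the preferred decision layer. The fitness function is a placeholder passed in by the caller; all names are illustrative assumptions, not the paper's code.

```python
# Sketch of the multi-group evolution bookkeeping: subgroup division plus
# one retained best individual (message block M_k) per subgroup.

def evolve_multigroup(population, n_groups, fitness):
    """Return (subgroups, decision_layer), one best individual per subgroup."""
    size = len(population) // n_groups
    subgroups = [population[k * size:(k + 1) * size] for k in range(n_groups)]
    decision_layer = [max(sb, key=fitness) for sb in subgroups]  # M_1..M_n
    return subgroups, decision_layer
```

In the full algorithm, each subgroup would run its own position updates between calls, and the decision layer would be compared across iterations to detect stagnation.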
To ensure the effectiveness of individual local search strategies in the subpopulations, Logistic chaotic mapping is used to improve population diversity in the feasible domain, helping subgroups escape local optima and ensuring that the algorithm continues to search for feasible candidate services for on-demand composition.
Based on the information recorded in the preferred decision layer, if the value of the ith record board does not change for six consecutive iterations, the subgroup is likely performing redundant iterations near a local extremum due to decreased individual competitiveness, having ultimately fallen into the local extreme range. To encourage each subgroup to continue searching for feasible solutions and to avoid premature convergence, better-performing individuals are selected from the preferred decision layer to guide the underperforming subgroup, which learns from their experience multiple times, corresponding to position ③ in Fig. 3. Position changes are then performed by formulas (4)-(6) to better optimize the search area.
Since the dominant sparrows from all subgroups are stored in the preferred decision layer, it is necessary to guide the individuals in an underperforming subgroup to learn from the search experiences of these dominant individuals. This guidance helps other individuals in the subgroup continue absorbing information from better-performing individuals, ensuring that all populations actively engage in iterative evolution. If there are no better individuals in the preferred layer to provide effective information and a subgroup falls into a local optimum again, the subgroup will experience evolutionary stagnation. To address this, chaotic sequences are used to perturb the subgroup. Chaotic sequence mapping is random and sensitive to initial values. Logistic mapping, a typical chaotic map, is widely used in optimization search problems due to its uniform sequence distribution:
${Y}_{n+\mathrm{1}}=\mu {Y}_{n}\left(\mathrm{1}{Y}_{n}\right)$(7)
where ${Y}_{n}$ is a chaotic sequence, which indicates the nth chaotic sequence value generated, and $\mathrm{0}<\mu \le \mathrm{4}$ is the control parameter of the chaotic sequence.
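Eq. (7) can be iterated directly to generate the chaotic sequence; the function name is an illustrative assumption. With $\mu = 4$ and an initial value in $(0, 1)$, the sequence stays in $[0, 1]$ and is highly sensitive to the starting point.

```python
# Sketch of Eq. (7): generate n values of the Logistic chaotic sequence
# Y_{k+1} = mu * Y_k * (1 - Y_k).

def logistic_sequence(y0, n, mu=4.0):
    seq, y = [], y0
    for _ in range(n):
        y = mu * y * (1.0 - y)
        seq.append(y)
    return seq
```

Two initial values differing by 1e-7 produce sequences that diverge completely within a few dozen iterations, which is exactly the sensitivity the perturbation strategy exploits.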
By integrating chaotic perturbations, the ISSA enhances the exploration capabilities of subgroups, helping them escape local optima and maintain diversity in the search process. This method ensures a more thorough and effective optimization, ultimately improving the quality of the service composition solution.
As shown in Fig. 4, the horizontal axis represents the control parameter $\mu$ and the vertical axis represents the range of the Logistic sequence values. When the control parameter $\mu$ is varied while the initial value is fixed, the chaotic state is better when $\mu \in [3.85,4]$, improving as $\mu$ approaches 4. Therefore, to enable sparrow individuals trapped in local extreme regions to explore other effective search regions as much as possible through perturbation, traversing more service components, this article sets the parameter $\mu$ to 4.
Fig.4 Logistic chaotic map bifurcation graph 
In order to enhance the search diversity of the sparrow population, all the individuals in a subgroup are first arranged in decreasing order of fitness value, so that higher-ranked positions hold more advantageous individuals. Next, the top 10% of sparrows in the subgroup are selected, representing those with the highest fitness values. These top-performing sparrows gradually replace the bottom 10% of sparrows in the subgroup, which are the individuals with the lowest fitness values.
After this replacement process, Logistic chaos is used to implement a one-dimensional perturbation on the updated population to further diversify the search. This process is described by equation (8), which introduces chaotic perturbations to the positions of the individuals, ensuring a more thorough exploration of the solution space and preventing premature convergence.
${X}_{i,d}=\mathrm{Lb}_{d}+\frac{\mathrm{Ub}_{d}-\mathrm{Lb}_{d}}{2}\left(1+{Y}_{i,d}\right)$(8)
where $\mathrm{Ub}_{d}$ and $\mathrm{Lb}_{d}$ represent the upper and lower bounds of the current optimal solution in the $d$th dimension, and the dimension $d$ to be perturbed is randomly selected. This strategy mainly perturbs a single dimension of the current optimal solution with a one-dimensional Logistic chaotic sequence.
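A minimal sketch of the single-dimension perturbation of equation (8), assuming NumPy arrays for positions and bounds (the function and variable names are illustrative, not from the paper):

```python
import numpy as np

def chaotic_perturb(x_best, lb, ub, y):
    """Perturb one randomly chosen dimension of the current best solution,
    following equation (8): X_d = Lb_d + (Ub_d - Lb_d)/2 * (1 + Y).
    y is one value of a Logistic chaotic sequence in (0, 1); all other
    dimensions of the best solution are preserved."""
    x_new = x_best.copy()
    d = np.random.randint(len(x_best))  # the perturbed dimension, chosen at random
    x_new[d] = lb[d] + (ub[d] - lb[d]) / 2.0 * (1.0 + y)
    return x_new
```

Because only one dimension changes, most of the best solution's good dimensional information is retained, as the text notes.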
The single-dimensional perturbation strategy based on the multi-population evolution mechanism and Logistic chaos is specifically designed for the single search space of service composition optimization problems. This strategy has a dual purpose. On one hand, it retains most of the excellent dimensional information of the current optimal solution during the service composition process, ensuring the algorithm's convergence on the service composition optimization problem. On the other hand, it utilizes the Logistic sequence and multi-population evolution mechanism to perturb a single dimension of the current optimal solution.
As the sparrow algorithm iterates, this self-adjustment strategy within the subgroups helps them escape the constraints of local optima. By continuously introducing chaotic perturbations, the strategy enhances the global optimization ability of the algorithm. This approach not only improves search accuracy but also ensures convergence stability across different subspaces. Through these mechanisms, the ISSA effectively balances exploration and exploitation, ultimately leading to more robust and reliable service composition solutions.
3.3 Discoverer's Dynamic Global Search Strategy
Due to the significant impact of inertia weight on the iterative update of sparrow search algorithms, this article introduces an adaptive weight factor for the position update of the original sparrow discoverer, expressed as follows:
$\omega =\left({f}_{\mathrm{best}}-{f}_{\mathrm{worst}}\right)\cdot \frac{\mathrm{cos}\left(\frac{\mathrm{\pi}t}{{T}_{\mathrm{max}}}+{\omega}_{\mathrm{max}}\right)\left({\omega}_{\mathrm{max}}-{\omega}_{\mathrm{min}}\right)}{2}+1$(9)
where ${f}_{\mathrm{best}}$ and ${f}_{\mathrm{worst}}$ are the best and worst fitness values, respectively, and ${\omega}_{\mathrm{max}}$ and ${\omega}_{\mathrm{min}}$ are the maximum and minimum values of the weight factor, respectively. Multiple experiments have shown that the best effect is achieved when ${\omega}_{\mathrm{max}}=0.85$ and ${\omega}_{\mathrm{min}}=0.25$.
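Equation (9) can be evaluated directly. The sketch below is a best-effort transcription of the formula as printed (the source's typesetting may have dropped operators, so the exact form is an assumption), with the recommended $\omega_{\mathrm{max}}=0.85$, $\omega_{\mathrm{min}}=0.25$:

```python
import math

def adaptive_weight(t, t_max, f_best, f_worst, w_max=0.85, w_min=0.25):
    """Adaptive weight factor of equation (9): a cosine-driven factor that
    starts above 1 and decays below 1 as iteration t approaches t_max
    (assuming f_best > f_worst)."""
    return (f_best - f_worst) * math.cos(math.pi * t / t_max + w_max) \
        * (w_max - w_min) / 2.0 + 1.0
```

With $f_{\mathrm{best}}>f_{\mathrm{worst}}$, the factor decreases over the run, matching the behavior described for Fig. 5: fast early movement toward the optimum, then a slowing decline for finer late-stage exploitation.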
It can be seen from Fig. 5 that in the early iteration stage, sparrow individuals approach the global optimum at a faster speed. In the middle and later stages of the iteration, the decline gradually slows, facilitating more accurate subsequent exploitation and ultimately improving both the convergence speed and the global space exploration capability.
Fig.5 Adaptive weight factor iteration diagram 
Then the weight factor is combined with the golden sine to improve the discoverer's position. The updated formula is as follows:
${X}_{i,j}^{t+1}=\{\begin{array}{ll}{X}_{i,j}^{t}\cdot \mathrm{sin}\,{r}_{1}+\omega \cdot {r}_{2}\cdot \mathrm{sin}\,{r}_{2}\cdot \left({\beta}_{1}{X}_{\mathrm{best}}^{t}-{\beta}_{2}{X}_{i,j}^{t}\right), & {R}_{2}<\mathrm{ST}\\ {X}_{i,j}^{t}+Q\cdot L, & {R}_{2}\ge \mathrm{ST}\end{array}$(10)
where ${r}_{1}$ is a number generated randomly within [0, 2$\mathrm{\pi}$], ${r}_{2}$ is a number generated randomly within [0, $\mathrm{\pi}$], and ${\beta}_{1},{\beta}_{2}$ are parameters constructed using the golden section coefficient, which cooperate with the weight factor to ensure a uniform overall search and further improve the convergence speed of the algorithm. Here ${\beta}_{1}=-\mathrm{\pi}+2\mathrm{\pi}\cdot (1-\tau )$, ${\beta}_{2}=-\mathrm{\pi}+2\mathrm{\pi}\cdot \tau$, and $\tau =(\sqrt{5}-1)/2$.
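A sketch of the golden-sine discoverer update of equation (10). The $\beta_1,\beta_2$ construction follows the standard golden sine algorithm with $\tau=(\sqrt{5}-1)/2$; absolute-value bars present in some golden sine formulations are omitted here to match the equation as printed, so treat the exact operator placement as an assumption:

```python
import numpy as np

TAU = (np.sqrt(5) - 1) / 2              # golden section coefficient
BETA1 = -np.pi + 2 * np.pi * (1 - TAU)  # beta_1
BETA2 = -np.pi + 2 * np.pi * TAU        # beta_2

def discoverer_update(x, x_best, omega, Q, L, ST=0.8, rng=np.random.default_rng()):
    """Golden-sine position update for a discoverer, per equation (10).
    x, x_best, L are same-shaped arrays; omega is the adaptive weight of
    equation (9); Q is a random step scale; ST is the safety threshold."""
    r1 = rng.uniform(0, 2 * np.pi)
    r2 = rng.uniform(0, np.pi)
    R2 = rng.random()                    # alarm value
    if R2 < ST:
        return x * np.sin(r1) + omega * r2 * np.sin(r2) * (BETA1 * x_best - BETA2 * x)
    return x + Q * L
```

When the alarm value stays below ST, the discoverer is pulled between its own position and the global best along golden-section-weighted directions; otherwise it takes a random step, as in the original SSA.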
3.4 Design of Optimal Position Variation Disturbance
The differential evolution strategy, an evolutionary algorithm proposed by Storn et al^{[17]}, is based on the genetic algorithm. It generates individuals with variations from the original population and then performs crossover and mutation operations to enrich population diversity. This paper adopts this strategy within the SSA. The specific process is as follows: First, two different sparrows are randomly selected from the entire population, and the difference between their positions is calculated. This scaled difference is then combined with the globally optimal individual to obtain a mutated new individual. The process introduces new genetic material into the population, helping to maintain diversity and avoid premature convergence.
By incorporating differential evolution into SSA, the algorithm benefits from enhanced exploration capabilities, ensuring a more thorough search of the solution space and improving the overall optimization performance. The mutation expression is as follows:
${X}_{\mathrm{new}}^{t+1}={X}_{\mathrm{best}}^{t}+P\cdot \left({X}_{r1}^{t}-{X}_{r2}^{t}\right)$(11)
where ${X}_{r1}^{t}$ and ${X}_{r2}^{t}$ represent two sparrows chosen at random, $P\in (0,6)$ represents the scaling parameter, and ${X}_{\mathrm{best}}^{t}$ indicates the position of the sparrow with the best fitness value in the current population.
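The mutation of equation (11) is a one-liner; the sketch below assumes NumPy arrays and an illustrative scaling parameter value:

```python
import numpy as np

def differential_mutation(x_best, x_r1, x_r2, P=0.5):
    """Differential mutation around the current best individual,
    equation (11): X_new = X_best + P * (X_r1 - X_r2).
    x_r1, x_r2 are two randomly chosen sparrows; P is the scaling parameter."""
    return x_best + P * (x_r1 - x_r2)
```

The scaled difference of two random population members injects direction information that is independent of the best individual itself, which is what keeps the population from collapsing onto a single point.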
In order to further enhance the optimization effect of SSA and prevent the algorithm from falling into the dilemma of local optimal solution due to a sharp decrease in population diversity during the search process, the Cauchy distribution strategy is used to perturb the position of sparrows:
${X}_{\mathrm{n}\mathrm{e}\mathrm{w}}^{t+\mathrm{1}}={X}_{i}^{t}+t\left(\mathrm{i}\mathrm{t}\mathrm{e}\mathrm{r}\right)\cdot {X}_{i}^{t}$(12)
where ${X}_{i}^{t}$ denotes the position of the ith sparrow individual, ${X}_{\mathrm{n}\mathrm{e}\mathrm{w}}^{t+\mathrm{1}}$ represents the position of the individual after mutation, and t (iter) represents the t distribution with the current iteration number iter as the degree of freedom.
Although the above mutation strategy can increase the chances of escaping local optima, it does not guarantee that a newly generated sparrow individual will be better than the original optimal individual. Therefore, we incorporate a greedy approach to ensure that individuals with higher fitness values are selected at each step. Specifically, after generating the new mutated individual through differential evolution, the fitness values of the new and original individuals are compared, and the individual with the larger fitness value is retained for the next iteration. This greedy strategy ensures that the population continually improves, enhancing the overall optimization process and ensuring convergence to a better solution. The specific selection and updating operation is as follows:
${X}_{\mathrm{best}}=\{\begin{array}{cc}{X}_{i,j}^{t+1},& f\left({X}_{i,j}^{t+1}\right)\ge f\left({X}_{\mathrm{best}}\right)\\ {X}_{\mathrm{best}},& f\left({X}_{i,j}^{t+1}\right)<f\left({X}_{\mathrm{best}}\right)\end{array}$(13)
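Equations (12) and (13) can be combined into a single perturb-then-select step. The sketch below assumes fitness is maximized (the paper keeps the individual with the larger fitness value) and uses NumPy's t-distribution sampler with the iteration count as degrees of freedom; the function name is illustrative:

```python
import numpy as np

def mutate_and_select(x, fitness, t_iter, rng=np.random.default_rng()):
    """Perturb a sparrow with a t-distribution whose degrees of freedom equal
    the current iteration count (equation (12)), then keep the fitter of the
    old and new positions -- the greedy selection of equation (13)."""
    x_new = x + rng.standard_t(df=t_iter) * x    # equation (12)
    return x_new if fitness(x_new) >= fitness(x) else x
```

With small `t_iter` the t-distribution has heavy, Cauchy-like tails (strong global jumps); as iterations grow it approaches a Gaussian (fine local steps), so the perturbation strength adapts over the run.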
3.5 Analysis of Composition Optimization of Cloud Manufacturing Services Based on ISSA
According to the above description, the algorithm flow of cloud manufacturing service composition optimization based on the ISSA is given in Algorithm 1.
Algorithm 1 ISSA framework for solving the cloud manufacturing service composition problem based on service quality

Input: population size (PopSize), spatial search dimension (Dim), maximum number of iterations (MaxIter), proportion of discoverers (PD), proportion of watchers (SD), alarm value (${R}_{2}$), and the objective function $\mathrm{max}\,\mathrm{QoS}=\mathrm{max}{\displaystyle \sum _{k=1}^{5}}{\omega}_{k}\times {Q}_{k}$
Output: the optimal sparrow position and its fitness value
1. while $t\le \mathrm{MaxIter}$ do
2.   Calculate the QoS values of the composite services in the population PopSize according to equation (2), sort the values, and save the optimal QoS value.
3.   Obtain the fitness values of the current sparrow population and sort them in decreasing order, then record the current best fitness and its position, as well as the current worst fitness and its position.
4.   Divide the population into subgroups and, following equation (8), perturb the replaced new subgroup using the Logistic chaotic sequence.
5.   ${R}_{2}=\mathrm{rand}(1)$
6.   for $i=1:\mathrm{PD}$ do
7.     The sparrow with the best fitness value becomes the discoverer through weight distribution, and its position is updated according to formula (10).
8.   end for
9.   for $i=(\mathrm{PD}+1):\mathrm{PopSize}$ do
10.    Use the remaining sparrows in the population as followers and update their positions according to formula (5).
11.   end for
12.   for $i=\mathrm{SD}:\mathrm{PopSize}$ do
13.     Update the position of the watcher according to formula (6).
14.   end for
15.   According to the current fitness value and the average fitness value, select the differential mutation strategy or the Cauchy distribution perturbation strategy to perturb the current optimal solution and generate new solutions.
16.   Update the fitness value through the greedy formula (13), and record the global optimal position and each sparrow's individual optimal position.
17. end while
18. return the optimal position and its fitness value to obtain the optimal QoS value
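The overall loop can be sketched as a minimal runnable skeleton. This is not the paper's full method: the discoverer/follower/watcher updates of formulas (5), (6), and (10) and the subgroup/perturbation machinery are replaced by simple placeholder moves, so it only illustrates the control flow of Algorithm 1 (maximizing a fitness function):

```python
import numpy as np

def issa(fitness, dim, popsize=30, max_iter=100, pd=0.2, lb=-10.0, ub=10.0, seed=0):
    """Skeleton of the ISSA main loop: sort by fitness, move discoverers,
    move followers toward the best, and track the global best greedily.
    The paper's specific update rules are simplified to placeholders."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (popsize, dim))
    fit = np.array([fitness(x) for x in X])
    best = X[fit.argmax()].copy()
    n_pd = int(pd * popsize)                     # number of discoverers
    for t in range(1, max_iter + 1):
        order = fit.argsort()[::-1]              # best first (maximization)
        X, fit = X[order], fit[order]
        for i in range(n_pd):                    # discoverers: local random walk
            X[i] = np.clip(X[i] + rng.normal(0, 1, dim), lb, ub)
        for i in range(n_pd, popsize):           # followers: drift toward the best
            X[i] = np.clip(X[i] + rng.random() * (X[0] - X[i]), lb, ub)
        fit = np.array([fitness(x) for x in X])
        if fit.max() > fitness(best):            # greedy global-best tracking
            best = X[fit.argmax()].copy()
    return best, fitness(best)
```

Running it on a toy objective such as `lambda x: -np.sum(x**2)` drives the population toward the origin, which is enough to check that the loop structure is sound.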
4 Simulation Results and Experimental Analysis
4.1 Performance Verification of the Algorithm on Benchmark Problems
In order to verify the practicability of the proposed method, it is applied to six classical standard benchmark functions, namely Sphere (${f}_{1}$), Schwefel 2.22 (${f}_{2}$), Schwefel 2.21 (${f}_{3}$), Griewank (${f}_{4}$), Alpine (${f}_{5}$), and Shekel (${f}_{6}$), covering both unimodal and multimodal functions; see Table 1. The term ${f}_{\mathrm{min}}$ refers to the theoretical minimum of the objective function, i.e., the best solution that the algorithm aims to approximate or achieve. Functions ${f}_{1}$–${f}_{3}$ are unimodal with a single global optimum, so the convergence speed of the algorithm is more important than the optimization result. Functions ${f}_{4}$–${f}_{6}$ are multimodal with many local optima, so the optimization result is more important than the convergence speed; these functions reveal the algorithm's ability to break free from local optima and find the global optimum.
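For reference, three of the benchmarks from Table 1 are sketched below in their standard textbook forms (the bounds and dimensions used in the experiments come from the table, not from this sketch):

```python
import numpy as np

def sphere(x):
    """f1, unimodal: sum of squares, global minimum 0 at the origin."""
    return np.sum(x ** 2)

def schwefel_2_22(x):
    """f2, unimodal: sum plus product of absolute values, minimum 0 at the origin."""
    return np.sum(np.abs(x)) + np.prod(np.abs(x))

def griewank(x):
    """f4, multimodal: many regularly spaced local minima, global minimum 0
    at the origin."""
    i = np.arange(1, x.size + 1)
    return np.sum(x ** 2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i))) + 1.0
```

Each function evaluates to 0 at the origin, which is the theoretical ${f}_{\mathrm{min}}$ the compared algorithms try to reach.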
On the benchmark functions, the proposed ISSA is compared with three metaheuristic methods: the Butterfly Optimization Algorithm (BOA), the Grey Wolf Optimization Algorithm (GWO), and the SSA. The parameters of GWO are the initial and final control parameters, ${a}_{\mathrm{initial}}=2$ and ${a}_{\mathrm{final}}=0$; for BOA, the switching probability is P=0.8, the sensory modality c=0.001, and a=0.1; for SSA, PD=0.2, SD=0.2, and ST=0.8. Parameters common to the four algorithms are set identically: the population size (PopSize) is 30, and the maximum number of iterations (MaxIter) is 500.
Test functions description
4.1.1 Analysis of search accuracy
In this paper, the ISSA and the comparison algorithms are independently run 30 times on each benchmark function. The optimal value, average value, standard deviation, and worst value are used as evaluation criteria for the performance of each algorithm. These metrics are chosen to comprehensively assess accuracy and stability: the optimal value is the best objective function value achieved, the worst value is the worst objective function value achieved, the average value is the mean over the runs, and the standard deviation indicates the variability and consistency of the results. The data generated by each algorithm are recorded in exponential notation, with the best results highlighted in bold, and the average time of each run is also recorded. The comparison of the results on the benchmark functions is shown in Table 2.
As can be seen from Table 2, the ISSA shows a better convergence rate than the other three algorithms on the first three unimodal test functions, with both its optimal and mean values equal to 0. The chaotic initialization of the population clearly plays a crucial role in this improved convergence rate, verifying the good stability of the improved algorithm. For the multimodal test functions, although the original and improved algorithms perform comparably on function ${f}_{4}$, the proposed algorithm still holds a clear advantage on function ${f}_{5}$. Finally, on the multimodal test function ${f}_{6}$, the proposed algorithm searches the entire solution space robustly: the differential mutation strategy and Cauchy perturbation added in the later stage help the algorithm overcome the local-extremum dilemma and obtain the global optimal solution. Although all four algorithms come extremely close to the theoretical values, further analysis of their means and standard deviations reveals that the overall stability of the ISSA is significantly better than that of the other algorithms.
Results of benchmark functions
4.1.2 Analysis of the rate of convergence
Considering that the convergence speed of each algorithm cannot be directly reflected by the average value and standard deviation alone, convergence curves are further drawn to better show the process by which each algorithm converges to the optimal solution.
For the different types of test functions, it can be seen from Fig. 6(a) to Fig. 6(c) that when the four algorithms converge to the same fitness, the ISSA requires the fewest iterations and has the fastest convergence rate, owing to the early multi-population mechanism and Logistic one-dimensional chaotic perturbation. The curves of GWO and BOA gradually flatten and finally stagnate, i.e., they fall into local optima, while the convergence curve of the ISSA shows a stepped decline overall, indicating that the later differential Cauchy perturbation helps the algorithm escape the local-extremum dilemma. As the process progresses, for these benchmark functions with local optima, the ISSA shows stronger global search ability than the other algorithms.
Fig.6 Convergence curves for benchmark functions 
To sum up, the results in this section show that the ISSA is highly competitive on both high-dimensional unimodal and multimodal functions. The ISSA retains the follower guidance mechanism and scouter warning mode unique to the SSA, while using the golden sine combined with the dynamic weight factor to improve the discoverer position update stage. Finally, differential Cauchy perturbation is used to enhance the optimization ability of the algorithm. This not only maintains population diversity but also improves the optimization efficiency and accuracy of the algorithm.
4.2 Performance Verification of the Algorithm on Composition Optimization Problem
To further demonstrate the practicality and rationality of the ISSA in cloud manufacturing service composition, the cost, time, reliability, satisfaction, and deliverable-quality values of each manufacturing cloud service are randomly generated within [0.8, 0.95]. Their weights in customer QoS preferences are set as ${\omega}_{\mathrm{time}}=0.3$, ${\omega}_{\mathrm{cost}}=0.2$, ${\omega}_{\mathrm{rel}}=0.2$, ${\omega}_{\mathrm{sta}}=0.15$, ${\omega}_{\mathrm{del}}=0.15$, respectively. The parameter settings for each algorithm remain the same as in Section 4.1, with the number of runs set to 200. Finally, a series of experiments verifies the effectiveness and efficiency of the proposed algorithm.
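The weighted QoS objective used in these experiments can be sketched directly from the stated weights (the dictionary keys and function name are illustrative):

```python
# Customer QoS preference weights from the experimental setup; they sum to 1.
WEIGHTS = {"time": 0.30, "cost": 0.20, "rel": 0.20, "sta": 0.15, "del": 0.15}

def qos_fitness(q):
    """Weighted QoS of one composite service: sum_k w_k * Q_k, where q maps
    each attribute to its normalized value, here drawn from [0.8, 0.95]."""
    return sum(WEIGHTS[k] * q[k] for k in WEIGHTS)
```

Because the weights sum to 1 and each attribute lies in [0.8, 0.95], the fitness of any composite service also lies in [0.8, 0.95], which makes the fitness values of different problem sizes directly comparable.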
4.2.1 Convergence analysis
This section sets the number of subtasks from 30 to 50, in increments of 10. The number of candidate manufacturing cloud services (MCS) for each subtask ranges from 150 to 300 in increments of 150. For example, T30-150 indicates that the number of subtasks (abstract services) is 30 and the number of candidate MCS for each subtask (candidate services) is 150.
It can be seen from Table 3 that the average fitness value obtained by the ISSA is superior to other comparison algorithms, and the optimal solution obtained by the ISSA is also better than other comparison algorithms. Obviously, as the number of subtasks and candidate services increases, the optimal solutions of all four algorithms have decreased, which may be related to the complexity of the solution space of the problem. Based on various comparisons, it can be concluded that the performance of the ISSA in this paper outperforms the other three comparison algorithms.
Figure 7 then plots the optimal fitness values obtained by the four algorithms in solving 50 manufacturing subtasks, along with the required numbers of iterations.
Fig.7 Optimal fitness values of algorithms 
Under the same conditions, the comparison algorithms obtain smaller optimal fitness values at a relatively higher cost in iterations, which is likely affected by the population positions. The ISSA obtains better optimal solutions within a comparable number of iterations: the multiple-population mechanism and Logistic chaotic perturbation strategy ensure the diversity of the early population and the balance between exploration and exploitation in the later stage. This indicates that the ISSA achieves more accurate optimization results than the other algorithms, because it covers a larger search range during evolution, enhancing its chances of jumping out of local optima. In summary, the ISSA is superior to the other three algorithms in overall search efficiency and solution quality.
Performance of algorithms for different problems
4.2.2 Time performance analysis
Finally, to further compare the average running time required by the four algorithms to solve the quality-of-service-based manufacturing cloud service composition model, this paper conducts experiments in two scenarios. In the first scenario, the number of candidate services is fixed at 200 while the number of manufacturing tasks gradually increases from 20 to 200. In the second scenario, the number of manufacturing tasks is fixed at 50 while the number of candidate services varies from 50 to 500. The resulting time trends are shown in Fig. 8.
Fig.8 Comparison of the computation time of four algorithms in different scenarios 
As can be seen from Fig. 8(a), when few manufacturing tasks are arranged, all four algorithms spend relatively little time; however, as the manufacturing tasks increase, the gap between them gradually widens. The running time of the GWO, in particular, shows an exponential growth trend. This is because increasing the number of subtasks raises the dimensionality of the target vector, which complicates the calculation of the QoS fitness function and brings additional time cost. Figure 8(b) shows that increasing the number of candidate services does not significantly affect the four algorithms, since it expands the solution space but does not affect optimization efficiency. From a global perspective, the solving time of the ISSA is significantly shorter than that of the other algorithms. This is because applying the Logistic single-dimensional perturbation strategy to sparrow individuals lacking diversity in each subpopulation further improves the convergence accuracy and optimization speed of the algorithm, while the differential mutation strategy and Cauchy distribution perturbation added later help the algorithm escape the local-extremum dilemma. The stronger the algorithm's global exploration ability, the better the results.
5 Conclusion
This article addresses the current challenges in cloud manufacturing service composition by improving the SSA to find the optimal composition solution. To counter the loss of population diversity and lack of global optimization ability in the SSA, we introduce a multi-population evolution mechanism and Logistic chaotic mapping to enhance population diversity in the early stages of the search. Additionally, a global capture strategy based on dynamic weight factors and the golden sine method is employed to stabilize the algorithm's exploration performance. In the later stages, differential Cauchy mutation perturbation is used to further enhance the algorithm's global optimization ability.
The superiority of the proposed algorithm is verified on benchmark function problems and service composition scenarios. Results show that the improved algorithm not only reduces computing time but also ensures convergence accuracy, demonstrating the correctness and effectiveness of the proposed model and algorithm in solving cloud manufacturing service composition optimization problems. Although the improved algorithm can solve the issues in cloud manufacturing service composition to a certain extent, it typically yields suboptimal solutions, requiring compromises among effectiveness, utility value, and computing time. In the future, exploring mature algorithms from other fields to solve the cloud manufacturing service composition problem could be a highly effective approach.
References
[1] Li B H, Chai X D, Hou B C, et al. Cloud manufacturing system 3.0—New intelligent manufacturing system in the era of "intelligence+"[J]. Computer Integrated Manufacturing Systems, 2019, 25(12): 2997-3012 (Ch).
[2] Li X B, Zhuang P J, Yin C. A metadata based manufacturing resource ontology modeling in cloud manufacturing systems[J]. Journal of Ambient Intelligence and Humanized Computing, 2019, 10(3): 1039-1047.
[3] Yao J, Xing B, Zeng J, et al. Overview of cloud manufacturing service portfolio research[J]. Computer Science, 2021, 48(7): 245-255 (Ch).
[4] Bouzary H, Chen F F. Service optimal selection and composition in cloud manufacturing: A comprehensive survey[J]. The International Journal of Advanced Manufacturing Technology, 2018, 97(1): 795-808.
[5] Zhou J J, Yao X F. A hybrid artificial bee colony algorithm for optimal selection of QoS-based cloud manufacturing service composition[J]. The International Journal of Advanced Manufacturing Technology, 2017, 88(9): 3371-3387.
[6] Jin H, Lv S P, Yang Z, et al. Eagle strategy using uniform mutation and modified whale optimization algorithm for QoS-aware cloud service composition[J]. Applied Soft Computing, 2022, 114: 108053.
[7] Jiang Y R, Tang L, Liu H L, et al. A variable-length encoding genetic algorithm for incremental service composition in uncertain environments for cloud manufacturing[J]. Applied Soft Computing, 2022, 123: 108902.
[8] Yang Y F, Yang B, Wang S L, et al. An improved grey wolf optimizer algorithm for energy-aware service composition in cloud manufacturing[J]. The International Journal of Advanced Manufacturing Technology, 2019, 105(7): 3079-3091.
[9] Liao S C, Sun P, Liu X C. Service combinatorial optimization based on improved krill swarm algorithm[J]. Computer Application, 2021, 41(12): 3652-3657 (Ch).
[10] Zeng J, Yao J, Gao M, et al. A service composition method using improved hybrid teaching learning optimization algorithm in cloud manufacturing[J]. Journal of Cloud Computing, 2022, 11(1): 66.
[11] Liao W L, Wei L, Wang Y. Manufacturing cloud service composition optimization based on modified polar bear algorithm[J]. Computer Application Research, 2022, 39(4): 1099-1104 (Ch).
[12] Xue J K, Shen B. A novel swarm intelligence optimization approach: Sparrow search algorithm[J]. Systems Science & Control Engineering, 2020, 8(1): 22-34.
[13] Tao F, Zhao D M, Hu Y F, et al. Correlation-aware resource service composition and optimal-selection in manufacturing grid[J]. European Journal of Operational Research, 2010, 201(1): 129-143.
[14] Zhou J J, Yao X F. Multi-population parallel self-adaptive differential artificial bee colony algorithm with application in large-scale service composition for cloud manufacturing[J]. Applied Soft Computing, 2017, 56(C): 379-397.
[15] Liu G Y, Shu C, Liang Z W, et al. A modified sparrow search algorithm with application in 3D route planning for UAV[J]. Sensors, 2021, 21(4): 1224.
[16] Zhu Y L, Yousefi N. Optimal parameter identification of PEMFC stacks using adaptive sparrow search algorithm[J]. International Journal of Hydrogen Energy, 2021, 46(14): 9541-9552.
[17] Storn R, Price K. Differential evolution—A simple and efficient heuristic for global optimization over continuous spaces[J]. Journal of Global Optimization, 1997, 11: 341-359.