FORECASTING MECHANICAL PROPERTIES OF STEEL STRUCTURES THROUGH DYNAMIC METAHEURISTIC OPTIMIZATION FOR ADAPTIVE MACHINE LEARNING

Machine learning (ML) presents a promising method for predicting mechanical properties in structural engineering, particularly within complex nonlinear structures under extreme conditions. Despite its potential, research has shown a disproportionate focus on concrete structures, leaving steel structures less explored. Furthermore, the prevalent combination of metaheuristic optimization (MO) and ML in existing studies is often subjective, pointing to a significant gap in identifying and leveraging more effective hybrid models. To bridge these gaps, this study introduces a novel system named the Multiple Metaheuristic Optimizers – Multiple Machine Learners (MMOMML) system, designed for predicting mechanical strength in steel structures. The MMOMML system amalgamates 17 MO algorithms with 15 ML techniques, generating 255 hybrid models, including numerous novel configurations not previously examined. With a user-friendly interface, MMOMML enables structural engineers to tackle inference challenges efficiently, regardless of their coding proficiency. This capability is convincingly demonstrated through two practical applications: steel beams' shear strength and steel cellular beams' elastic buckling. By offering a versatile and robust tool, the MMOMML system meets construction engineers' and researchers' practical and research needs, marking a significant advancement in the field.


Introduction
Structural engineering involves the analysis and design of load-bearing structures. However, current structural analysis and design approaches require a time-consuming calibration process to handle complex structural systems that exhibit highly nonlinear behavior under extreme actions. This process is excessively complicated and impractical for use in most projects. Machine learning (ML), the most successful branch of artificial intelligence (AI), is well suited to make structural engineering more predictable because of its ability to handle complex nonlinear structural systems under extreme actions (Salehi & Burgueño, 2018; Sun et al., 2021; Zhang et al., 2023). Thai (2022) reviewed 474 Scopus-indexed studies from 1989 to 2021 on ML applications in structural engineering, concluding that scholarly interest in these applications has been rising, especially over the last five years. Thai (2022) distinguished these 474 studies into five distinct areas of study: (1) Prediction of structural members, (2) Prediction of structural fire resistance, (3) Structural health monitoring (SHM) and damage detection, (4) Structural analysis and design, and (5) Prediction of concrete mechanical properties. The area focused on structural members includes studies designed to predict member strength (i.e., shear strength, flexural strength, axial strength, torsional strength, buckling strength, and bond strength) and member deformation (i.e., deflection, drift, and rotation). The area focused on mechanical properties includes studies designed to predict compressive strength, tensile strength, bending strength, and Young's modulus and to optimize concrete mix designs. The number of published studies on ML applications in structural engineering by topic is presented in Table 1.
As shown in Table 1, 373 of the 474 studies (78.7%) addressed problems related to concrete and reinforced concrete (RC) structural components. Although structural steel is the preferred material for high-rise building construction in many developed countries (Deng et al., 2020; Liu et al., 2018), only 84 studies (17.7%) addressed problems related to steel structures. This may be attributable to the complex, nonlinear nature of the inference problems involved in calculating mechanical properties in steel structures and to the time-consuming calibrations required for related structural analyses and design methods, which are overly complicated for practical implementation and generate low-accuracy results (Asif Bin Kabir et al., 2021; Truong et al., 2022). The absence of related research on steel structures highlights the critical need in the construction industry for ML applications that assist engineers in addressing prediction/estimation problems in steel structures.
Hybrid machine learning models combining ML techniques with metaheuristic optimization (MO) algorithms have demonstrated advantages in terms of increased prediction accuracy and reduced model dependence on user expertise (Cao et al., 2021, 2022). However, the MO algorithms and ML techniques used in these hybrid models have typically been selected subjectively by researchers. For example, Tran-Ngoc et al. (2019) proposed a cuckoo search (CS)-tuned artificial neural network (ANN) model for detecting damage in bridges and beam-like structures. Although the performance of the proposed model was superior to that of the single ML model used as the comparison, the decision to use CS and ANN rather than other techniques was not explained, leaving the possibility that other hybrid AI models may solve related problems even more effectively and efficiently.
Furthermore, the programs used in hybrid models have rarely been shared publicly, often requiring civil engineers, who typically have limited programming knowledge, to use unreliable code published on the internet. These drawbacks mean that the application of ML in steel structures remains mainly limited to research settings, leaving a gap between research and practical application. Civil engineers and construction managers require a user-friendly ML tool that may be used without coding knowledge to handle forecasting and prediction problems effectively and efficiently.
Therefore, this project was designed to create a novel application interface that integrates different advanced AI techniques, including ML and MO algorithms, to extract usable information hidden in the data and to infer new facts. This application interface, the "multiple metaheuristic optimizers – multiple machine learners" (MMOMML), is a user-friendly, assistive tool that allows users without coding knowledge to determine the AI model best suited to handle a particular dataset. Moreover, the proposed interface provides a reliable benchmark for verifying the reliability of new and existing inference models. The ability of MMOMML to resolve numerous prediction and forecasting problems in steel structures fulfills the expectations of both users and researchers.
The breakdown percentages of the various ML methods used in structural engineering from 1990 to 2021, shown in Figure 1, support the suitability of using these methods to resolve structural engineering problems. Thus, these methods were adopted as the core models in the proposed MMOMML system. Based on our survey of the most prominently used MO metaphors, presented in Figure 2 (Chou & Nguyen, 2020), integration of these algorithms was proposed to optimize ML technique performance and increase the automation level. Therefore, the proposed MMOMML system was designed to encompass 255 powerful and efficient hybrid AI models by integrating 17 MO algorithms and 15 ML techniques. The system was developed to automatically train models to produce the performance criteria values for each model and then determine the most appropriate value for the targeted structure problem. MMOMML may further enrich the options for improving model performance by suggesting novel hybrid models that have not previously been used. For example, the system may be used to create a hybrid of FBI and either XGB or LGBM, neither of which has been investigated in the structural engineering field. Furthermore, the MMOMML interface will be updated with emerging MO algorithms and AI techniques and will not be limited only to the proposed MO and AI techniques. Thus, promising hybrid models may be generated progressively to improve prediction accuracy in solving structural engineering problems.
The proposed MMOMML system is expected to resolve numerous prediction and forecasting problems in steel structures and fulfill the expectations of both users and researchers. This powerful interface will be user-friendly and help structural engineers efficiently resolve steel structure inference problems using on-hand data without requiring prior coding knowledge. The remainder of this paper is organized as follows: Section 2 provides a review of the existing literature on machine learning applications in the context of steel structures; Section 3 outlines the research methodology employed in this study, encompassing the model evaluation metrics utilized; Section 4 introduces the MMOMML framework and its implementation; Section 5 presents the experimental datasets obtained in this study, discusses the predictions, and compares model performances; and the final section presents concluding remarks and outlines the research contributions made.

Literature review
As previously noted, only 84 articles indexed in Scopus and published between 1990 and 2021 were designed to investigate the feasibility of applying ML to steel-structure-related engineering problems (Thai, 2022). These articles cover four areas of study: Prediction of structural members (31 articles; 36.9%), Steel structure analysis and design (21 articles; 25.0%), Prediction of fire resistance (6 articles; 7.1%), and SHM and damage detection (26 articles; 31.0%). As shown in Figure 3, fewer than four studies have addressed many categories within these four areas, providing insufficient samples to analyze the merits of applying ML to related problems.
The techniques used in the 84 ML studies on steel-structure-related engineering problems are outlined in Table 2. Backpropagation method-based ANN was the most frequently used technique to predict mechanical behavior in steel structures. In terms of other techniques considered, Zakir Sarothi et al. (2022) used ANN, SVM, DT, RF, kNN, linear regression, lasso regression, ridge regression, AB, XGB, and CatBoost to predict bearing capacity in double shear bolted connections, finding that RF delivered the best performance. Also, Ghiasi et al. (2016) conducted a comparative study on the performances of ANN, ANFIS, LMNN, LS-SVM, MARS, ELM, RF, and GP in classifying damage detected in steel structures, finding LS-SVM to be the best model with the lowest prediction error.
In light of the limited nature of current studies, the potential of other ML techniques to predict mechanical behavior in steel structures should be investigated to assess their potential advantages over currently dominant ANN techniques. Several scholars have proposed applying SVM, LS-SVM, and RF to problems in SHM and damage detection (Ghiasi et al., 2016; Zhou et al., 2012) and structural fire resistance prediction (Hasni et al., 2017). The authors of these studies confirmed the proposed models as superior to ANN in reducing prediction error and computational time. In addition to the above, ANFIS and RBFNN (variants of ANN), DT (Fu, 2020), and CNN (a deep learning technique) have been used in the literature to predict the mechanical behavior of steel structures.
However, most published studies have used conventional ML techniques that require the manual setting of hyperparameter values. Thus, a trial-and-error or grid search method is typically needed to fine-tune these values. However, these methods generally do not achieve optimal model performance because of the extensive range of potential continuous hyperparameter values. Several scholars who investigated using a metaheuristic algorithm instead of trial-and-error/grid search methods reported promising results (Tran-Ngoc et al., 2019). For example, Tran-Ngoc et al. (2019) used the cuckoo search algorithm to optimize the configuration of an ANN model, obtaining better results than those generated by the standard ANN model.
In addition, a literature review by Thai (2022) highlighted several concerns related to published studies in which ML was used to predict problems in steel structures. These concerns include: (1) low diversity among the ML techniques used, as most studies used and analyzed the performance of ANN techniques only; (2) few studies proposed advanced hybrid ML models designed to improve prediction accuracy related to steel structure behavior; and (3) failure to fully realize the capabilities of ML.

Single machine learning models
This section focuses on ML algorithms commonly applied in structural engineering, categorized into six groups based on their prevalent use in the literature concerning steel structures.

Regression analysis (RA)
Regression analysis (RA) is a predictive modeling approach rooted in statistics, initially designed to explore the relationship between independent (predictor) and dependent (target) variables. Over time, RA has been integrated into machine learning (ML), operating within the framework of supervised learning algorithms. Its primary goal is to predict output values based on corresponding input variable values. In the context of ML, regression models differ based on three main factors: (i) the number of incorporated variables, (ii) the type of variables included, and (iii) the geometric configuration of the regression line. Commonly used RA models in structural engineering encompass linear regression, multivariate regression, logistic regression, and Multivariate Adaptive Regression Splines (MARS) (Cheng & Cao, 2016).
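As a minimal illustration of the supervised-regression setup described above, the sketch below fits an ordinary least-squares line via NumPy; the function names and toy data are illustrative, not part of the MMOMML system.

```python
import numpy as np

# Illustrative sketch: ordinary least-squares regression, the simplest RA model.
def fit_linear_regression(X, y):
    # Append a column of ones so the model learns an intercept term.
    Xb = np.column_stack([np.ones(len(X)), X])
    # Solve the least-squares problem in a numerically stable way.
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return w  # w[0] is the intercept, w[1:] are the slopes

def predict(w, X):
    Xb = np.column_stack([np.ones(len(X)), X])
    return Xb @ w

# Toy data generated from y = 2x + 1 exactly, so the fit should recover it.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])
w = fit_linear_regression(X, y)
```

The same fit/predict split carries over to every ML model in this section, only with more flexible function classes.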

Artificial neural networks (ANNs)
ANN is a versatile machine learning algorithm widely used in structural engineering. It mimics biological neurons and consists of layers, including an input layer for data reception and an output layer for result prediction. The Backpropagation Neural Network (BPNN) is a common variant that adjusts weights during training to minimize errors. The Radial Basis Function Neural Network (RBFNN) uses the RBF as an activation function, offering fast training with a single hidden layer. The Adaptive Neuro-Fuzzy Inference System (ANFIS) combines fuzzy logic and neural networks to minimize errors between input data and predictions, with an architecture featuring fuzzy, product, normalized, de-fuzzy, and overall output layers.

Support vector machine (SVM)
The SVM algorithm has gained significant prominence owing to its efficacy and simplicity, rendering it among the most influential and widely employed methodologies. The fundamental concept underpinning the SVM algorithm entails an initial differentiation of data feature groups, succeeded by the pursuit of an optimal separating hyperplane endowed with a maximal margin. To address nonlinearly separable data, kernel functions such as the Radial Basis Function (RBF) and the sigmoid function are harnessed, facilitating the transformation of the original data into a novel space wherein linear separation becomes feasible. LSSVR is an SVM algorithm and extension of support vector regression (SVR) that incorporates many advanced features, including expedited computation and heightened generalizability (Suykens et al., 2002).
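The kernel idea mentioned above can be made concrete with the RBF kernel, k(x, z) = exp(−γ‖x − z‖²); the following sketch (function name and toy points are illustrative) computes the kernel matrix for a small set of points.

```python
import numpy as np

# Minimal sketch of the RBF kernel: k(x, z) = exp(-gamma * ||x - z||^2).
def rbf_kernel(X, Z, gamma=1.0):
    # Squared Euclidean distances between every row of X and every row of Z.
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * sq_dists)

# Three toy points in 2-D; the kernel matrix is symmetric with unit diagonal.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
K = rbf_kernel(X, X, gamma=0.5)
```

SVM and LSSVR never work with the transformed features directly; the kernel matrix K is all they need, which is what makes the nonlinear mapping computationally affordable.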

Decision tree (DT)
Alternatively referred to as the Classification and Regression Tree (CART), the DT constitutes a tree-structured paradigm employed to elucidate and portray the intricacies of decision-oriented procedures. The prevalence of DT has surged notably, attributed to its intrinsic simplicity and adeptness in accommodating diverse data types encompassing numerical and categorical attributes. The compositional makeup of this model comprises four constituent elements: a foundational root node, an array of bifurcating branches, intermediary decision nodes, and culminating leaf nodes, commonly known as terminal nodes.

Random forest (RF)
RF employs an ensemble learning strategy utilizing Decision Trees (DTs) as weak learners. Multiple DTs are cultivated within a forest framework through bagging with parallel training. The stochastic feature selection process, earning the name "Random Forest", involves creating a multitude of individual DTs. The outcome aggregation consists of a consensus mechanism, with classification using a majority vote and regression employing an averaging procedure. This approach enhances predictive models in various applications, offering robust and accurate results.
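The bagging-and-averaging logic described above can be sketched with depth-1 regression stumps standing in for full decision trees; all names and toy data here are illustrative, not from the MMOMML system.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative weak learner: a depth-1 regression stump that splits on the
# median of the first feature and predicts the mean of each side.
def fit_stump(X, y):
    t = np.median(X[:, 0])
    left, right = y[X[:, 0] <= t], y[X[:, 0] > t]
    return (t,
            left.mean() if len(left) else y.mean(),
            right.mean() if len(right) else y.mean())

def predict_stump(stump, X):
    t, lo, hi = stump
    return np.where(X[:, 0] <= t, lo, hi)

def bagged_predict(X_train, y_train, X_test, n_trees=25):
    preds = []
    for _ in range(n_trees):
        # Bootstrap sample: draw n rows with replacement (the "bagging" step).
        idx = rng.integers(0, len(X_train), len(X_train))
        preds.append(predict_stump(fit_stump(X_train[idx], y_train[idx]), X_test))
    # Regression aggregation: average the weak learners' outputs.
    return np.mean(preds, axis=0)

X = np.linspace(0, 1, 40).reshape(-1, 1)
y = (X[:, 0] > 0.5).astype(float)  # a simple step function to approximate
y_hat = bagged_predict(X, y, X)
```

A real RF also subsamples features at each split; the stump keeps only the bootstrap-and-average mechanism so the ensemble logic stays visible.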

Boosting algorithm (BA)
Boosting Algorithms (BA) enhance predictive models in structural engineering by combining multiple models for improved effectiveness. Key strategies include AdaBoost, employing Decision Tree (DT) and Random Forest (RF) algorithms; the Gradient Boosting Machine (GBM), which enhances AdaBoost with DT weak learners and minimizes the loss function using gradient descent; XGBoost, optimizing GBM with techniques like parallel processing for efficiency; LightGBM, prioritizing computational efficiency with a unique leaf-wise tree expansion strategy; and LogitBoost (LB), a binary classification-focused variant using logistic regression as its base classifier. Collectively, these techniques contribute to swift and precise model predictions in structural engineering applications.

Metaheuristic optimization algorithms
Optimization problems aim to find optimal solutions within a solution set that maximize (or minimize) an objective function while adhering to constraints. Metaheuristics, versatile heuristic algorithms, address large-scale problems effectively and have gained popularity due to their efficiency. They excel in solving engineering problems by working efficiently with simple concepts, not requiring gradient information, overcoming local optima issues, and being applicable across various disciplines (Gandomi & Alavi, 2012; Mirjalili & Lewis, 2016; Wang, 2018).

Single solution-based metaheuristics (S-MOs)
S-MOs navigate problem spaces iteratively, generating candidate solutions from the current solution. In the generation phase, a set of candidates (C(s)) is created through local transformations. In the replacement phase, a new solution (s') from C(s) is chosen to replace the current solution. This process repeats until the stopping criterion is met. Noteworthy examples of S-MOs, also termed local search algorithms, include local search, simulated annealing (SA), and tabu search (TS). While these algorithms once held prominence, recent advancements have diminished their usage in contemporary research.
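The generate/replace loop described above can be sketched as a minimal simulated-annealing routine; the parameter values (step size, cooling rate) are illustrative defaults, not taken from the paper.

```python
import math
import random

random.seed(7)

# Minimal simulated-annealing sketch of the S-MO loop: generate a neighbour
# of the current solution, accept it if better, or with a temperature-
# dependent probability if worse.
def simulated_annealing(f, x0, steps=2000, t0=1.0, cooling=0.995):
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        cand = x + random.uniform(-0.5, 0.5)   # local transformation (generation phase)
        fc = f(cand)
        # Replacement phase: accept improvements always, and worse moves
        # with probability exp(-delta / t), which shrinks as t cools.
        if fc < fx or random.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
        if fx < fbest:
            best, fbest = x, fx
        t *= cooling                            # cool the temperature
    return best, fbest

# Minimise a 1-D quadratic whose optimum is at x = 3.
best_x, best_f = simulated_annealing(lambda x: (x - 3.0) ** 2, x0=-5.0)
```

The occasional acceptance of worse moves at high temperature is what lets SA escape local optima that a pure hill climber would be trapped in.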

Population-based metaheuristics
Population-based metaheuristic optimizers (P-MOs) iteratively enhance populations of solutions. These optimizers commence with an initial set (see Figure 4), systematically generating new populations and replacing existing ones until a predefined stopping criterion is met. Notable examples of P-MOs include evolutionary algorithms, ant colony optimization, scatter search, differential evolution, particle swarm optimization, bee colony, and artificial immune systems. The efficacy of P-MOs lies in their ability to balance exploration and exploitation strategies. While early P-MOs such as the genetic algorithm (GA) and particle swarm optimization (PSO) primarily focus on exploitation, which may result in confinement to local solutions, contemporary P-MOs like artificial bee colony (ABC), forensic-based investigation (FBI), and symbiotic organisms search (SOS) strategically incorporate both exploration and exploitation. This nuanced approach enhances the probability of discovering globally optimal solutions.
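A compact PSO sketch illustrates the exploitation mechanism discussed above: particles are pulled toward their personal best and the swarm's global best. The coefficients (w, c1, c2) are common textbook defaults, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative particle swarm optimisation sketch (a P-MO).
def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    pos = rng.uniform(-5, 5, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest, pbest_val = pos.copy(), np.array([f(p) for p in pos])
    g = pbest[pbest_val.argmin()].copy()          # global best position
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Velocity update: inertia + pull toward personal and global bests.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = pos + vel
        vals = np.array([f(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, f(g)

# Minimise the 2-D sphere function; the optimum is the origin.
g, gval = pso(lambda x: float((x ** 2).sum()), dim=2)
```

Newer P-MOs such as ABC, FBI, and SOS add explicit exploration phases on top of this attract-to-best scheme to reduce the risk of premature convergence.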

Hybrid machine learning and MO algorithms
Although the advantages of using ML models to address prediction problems are well established, selecting suitable ML models and properly configuring them is daunting, even for users with rich experience. Most related algorithms incorporate hyperparameters (HPs), the values of which directly influence the bias and, thus, the predictive performance of the induced models. Despite the increase in the number and popularity of ML tools, users still struggle to determine the best HP settings. This is a complex task and may lead practitioners to choose an inferior algorithm over a superior one, as users typically adjust HP values using a subjective process of trial and error.
Ideally, the HP values should be defined separately for each problem (Hamdia et al., 2021) using an optimization process that works to find the (near) best settings. Several tuning techniques have been used for this purpose, of which grid search (GS) and random search (RS) are the simplest and most common (Yang & Shami, 2020). GS is better suited to low-dimensional problems with few HPs to set; for more complex scenarios, the large hyperparameter space prevents GS from exploring the more promising regions. Conversely, RS can explore all possible solutions in the hyperspace but cannot perform informed searches, which may lead to high computational costs.
Sequential model-based optimization (SMBO) (Hutter et al., 2011) is a more recently developed technique that has gained attention due to its probabilistic nature. SMBO replaces the target function (the ML algorithm) with a surrogate model that offers faster computational speeds. However, this technique has many HPs of its own and only eliminates the shortcoming of requiring the function targeted for optimization to be evaluated iteratively. Metaheuristics have also been used to fine-tune the most suitable HP values within an optimization loop, as shown in Figure 5. This method treats the number of pre-defined HPs as the dimension of the optimization problem, with the HPs used as variables for the accuracy objective function. Despite their high computation cost, MOs are widely accepted as a reasonable alternative to GS and RS because they obviate tedious trial-and-error efforts and ensure strong performance from ML models.
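The MO-in-the-loop tuning idea of Figure 5 can be sketched as follows, with a simple stochastic hill climber standing in for the metaheuristic and closed-form ridge regression standing in for the ML model; all names, bounds, and data here are illustrative, not from the MMOMML system.

```python
import numpy as np

rng = np.random.default_rng(1)

# The "ML model": closed-form ridge regression fitted on the training split,
# scored by mean squared error on the validation split.
def ridge_valid_error(lam, X_tr, y_tr, X_va, y_va):
    d = X_tr.shape[1]
    w = np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(d), X_tr.T @ y_tr)
    return float(((X_va @ w - y_va) ** 2).mean())

# The "metaheuristic": a stochastic hill climber over one hyperparameter.
def tune_hyperparameter(objective, low, high, iters=100):
    lam = rng.uniform(low, high)
    best_lam, best_err = lam, objective(lam)
    for _ in range(iters):
        # Propose a neighbour of the best solution, clipped to the bounds.
        cand = np.clip(best_lam + rng.normal(0, (high - low) / 10), low, high)
        err = objective(cand)
        if err < best_err:          # keep better solutions, drop worse ones
            best_lam, best_err = cand, err
    return best_lam, best_err

# Toy problem: noisy linear data split into training and validation parts.
X = rng.normal(size=(80, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=80)
lam, err = tune_hyperparameter(
    lambda l: ridge_valid_error(l, X[:60], y[:60], X[60:], y[60:]),
    low=1e-4, high=10.0)
```

The MMOMML system applies the same loop with genuine MO algorithms (GA, PSO, FBI, etc.) and multi-dimensional hyperparameter vectors in place of the single scalar tuned here.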

Training the hybrid models
Data instances must be collected and processed to feed the training process. The data process first normalizes all attributes into a standard range of [0, 1], ensuring the constructed model avoids the negative impact of attributes with large values by making all attributes equally important in the initial stage. The normalized dataset is then divided into learning and validation data, which are used for the machine learners' learning and validation processes. To compare model generalization, test data are partitioned to verify the difference between actual values and the predicted values generated by the optimal trained model.
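A minimal sketch of this preparation step, assuming simple min-max scaling and a sequential learning/validation/test split (the helper names are ours, not the system's):

```python
import numpy as np

# Min-max normalisation of every attribute into [0, 1].
def min_max_normalize(X):
    lo, hi = X.min(axis=0), X.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)   # guard against constant columns
    return (X - lo) / span

# Split into learning, validation, and (remaining) test partitions.
def split(X, y, n_learn, n_valid):
    return ((X[:n_learn], y[:n_learn]),
            (X[n_learn:n_learn + n_valid], y[n_learn:n_learn + n_valid]),
            (X[n_learn + n_valid:], y[n_learn + n_valid:]))

# Toy attributes with very different scales, equalised by normalisation.
X = np.array([[10.0, 200.0], [20.0, 400.0], [15.0, 300.0], [25.0, 100.0]])
y = np.array([1.0, 2.0, 1.5, 2.5])
Xn = min_max_normalize(X)
learn, valid, test = split(Xn, y, n_learn=2, n_valid=1)
```

In practice the rows would be shuffled before splitting (the interface's "Shuffle" option) so that the three partitions are statistically comparable.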
The optimization process starts with initializing the tuning parameter values, which differ for each ML technique. For example, the least squares support vector machine uses sigma and gamma, while the hyperparameters of the radial basis function neural network consist of the number of hidden neurons and the width of the Gaussian function. It should be noted that the initial boundaries of the tuning parameters may be extended as the optimal values are approached. Information about the optimized hyperparameters and their respective ranges for all machine learning models implemented in the developed MMOMML system is provided in the Appendix (Table A1).
Each set of hyperparameter values combined with the learning data forms a prediction model that may then be validated using the validation data. Hence, each iteration of the optimization process generates many potential prediction models, each scored by an accuracy objective function that combines both error components, e.g., f = (e_train + e_valid)/2, where e_train and e_valid indicate the average training error and validation error, respectively.
The metaheuristic algorithm uses the values of the objective function to recognize proper solutions (sets of hyperparameter values). Based on the operation procedure of each MO used in the optimization, solutions are evolved to obtain better values at the next iteration. Generally, a better solution yielding a smaller objective function value will be retained for further calculation, while worse solutions are abandoned. This search loop continues until the termination criterion is met. Specifically, the search loop concludes when either the maximum iteration limit is reached or there is no improvement in the accuracy objective value for a set number of consecutive iterations. The optimal set of tuning parameter values found once the optimization process stops is saved to produce the outcome for new instances in the future. These optimal values may then be used with the test data to compare model performance. The operational procedure used to develop the hybrid models in the interface is presented in Figure 6.
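The two termination rules above (an iteration limit, or a run of non-improving iterations) can be sketched as a small early-stopping check over a sequence of objective values; the search itself is stubbed out, and all names are illustrative.

```python
# Stop when the iteration limit is reached, or after `patience` consecutive
# iterations without improvement in the best objective value.
def search_with_early_stop(objective_values, max_iter=100, patience=5):
    best = float("inf")
    stale = 0
    for it, val in enumerate(objective_values[:max_iter], start=1):
        if val < best:
            best, stale = val, 0   # improvement: reset the stale counter
        else:
            stale += 1             # no improvement this iteration
        if stale >= patience:
            return best, it        # early termination
    return best, min(len(objective_values), max_iter)

# A run that improves for four iterations, then stalls.
history = [5.0, 4.0, 3.5, 3.4, 3.4, 3.4, 3.4, 3.4, 3.4, 3.4]
best, stopped_at = search_with_early_stop(history, patience=5)
```

With patience = 5, the loop stops at iteration 9, five iterations after the last improvement at iteration 4, rather than running out the full budget.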

The MMOMML interface
The structure of the proposed MMOMML system, presented in Figure 7, includes four main phases: Data input, Modelling, Model saving, and Deployment. The MMOMML interface was designed to resolve numerous prediction and forecasting problems related to steel structures and to meet the needs and expectations of both industry users and researchers. This user-friendly, powerful interface can assist structural engineers in efficiently solving their steel structure inference problems using on-hand data and without prior coding knowledge. The steps for using the interface to train a hybrid ML model are shown in Figure 8.
■ Step 1. When using the interface for the first time, the user selects DATA PROCESS to handle and save the data in a suitable format that accelerates the training and optimization processes. After clicking the DATA PROCESS button, a new "Data Process" window will display the data processing options listed in the bullet points below. Selecting the "Load Data" button allows the user to browse the datasheet, which should be saved in .xls, .xlsx, or .csv format.
• Shuffle → randomly changes the sequence of data points.
• Export CSV → allows users to record a new sequence of data points while the shuffle option is active.
• Standardize → normalizes the original data values to a standard range of [0, 1].
• k cross-fold → divides the entire dataset into k exclusive folds (default k = 10).
■ Step 2. The user selects an MO algorithm from the list for optimization. The user may set desired values for the general parameters, including the number of optimization iterations and the population size, and display the optimization results for the specified steps. If the selected MO has additional tuning parameters, these will appear with the default values suggested by the original authors of the MO but may also be adjusted by the user if desired. For example, additional tuning parameters associated with GA include mutation rate, crossover rate, and generation gap.

■ Step 3. The user designates the type of inference problem by selecting classification or regression.
■ Step 4. The MACHINE LEARNERS button is activated, allowing the user to select the desired machine learner from a list. Other setting options are continuously activated to allow the user to:
• Perform test folds for k-fold cross-validation runs or randomly set portions of train/test data;
• Set the lower bounds and upper bounds of the hyperparameters.
■ Step 5. The user clicks on the OPTIMIZE MODEL button to start the optimization process, which returns the outcome. Upon completion of model performance optimization, the results are displayed in two tables and one figure, as follows:
• The first table displays the evaluation indices for the selected regression (or classification) task.
• The second table presents the optimal value set of the hyperparameters provided by the selected MO.
• The figure exhibits the convergence curve of the MO optimization process and the computational time.
■ Step 6. After completion of the optimization process, the trained model may be saved for later deployment. To apply a saved model to new data, the user should:
• Select the inference type (regression or classification).
• Browse to select a saved model.
• Browse to load new data in Excel format.
• Click on the RUN button to execute the prediction task and click on the EXPORT button to save the outcome to a .xlsx file.

Case data and experimental results
Two steel structure datasets were utilized in this research to validate the efficiency and reliability of the proposed MMOMML system: one on shear strength in steel beams with flat webs and one on elastic buckling in steel cellular beams.

Data processing
The shear strength of steel beams is a crucial parameter influencing the susceptibility of webs to various failure modes, making this variable a pivotal factor in structural design and safety. Experimental studies on the ultimate shear resistance of these girders have revealed that they exhibit typical diagonal shear buckling of the web and develop plastic hinges in the flanges at failure loads. Bending theory may be employed to analyze how internal forces carried by the web and flanges in a plate girder respond to minor shear loads (see Figure 9). The ultimate shear resistance of steel plate girders has been investigated extensively using experimental and theoretical methods. Despite the extensive research conducted in this area over the past three decades, current methods remain unable to accurately predict the ultimate shear resistance of plate girders (Elamary et al., 2023), underscoring the need for more reliable and precise procedures to enhance structural analysis and design. In this study, experimental data on the shear strength of steel beams with flat webs were collected from various articles in the literature. These data were then classified, filtered, and organized based on beam characteristics and other effective parameters. The systematically processed data generated a curated dataset of 90 experimental test results derived from 19 published papers (Adorisio, 1982; Basler et al., 1960; Carskaddan, 1968; Cooper et al., 1964; Evans et al., 1977, 1979; Evans, 1984, 1986; Evans & Tang, 1983; Kamtekar et al., 1972, 1974; Konishi, 1965; Lee & Yoo, 1998; Roberts & Shahabian, 2000; Rockey & Skaloud, 1972; Rockey et al., 1981; Sakai et al., 1966; Skaloud, 1971; Tang & Evans, 1984), which was used to validate the proposed MMOMML system. The complete dataset is provided in the Appendix (Table A2). A critical overview of the key parameters relevant to the experimental shear load at failure (V_exp) of steel plate girders is provided in Table 3. This comprehensive dataset facilitates an in-depth understanding of the behavior and performance of steel plate girders under various loading conditions.

Numerical results
Accurate predictive modeling requires that model performance be comprehensively evaluated. The ten-fold cross-validation technique used in this study is a widely cited, robust approach to assessing the predictive capabilities of machine learning models. Ten-fold cross-validation involves partitioning a dataset into ten distinct subsets, or folds. In each iteration, nine folds are utilized for model training, while the remaining fold is used for validation. This process is iteratively repeated ten times, ensuring that each fold serves as a validation set precisely once. Ten-fold cross-validation is a practical approach to mitigating random-selection bias in model evaluations. The outcomes of these iterations are aggregated to provide a holistic understanding of the generalizability of a model to other diverse data subsets.
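The fold construction described above can be sketched in a few lines; the round-robin index assignment is an illustrative choice (the actual system may partition differently):

```python
# Build k-fold train/validation index pairs (k = 10 by default): each row
# index appears in exactly one validation fold, matching the description above.
def k_fold_indices(n, k=10):
    folds = [list(range(i, n, k)) for i in range(k)]   # round-robin assignment
    splits = []
    for i in range(k):
        valid = folds[i]
        # Training indices: every fold except the i-th.
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        splits.append((train, valid))
    return splits

# Ten folds over a 50-row dataset: each split trains on 45 rows, validates on 5.
splits = k_fold_indices(n=50, k=10)
```

Each of the ten (train, valid) pairs would then be fed to the model in turn, and the per-fold metric values averaged into the reported score.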
In this study, a suite of three evaluation metrics, comprising Mean Squared Error (MSE), Mean Absolute Percentage Error (MAPE), and Coefficient of Determination (R²), was used to comprehensively assess model performance in terms of predictive accuracy, error distribution, and explanatory power:
1. Mean Squared Error (MSE): MSE calculates the average of the squared differences between predicted and actual values. This metric provides a higher-level view of error distribution, facilitating a broader-scale model accuracy evaluation.
2. Mean Absolute Percentage Error (MAPE): MAPE computes the average percentage difference between predicted and actual values. This metric is valuable in scenarios where understanding the impact of relative error is crucial, allowing for an intuitive interpretation of predictive accuracy.
3. Coefficient of Determination (R²): R² assesses the proportion of variance in the dependent variable that is explainable by the independent variables in the model. This metric provides insights into the model's explanatory power and its ability to capture the underlying relationships within the data.
The mathematical formulas for these metrics are as follows:

MSE = (1/n) Σᵢ (yᵢ − ŷᵢ)²;
MAPE = (100%/n) Σᵢ |(yᵢ − ŷᵢ)/yᵢ|;
R² = 1 − Σᵢ (yᵢ − ŷᵢ)² / Σᵢ (yᵢ − ȳ)²,

where yᵢ, ŷᵢ, and ȳ denote the actual value, the predicted value, and the mean of the actual values, respectively, and n is the number of samples.
Note: The lever arm of a steel beam is the perpendicular distance from the line of action of a force applied to the beam to the point about which the beam is expected to rotate.
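Written as code, the three metrics are straightforward; the toy values below are illustrative:

```python
import numpy as np

# Mean Squared Error: average of squared residuals.
def mse(y, y_hat):
    return float(np.mean((y - y_hat) ** 2))

# Mean Absolute Percentage Error, reported in percent.
def mape(y, y_hat):
    return float(100.0 * np.mean(np.abs((y - y_hat) / y)))

# Coefficient of determination: 1 minus residual over total sum of squares.
def r2(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return float(1.0 - ss_res / ss_tot)

y = np.array([100.0, 200.0, 300.0])
y_hat = np.array([110.0, 190.0, 300.0])
```

For these toy values, MAPE is 5% and R² is 0.99, illustrating how the two metrics express relative error and explanatory power on the same predictions.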
Combining robust ten-fold cross-validation with this multidimensional, three-metric evaluation provided a comprehensive understanding of the predictive models' performance. In line with the objectives of this study, integrating these methodologies facilitates informed decision-making on model selection and refinement as well as real-world applicability.
Many of the hybrid models in the MMOMML system generated suboptimal outcomes, as indicated by unsatisfactory evaluation metric (MSE, MAPE, and R²) values. These results demonstrate the complexity of accurately predicting outcomes within the given dataset, reflecting the inherent challenges of modeling the behavior of steel structures. Considering MAPE as the primary assessment criterion, values ≤ 20% were presumed to indicate good prediction accuracy, and values > 20% were presumed to indicate normal to bad prediction accuracy (Kumar & Kaur, 2016). The performance of the models achieving MAPE values below 20% (good prediction accuracy) is shown in Table 4, offering insights into the models with comparatively favorable performance.
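The 20% screening rule is straightforward to apply programmatically. In the sketch below, the first three MAPE values are those reported in this study for FBI_LSSVR, SOS_XGB, and GWO_XGB; the remaining entries are hypothetical placeholders standing in for models that fall outside Table 4.

```python
# Screen hybrid models against the 20% MAPE threshold of
# Kumar & Kaur (2016): <= 20% -> good accuracy, > 20% -> normal to bad.
results = {
    "FBI_LSSVR": 14.13,   # reported in this study
    "SOS_XGB": 14.23,     # reported in this study
    "GWO_XGB": 15.60,     # reported in this study
    "GA_LSSVR": 27.40,    # hypothetical placeholder (> 20%)
    "FBI_MARS": 31.20,    # hypothetical placeholder (> 20%)
}
good = {m: v for m, v in results.items() if v <= 20.0}
print(sorted(good, key=good.get))   # ['FBI_LSSVR', 'SOS_XGB', 'GWO_XGB']
```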
As shown in Table 4, XGB, LSSVR, and RBFNN proved productive and well suited to the given dataset. FBI_LSSVR emerged as the most promising candidate by outperforming its counterparts across all evaluation metrics, underscoring its potential for enhanced predictive accuracy. Specifically, FBI_LSSVR exhibited the lowest MAPE value (14.13%) and the highest R² value (0.970, close to 1). SOS_XGB gave the second-best performance, with a MAPE of 14.23%. Some of the other models exhibited a trade-off between accuracy and correlation; for example, GWO_XGB achieved a low MAPE (15.6%) but a relatively low R² (0.875).
Two important insights may be drawn from Table 4. First, ML models must be carefully selected to match the specific characteristics of each dataset. If XGB, LSSVR, or RBFNN is chosen and hybridized with an MO algorithm for parameter optimization, the resulting hybrid model can be expected to yield predictions with relatively high accuracy. Conversely, selecting an ML model beyond those in Table 4 (e.g., MARS) can be expected to yield suboptimal predictive performance (with a MAPE exceeding 20%) even after hybridization with an MO algorithm. Second, MO algorithms must be carefully selected for parameter optimization based on the ML context. When paired with ML algorithms compatible with the provided dataset (XGB, LSSVR, or RBFNN), the appropriate MO algorithms achieve MAPE values from 14.13% to 18.83%. However, using a metaheuristic algorithm not listed in Table 4 (e.g., GA in GA_LSSVR) can be expected to result in suboptimal model performance, with MAPE values > 20%.
Notably, FBI is the only candidate algorithm capable of seamless hybridization with the XGB, LSSVR, and RBFNN models, optimizing parameter values to enhance predictive accuracy. By contrast, the other MO algorithms in the proposed system exhibit limitations in either their hybridization potential with these models or their ability to identify optimal parameter values, resulting in compromised accuracy, with the MAPE values of the resulting hybrid models surpassing 20%. These findings underscore the superior efficacy of the FBI algorithm as a versatile metaheuristic optimization technique adept at finely adjusting parameters across a diverse array of ML models.
For enhanced visual analysis, the contrast between the actual and predicted outputs of the two models is plotted in Figure 10, with FBI_LSSVR shown to be more accurate overall than GA_LSSVR. For FBI_LSSVR, the visualization reveals a remarkable alignment, and even coincidence, between the predicted and actual data points. Conversely, for GA_LSSVR, the disparity between the predicted and actual results is more prominent, indicating the model's relative inaccuracy.

Description of the data
The elastic buckling behavior of perforated steel beams, specifically cellular beams with repeating web openings, has been studied for over a century (Degtyarev & Tsavdaridis, 2022; Sweedan, 2011). The advantages of these beams over solid-web steel beams include reduced weight, a higher strength-to-weight ratio, the integration of utilities, and improved aesthetics. The critical geometrical parameters of perforated steel beams are provided in Figure 11 (Rajana et al., 2020). The presence of multiple large openings in cellular beams considerably reduces beam shear strength and introduces various possible failure modes that increase the complexity of their flexural behavior and design.
Prior investigations have examined specific aspects of cellular beams and beams with different web-opening shapes (Grilo et al., 2018; Panedpojaman et al., 2014; Tsavdaridis & D'Mello, 2011). Rajana et al. (2020) performed an extensive numerical parametric study investigating elastic and inelastic buckling in cellular beams subjected to strong-axis bending. The effects of initial geometric imperfection, material nonlinearity, manufacture-introduced residual stresses, web opening diameter, web-post width, web height, flange width, web and flange thickness, end web-post width, and beam span, as well as their combinations, were examined thoroughly.
Rajana et al. (2020) investigated the impact of different parameters on elastic buckling loads and generated an extensive database of finite element simulation results comprising 3645 samples. This comprehensive dataset was used in this study to validate the effectiveness and reliability of the proposed MMOMML system in predicting elastic buckling behavior in steel cellular beams. The statistical characteristics of this dataset are shown in Table 5.

Numerical results
Due to the substantial size (3645 samples) of the dataset in this case study, the following approach was used to select a subset of the hybrid ML models in the MMOMML system for testing:
Step 1: The dataset was first used to train all ML models in the MMOMML system. The parameters of these ML models were optimized using the FBI algorithm due to its demonstrated superiority over the other optimization algorithms across a wide range of scenarios (Chou & Nguyen, 2020) and its validation in Subsection 5.1.2 as a powerful and effective MO algorithm for fine-tuning parameters across a wide range of ML models. The two most suitable ML models for the given dataset were identified based on the results.
Step 2: The two most appropriate ML models were combined with each optimization algorithm available in the system to identify the most effective hybrid ML model for the specific characteristics of the case study.This process aimed to enhance model performance and achieve the best possible results.
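The two-step procedure can be sketched generically. Since no reference implementation of the FBI optimizer is publicly standardized, a simple random search stands in for the metaheuristic here, and the candidate models and their cross-validated MAPE objective are hypothetical placeholders.

```python
import numpy as np

def random_search(objective, bounds, n_iter=50, seed=0):
    """Metaheuristic stand-in: uniform random search over a parameter box."""
    rng = np.random.default_rng(seed)
    best_x, best_f = None, np.inf
    for _ in range(n_iter):
        x = {k: rng.uniform(lo, hi) for k, (lo, hi) in bounds.items()}
        f = objective(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

# Hypothetical candidates: each maps a model name to its parameter box.
candidates = {"modelA": {"c": (0.0, 10.0)},
              "modelB": {"c": (0.0, 10.0)},
              "modelC": {"c": (0.0, 10.0)}}

# Toy cross-validated MAPE: each model has a best reachable score.
def cv_mape(name, params):
    target = {"modelA": 3.0, "modelB": 7.0, "modelC": 5.0}[name]
    floor = {"modelA": 1.0, "modelB": 4.0, "modelC": 2.0}[name]
    return floor + abs(params["c"] - target)

# Step 1: tune every model with one optimizer, keep the two best.
scores = {name: random_search(lambda p: cv_mape(name, p), bounds)[1]
          for name, bounds in candidates.items()}
shortlist = sorted(scores, key=scores.get)[:2]

# Step 2: pair each shortlisted model with every optimizer (distinct
# seeds emulate distinct MO algorithms) and keep the best hybrid.
best = (None, np.inf)
for name in shortlist:
    for seed in range(5):
        _, f = random_search(lambda p: cv_mape(name, p),
                             candidates[name], seed=seed)
        if f < best[1]:
            best = (name, f)

print(shortlist, best[0])
```

The screening in Step 1 keeps the full model-optimizer sweep in Step 2 tractable, which matters when 17 optimizers and 15 learners would otherwise yield 255 full tuning runs on a 3645-sample dataset.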
In this case study, 10-fold cross-validation was also deployed to enhance model generalizability and mitigate overfitting concerns. As previously mentioned, in the first step, the FBI algorithm was deployed to optimize the parameters of the machine learning models in the MMOMML system. The results in Table 6 showcase the predictive power of the five most accurate hybrid models. These results encompass the mean values obtained through the 10-fold cross-validation procedure and the standard deviation values for the performance metrics, separately addressing each model's learning and testing phases.
The results in Table 6 identify FBI_XGB as the most effective hybrid model. FBI_XGB exhibited the best performance across all metrics, earning a MAPE of 1.54%, an MSE of 3.77 (kN/m)², and an R² of 0.997. FBI_LSSVR was the second-best hybrid model in terms of accuracy (MAPE = 2.68% and R² = 0.996). By contrast, FBI_MARS earned higher average MSE and MAPE values, indicating lower prediction accuracy. The standard deviations provide insights into the variability of these metrics across different evaluations, with lower values generally indicating more consistent model performance. Overall, the results highlight the efficacy of FBI_XGB and FBI_LSSVR in achieving reliably accurate predictions, as well as the other models' lower reliability and higher variability in achieving the same.
Next, XGB and LSSVR, the two most productive models, were integrated with each optimization algorithm in the MMOMML system to identify the optimal hybrid ML model for the unique demands of the current case study. The comprehensive evaluation metrics for the hybrid MO_XGB and MO_LSSVR models are presented in Tables 7 and 8, respectively.
The MSE, MAPE, and R² results shown in Table 7 shed light on the predictive efficacy of the top-ten MO_XGB models across both the learning and test phases. Notably, FBI_XGB stands out as a strong performer, with consistently low average MSE values of 3.67 and 3.77 for the learning and test phases, respectively, and minimal corresponding MAPE values of 1.44% and 1.54%. The relatively narrow standard deviations accompanying these values underscore the model's reliability and its ability to generate accurate predictions consistently.
Further analysis of Table 7 reveals intriguing patterns among the models. ABC_XGB, EO_XGB, SOS_XGB, TLBO_XGB, WCA_XGB, and GWO_XGB exhibited similar average MAPE values of less than 2%, indicating robust predictive capabilities. However, WCA_XGB exhibited slightly higher standard deviations, implying varying degrees of performance across different evaluation instances. In addition, both FBI_XGB and GWO_XGB stand out for their notable average R² values of 0.997 and above in both the learning and test phases, implying strong explanatory power in representing the observed variances in the dataset.
The MSE, MAPE, and R² results shown in Table 8 shed light on the predictive efficacy of the top-ten MO_LSSVR models across both the learning and test phases. Notably, EO_LSSVR showcased consistent predictive strength, with average MAPE values of 2.58% and 2.68%. A similar trend was exhibited by FBI_LSSVR, whose closely aligned average metrics and balanced standard deviations underline its stability.
The comprehensive assessment of the models in Table 8 reveals consistently high R² values, from 0.993 to 0.996, underscoring the adeptness of these models in capturing data variance despite slight differences across distinct evaluation instances. Concurrently, the low MAPE values observed (2.64% to 2.85%) underscore the high prediction accuracy of these models, substantiating their reliability and suitability for practical applications requiring both robust explanatory and precise forecasting capabilities.
Interpreting the interplay between average values and standard deviations is crucial to comprehensively understanding the consistency of model performance. While average values provide a measure of central tendency, standard deviations elucidate the dispersion of results, highlighting a model's stability across different iterations. The data in Tables 7 and 8 permit the predictive aptitudes of the various learning models to be compared and contrasted, allowing the most reliable contenders to be identified and highlighting those models that may exhibit varied outcomes under differing conditions.
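The point about averages versus standard deviations can be illustrated numerically; the fold-wise MAPE values below are invented for two hypothetical models with identical means but different spreads.

```python
import numpy as np

# Invented fold-wise MAPE scores (%) for two hypothetical hybrid models.
model_a = np.array([1.4, 1.5, 1.6, 1.5, 1.4, 1.6, 1.5, 1.5, 1.6, 1.4])
model_b = np.array([0.9, 2.3, 1.1, 2.0, 0.8, 2.2, 1.0, 2.1, 0.9, 1.7])

for name, scores in [("model_a", model_a), ("model_b", model_b)]:
    mean = scores.mean()          # central tendency over the ten folds
    std = scores.std(ddof=1)      # dispersion across folds
    print(name, round(float(mean), 3), round(float(std), 3))

# Both models average 1.5% MAPE, but model_b's larger standard deviation
# reveals far less consistent performance across the ten folds.
```

A model with a marginally better mean but a much wider spread may be the riskier choice in practice, which is why Tables 7 and 8 report both statistics.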

Conclusions
In this research, an innovative application interface called the Multiple Metaheuristic Optimizers – Multiple Machine Learners (MMOMML) system was developed and used to address the predictive and estimation challenges encountered in steel structures. Although machine learning (ML) is now widely used to predict mechanical strength in structural engineering, much of the related work in the literature has focused on concrete and reinforced concrete structural components. Furthermore, many of these studies have relied on subjectively selected metaheuristic optimization (MO) algorithms and ML models to construct their hybrid ML models, leaving open the possibility that other, unexamined hybrid models may perform even better.
To bridge these gaps, we harnessed two specific datasets related to distinct steel structure properties, in conjunction with advanced algorithms and models, to develop the novel MMOMML system. Advanced artificial intelligence (AI) techniques encompassing 17 MO algorithms and 15 ML techniques were integrated into this system to generate a comprehensive suite of 255 hybrid AI models that expose unexplored, potentially fruitful avenues for novel hybrid model development. Moreover, by automating the training of models on historical real-world data, the MMOMML system bridges the gap in the literature regarding the prediction of structural behavior in steel structural components. The user-friendly system interface empowers structural engineers to navigate inference challenges related to steel structures, leveraging existing data without requiring coding knowledge. Furthermore, the system's potential to address a wide range of practical challenges affecting steel structures recommends MMOMML as a dependable tool for structural engineers that meets the expectations of both practitioners and researchers.
Two datasets, addressing the shear strength of steel beams and elastic buckling in steel cellular beams, were used to validate the proposed MMOMML system. The former comprised 90 experimental results curated from 19 published papers, and the latter comprised a comprehensive database of 3645 finite element simulation results. The system performance evaluation used the ten-fold cross-validation technique augmented by three evaluation metrics: MSE, MAPE, and R². Together, the ten-fold cross-validation technique and these metrics provide comprehensive insights into the proposed system's predictive accuracy, error distribution, and explanatory power. To support transparency and reproducibility, all information about the 10-fold cross-validation dataset and the optimal values of all hyperparameters of the hybrid models within the MMOMML system is provided at https://drive.google.com/drive/folders/1jt2966noFuHkterKaUIFTkhFENNyXsGi?usp=sharing.
The rigorous analysis in this study highlights two key observations. The first is the pivotal role of selecting the most appropriate ML model for a given dataset: appropriate models combined with MO algorithms yield substantial predictive accuracy. The second is that judicious MO algorithm selection significantly impacts ML parameter optimization, influencing performance accuracy. By providing the most effective hybrid ML models for each specific dataset, this research contributes to enhancing the accuracy and reliability of predictive modeling in steel structures, highlighting the potential of the MMOMML system as a valuable tool for engineers addressing structural challenges.
The MMOMML system is expected to enhance the accuracy and reliability of predictive modeling in steel structures. Its potential for advancement is clear, with plans to expand its capabilities by incorporating a broader spectrum of advanced ML models and novel MO algorithms. Further diversification of ML models and MO algorithms will equip the MMOMML system to handle an even more comprehensive range of data related to steel structures, enabling structural engineers to glean invaluable insights for their projects. The system's ability to adapt and evolve ensures its relevance across myriad construction industry applications, providing a robust foundation for continued exploration and innovation. The next step is to make the MMOMML system publicly available via a dedicated online platform or website to facilitate broader access and verification. This will empower readers and researchers to utilize the system, train it with their own datasets, and validate its performance.
In conclusion, the MMOMML system, a pioneering amalgamation of data-driven insights and advanced AI techniques, marks a significant stride toward transforming how engineers approach challenges in steel structures. The system's ability to generate accurate predictions, optimize parameters, and foster interdisciplinary collaboration between AI and engineering offers the opportunity to redefine the landscape of structural analysis and design. The MMOMML system is well positioned to catalyze advancements in both the theoretical and practical realms of steel-structure engineering by providing a dynamic, adaptable, and efficient predictive tool for engineers and researchers. While this study focused on validating the system's performance within the context of steel structures, its applicability is anticipated to extend to estimation and prediction challenges in diverse domains, including but not limited to concrete structures and construction engineering management. This demonstrates the versatility of the MMOMML system in addressing a broader spectrum of predictive modeling challenges across various construction-related domains.

Figure 1 .
Figure 1. ML methods used in the structural engineering field

Figure 3 .
Figure 3. ML applications in steel structural engineering

Figure 4 .
Figure 4. General flowchart of MO algorithms: a – general flowchart of a single-phase MO; b – general flowchart of a multiple-phase MO

Figure 5 .
Figure 5. General procedure for integrating a metaheuristic optimizer with machine learning

Figure 6 .
Figure 6. Operational procedure used to develop hybrid models in the interface
The metrics chosen to assess model robustness for a regression problem in the training, validation, and testing phases are RMSE, MAPE, MAE, and R². Meanwhile, CAR, PRE, REC, F1_score, NPV, TPR, and NPR are chosen to assess the robustness of the model for a classification problem.

Figure 7 .
Figure 7. Structure of the MMOMML system
The user may save the optimized model by clicking on the SAVE MODEL button. The outcome of the optimization process may be exported to .xlsx files for further analysis by clicking on the EXPORT RESULT button.
■ Step 7. The user may load a previously saved optimized model to infer outcomes from new data patterns by clicking on the PREDICTION button to open a new window, which facilitates the following tasks:

Figure 8 .
Figure 8. Main interfaces of the proposed hybrid ML interface

Figure 9 .
Figure 9. Plate girder responses to minor shear loads

Table 1 .
Numbers of studies on ML applications in structural engineering, by topic

Table 2 .
Applications of ML on steel structures

Table 3 .
Input and output values of shear strength in steel plate girders

Table 4 .
Performance of models with MAPE < 20% in the MMOMML system

Table 5 .
Input and output values of elastic buckling loads in steel cellular beams

Table 6 .
Performances of the top-five hybrid models

Table 7 .
Performances of the top-ten MO_XGB models

Table 8 .
Performances of the top-ten MO_LSSVR models

Table A1 .
Information about the optimized hyperparameters and their respective ranges for all machine learning models implemented in the developed MMOMML system

Table A2 .
The collected dataset of shear strength for steel beams with flat webs