Search Results - (( based application learning algorithm ) OR ( parameter optimization method algorithm ))
Search alternatives:
- parameter optimization
- application learning
- learning algorithm
- based application
- method algorithm
1
Modeling time series data using Genetic Algorithm based on Backpropagation Neural network
Published 2018. “…This study showed the task of optimizing the topology structure and the parameter values (e.g., weights) used in the BPNN learning algorithm by using the GA. …”
Get full text
Thesis
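The GA-over-BPNN idea in entry 1 can be sketched compactly: a genetic algorithm evolves a flat weight vector for a small feed-forward network instead of training it by backpropagation. This is only an illustrative sketch; the 2-2-1 topology, the XOR task, and all GA settings (population size, elitism fraction, mutation rate) are assumptions, not the study's actual setup.

```python
import math
import random

# Toy dataset: XOR, a standard sanity check for a small neural network.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
Y = [0, 1, 1, 0]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(w, x):
    # A fixed 2-2-1 topology; the 9 weights are flattened into one vector
    # so the GA can treat the whole network as a single chromosome.
    h0 = sigmoid(w[0] * x[0] + w[1] * x[1] + w[2])
    h1 = sigmoid(w[3] * x[0] + w[4] * x[1] + w[5])
    return sigmoid(w[6] * h0 + w[7] * h1 + w[8])

def fitness(w):
    # Mean squared error over the dataset; lower is better.
    return sum((predict(w, x) - y) ** 2 for x, y in zip(X, Y)) / len(X)

def evolve(pop_size=50, generations=200, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(9)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 5]          # elitism: keep the best 20%
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)       # parent selection from the elite
            cut = rng.randrange(9)            # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:            # occasional Gaussian mutation
                child[rng.randrange(9)] += rng.gauss(0, 1)
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

best = evolve()   # best weight vector found by the GA
```

Because selection only ever acts on the error, the same loop works unchanged when the "fitness" is any black-box training error, which is what makes GA training attractive for non-differentiable models.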
2
A review of training methods of ANFIS for applications in business and economics
Published 2016. “…Therefore, many researchers have trained ANFIS parameters using metaheuristic algorithms; however, very few have considered optimizing the ANFIS rule-base. …”
Get full text
Article
3
Algorithmic Loan Risk Prediction Method Based on PSO-EBGWO-CatBoost
Published 2024. “…PSO-EBGWO is used to optimize the parameters of the CatBoost model. In this method, the enhanced Grey Wolf optimization algorithm (EBGWO) is further optimized by particle swarm optimization (PSO); this combination improves the model’s convergence performance, reduces its parameter count, and simplifies the model. …”
Get full text
Article
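Entry 3 uses PSO to tune a model's parameters. A canonical global-best PSO loop on a stand-in objective looks like the following; the sphere function, the swarm size, and the inertia/acceleration coefficients are illustrative assumptions, and in the paper's setting the objective would instead be CatBoost's validation loss as a function of its hyperparameters.

```python
import random

def pso(f, dim, bounds, n_particles=30, iters=100, seed=1):
    # Canonical PSO: each particle remembers its personal best (pbest)
    # and the whole swarm shares a global best (gbest).
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5   # inertia and acceleration coefficients (assumed)
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in objective: the sphere function (minimum 0 at the origin).
sphere = lambda x: sum(v * v for v in x)
best, val = pso(sphere, dim=3, bounds=(-5, 5))
```

Hybrid schemes like PSO-EBGWO typically replace or augment the velocity update with another algorithm's move operator while keeping this same evaluate-and-compare skeleton.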
4
A review of training methods of ANFIS for applications in business and economics
Published 2016. “…Therefore, many researchers have trained ANFIS parameters using metaheuristic algorithms; however, very few have considered optimizing the ANFIS rule-base. …”
Get full text
Article
5
Advances of metaheuristic algorithms in training neural networks for industrial applications
Published 2023. “…Backpropagation; Gradient methods; Neural networks; Artificial neural network models; Complex applications; Exploration and exploitation; Gradient-based learning; Industry applications; Metaheuristic algorithm; Meta-heuristic search algorithms; Near-optimal solutions; Optimization…”
Article
6
Enhanced Harris's Hawk algorithm for continuous multi-objective optimization problems
Published 2020. “…Harris’s hawk multi-objective optimizer (HHMO) algorithm is a MOSI-based algorithm that was developed based on the reference point approach. …”
Get full text
Thesis
7
Identification of continuous-time model of Hammerstein system using modified multi-verse optimizer
Published 2021. “…It has been successfully implemented and used in various areas such as machine learning applications, engineering applications, network applications, parameter control, and other similar applications to solve optimization problems. …”
Get full text
Thesis
8
Gender classification on skeletal remains: efficiency of metaheuristic algorithm method and optimized back propagation neural network
Published 2020. “…Besides that, another limitation that exists in previous research is the absence of parameter optimization for the classifier. Thus, this paper proposed metaheuristic algorithms such as Particle Swarm Optimization, Ant Colony Algorithm, and Harmony Search Algorithm based feature selection to identify the most significant features of skeletal remains. …”
Get full text
Article
9
Hybridization of metaheuristic algorithm in training radial basis function with dynamic decay adjustment for condition monitoring / Chong Hue Yee
Published 2023. “…In this research work, the motivation is to develop an autonomous learning model based on the hybridization of an adaptive ANN and a metaheuristic algorithm for optimizing ANN parameters so that the network could perform learning and adaptation in a more flexible way and handle condition classification tasks more accurately in industries, such as in power systems. …”
Get full text
Thesis
10
Towards enhanced remaining useful life prediction of lithium-ion batteries with uncertainty using optimized deep learning algorithm
Published 2025. “…In addition, to validate the prediction performance of the proposed LSA + LSTM model, extensive comparisons are performed with other popular optimization-based deep learning methods, including artificial bee colony (ABC) based LSTM (ABC + LSTM), gravitational search algorithm (GSA) based LSTM (GSA + LSTM), and particle swarm optimization (PSO) based LSTM (PSO + LSTM), using different error metrics. …”
Article
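The model comparisons in entry 10 come down to scoring each optimizer-plus-LSTM variant with standard error metrics. A minimal sketch of the three most common ones follows; the capacity values shown are hypothetical, not data from the paper.

```python
import math

def rmse(y_true, y_pred):
    # Root mean squared error: penalizes large deviations quadratically.
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mae(y_true, y_pred):
    # Mean absolute error: average magnitude of the residuals.
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def mape(y_true, y_pred):
    # Mean absolute percentage error; assumes no zero targets.
    return 100.0 * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical capacity-fade trajectory (measured) vs. a model's predictions.
true_cap = [1.00, 0.98, 0.95, 0.93]
pred_cap = [1.01, 0.97, 0.96, 0.90]
scores = (rmse(true_cap, pred_cap),
          mae(true_cap, pred_cap),
          mape(true_cap, pred_cap))
```

Reporting more than one metric matters because RMSE and MAE rank models differently when one model makes a few large errors and another makes many small ones.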
11
Accelerated mine blast algorithm for ANFIS training for solving classification problems
Published 2016. “…ANFIS accuracy depends on the parameters it is trained with. Keeping in view the drawbacks of gradient-based learning of ANFIS using gradient descent and least-squares methods in the two-pass learning algorithm, many have trained ANFIS using metaheuristic algorithms. …”
Get full text
Article
12
Acceleration Strategies For The Backpropagation Neural Network Learning Algorithm
Published 2001. “…However, as with many gradient-based optimization methods, it converges slowly and it scales up poorly as tasks become larger and more complex. …”
Get full text
Thesis
13
Differential evolution for neural networks learning enhancement
Published 2008. “…These algorithms can be used successfully in many applications requiring the optimization of a certain multi-dimensional function. …”
Get full text
Thesis
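Entry 13 trains neural networks with differential evolution. The classic DE/rand/1/bin variant can be sketched as below; the sphere objective and all control parameters (population size, F, CR) are assumptions for illustration, and in the thesis's setting the objective would be the network's training error as a function of its flattened weight vector.

```python
import random

def differential_evolution(f, dim, bounds, np_=20, cr=0.9, fw=0.8, gens=150, seed=2):
    # DE/rand/1/bin: mutant = a + F * (b - c), then binomial crossover
    # with the target vector and greedy one-to-one selection.
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(np_)]
    vals = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(np_):
            a, b, c = rng.sample([j for j in range(np_) if j != i], 3)
            jrand = rng.randrange(dim)   # guarantees at least one mutated gene
            trial = [
                min(hi, max(lo, pop[a][d] + fw * (pop[b][d] - pop[c][d])))
                if (rng.random() < cr or d == jrand) else pop[i][d]
                for d in range(dim)
            ]
            tval = f(trial)
            if tval <= vals[i]:          # keep the trial only if it is no worse
                pop[i], vals[i] = trial, tval
    best = min(range(np_), key=lambda i: vals[i])
    return pop[best], vals[best]

# Stand-in objective: the sphere function (minimum 0 at the origin).
sphere = lambda x: sum(v * v for v in x)
sol, val = differential_evolution(sphere, dim=4, bounds=(-5, 5))
```

The scaled difference vector F * (b - c) is what gives DE its self-adapting step size: steps shrink automatically as the population converges.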
14
A decomposed streamflow non-gradient-based artificial intelligence forecasting algorithm with factoring in aleatoric and epistemic variables / Wei Yaxing
Published 2024. “…To summarise, metaheuristic algorithms can give a superior optimization approach to the traditional artificial neural network method, provided the computing time is within an acceptable range. …”
Get full text
Thesis
15
Optimal power flow based on fuzzy linear programming and modified Jaya algorithms
Published 2017. “…In the proposed novel QOJaya algorithm, an intelligence strategy, namely quasi-oppositional based learning (QOBL), is incorporated into the basic Jaya algorithm to enhance its convergence speed and solution optimality. …”
Get full text
Thesis
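The quasi-oppositional based learning (QOBL) strategy mentioned in entry 15 has a very small core: for a solution component x in [lo, hi], the opposite point is lo + hi - x, and the quasi-opposite is a random point between the interval's centre and that opposite. A sketch of that operation, with illustrative numbers:

```python
import random

def quasi_opposite(x, lo, hi, rng):
    # Opposition-based learning maps x in [lo, hi] to its opposite lo + hi - x.
    # QOBL instead samples a random point between the interval's centre and
    # that opposite, which tends to give more diverse candidate solutions.
    centre = (lo + hi) / 2.0
    opposite = lo + hi - x
    a, b = sorted((centre, opposite))
    return rng.uniform(a, b)

rng = random.Random(0)
candidate = 1.0                               # a solution component in [0, 10]
qo = quasi_opposite(candidate, 0.0, 10.0, rng)  # lies between 5.0 and 9.0
```

In QOJaya this would be applied to each component of selected population members each generation, keeping whichever of the original or quasi-opposite solution scores better.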
16
Optimal power flow using a hybridization algorithm of arithmetic optimization and aquila optimizer
Published 2024. “…In this paper, a hybridization method based on the Arithmetic Optimization Algorithm (AOA) and the Aquila Optimizer (AO), namely AO-AOA, is applied to solve the Optimal Power Flow (OPF) problem to independently optimize generation fuel cost, power loss, emission, voltage deviation, and the L index. …”
Get full text
Article
17
Comparison of Logistic Regression, Random Forest, SVM, KNN Algorithm for Water Quality Classification Based on Contaminant Parameters
Published 2024. “…This research provides new insights into the application of machine learning algorithms for water quality management, as well as guidance for optimal algorithm selection. …”
Get full text
Article
18
Parameter estimation in computational systems biology models: a comparative study of initialization methods in global optimization
Published 2022. “…A global optimization method based on an enhanced scatter search (ESS) algorithm is a suitable choice to address this issue. …”
Get full text
Article
19
Nature-inspired parameter controllers for ACO-based reactive search
Published 2015. “…This study proposes machine learning strategies to control the parameter adaptation in the ant colony optimization algorithm, the prominent swarm intelligence metaheuristic. The sensitivity to parameter selection is one of the main limitations of swarm intelligence algorithms when solving combinatorial problems. These parameters are often tuned manually by algorithm experts to a set that seems to work well for the problem under study, a standard set from the literature, or using off-line parameter tuning procedures. …”
Get full text
Article
20
Parameter characterization of PEM fuel cell mathematical models using an orthogonal learning-based GOOSE algorithm
Published 2025. “…The orthogonal learning mechanism improves the performance of the original GOOSE algorithm. This FC model uses the root mean squared error as the objective function for optimizing the unknown parameters. …”
Article
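Entry 20's setup, fitting unknown model parameters by minimizing an RMSE objective, can be sketched end to end. Everything here is a stand-in: the simplified polarization-curve model, the synthetic "measured" data, and the use of plain random search in place of the paper's orthogonal-learning GOOSE algorithm are all assumptions made to keep the example short and self-contained.

```python
import math
import random

def model(params, i):
    # Simplified polarization-curve shape: open-circuit voltage minus
    # activation (logarithmic) and ohmic (linear) losses. The paper's PEM
    # model has more terms; this stand-in only illustrates the fitting loop.
    e0, b, r = params
    return e0 - b * math.log(i) - r * i

currents = [0.1 * k for k in range(1, 21)]   # load currents (assumed grid)
true_params = (1.1, 0.05, 0.2)               # synthetic "measured" cell
data = [model(true_params, i) for i in currents]

def rmse_objective(params):
    # Root mean squared error between model output and measured voltages,
    # the same kind of objective the paper optimizes over unknown parameters.
    errs = [(model(params, i) - v) ** 2 for i, v in zip(currents, data)]
    return math.sqrt(sum(errs) / len(errs))

def random_search(obj, bounds, iters=30000, seed=3):
    # Any metaheuristic (e.g. the GOOSE algorithm) could sit here; plain
    # random search keeps the sketch short and dependency-free.
    rng = random.Random(seed)
    best, best_val = None, float("inf")
    for _ in range(iters):
        cand = tuple(rng.uniform(lo, hi) for lo, hi in bounds)
        v = obj(cand)
        if v < best_val:
            best, best_val = cand, v
    return best, best_val

fit, err = random_search(rmse_objective, [(0.5, 1.5), (0.0, 0.2), (0.0, 0.5)])
```

Swapping `random_search` for a smarter optimizer changes only how candidates are proposed; the RMSE objective and the model evaluation loop stay exactly the same, which is why this pattern recurs across nearly all the parameter-estimation entries above.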
