Search Results - (( based optimization method algorithm ) OR ( parameter adaptation learning algorithm ))*
Search alternatives:
- parameter adaptation
- adaptation learning
- learning algorithm
- method algorithm
1
Optimising neural network training efficiency through spectral parameter-based multiple adaptive learning rates
Published 2024. “…The process of training neural networks heavily involves solving optimization problems. Most optimization algorithms use a …”
Get full text
Conference or Workshop Item
2
Enhanced Harris's Hawk algorithm for continuous multi-objective optimization problems
Published 2020. “…Harris’s hawk multi-objective optimizer (HHMO) algorithm is a MOSI-based algorithm that was developed based on the reference point approach. …”
Get full text
Thesis
3
Modeling time series data using Genetic Algorithm based on Backpropagation Neural network
Published 2018. “…This study showed the task of optimizing the topology structure and the parameter values (e.g., weights) used in the BPNN learning algorithm by using the GA. …”
Get full text
Thesis
4
Adaptive route optimization for mobile robot navigation using evolutionary algorithm
Published 2021. “…For example, Ant Colony Optimization (ACO) is an optimization algorithm based on swarm intelligence which is widely used to solve the path planning problem. …”
Get full text
Proceedings
5
A review of training methods of ANFIS for applications in business and economics
Published 2016. “…Therefore, many researchers have trained ANFIS parameters using metaheuristic algorithms; however, very few have considered optimizing the ANFIS rule-base. …”
Get full text
Article
6
A review of training methods of ANFIS for applications in business and economics
Published 2016. “…Therefore, many researchers have trained ANFIS parameters using metaheuristic algorithms; however, very few have considered optimizing the ANFIS rule-base. …”
Get full text
Article
7
Parameter characterization of PEM fuel cell mathematical models using an orthogonal learning-based GOOSE algorithm
Published 2025. “…The orthogonal learning mechanism improves the performance of the original GOOSE algorithm. …”
Article
8
Nature-inspired parameter controllers for ACO-based reactive search
Published 2015. “…This study proposes machine learning strategies to control the parameter adaptation in the ant colony optimization algorithm, the prominent swarm intelligence metaheuristic. The sensitivity to parameter selection is one of the main limitations within swarm intelligence algorithms when solving combinatorial problems. These parameters are often tuned manually by algorithm experts to a set that seems to work well for the problem under study, a standard set from the literature, or using off-line parameter tuning procedures. …”
Get full text
Article
9
Hybridization of metaheuristic algorithm in training radial basis function with dynamic decay adjustment for condition monitoring / Chong Hue Yee
Published 2023. “…In this research work, the motivation is to develop an autonomous learning model based on the hybridization of an adaptive ANN and a metaheuristic algorithm for optimizing ANN parameters, so that the network could perform learning and adaptation in a more flexible way and handle condition classification tasks more accurately in industries such as power systems. …”
Get full text
Thesis
10
Adaptive model predictive control based on wavelet network and online sequential extreme learning machine for nonlinear systems
Published 2015. “…The WNMPC is developed by a proposed algorithm named adaptive updating rule (AUR), used with gradient descent optimization to minimize a constrained cost function over the prediction and control horizons and to offer robust control performance. …”
Get full text
Thesis
11
-
12
Fractional Stochastic Gradient Descent Based Learning Algorithm For Multi-layer Perceptron Neural Networks
Published 2021. “…The performance is highly sensitive to the optimization of learning parameters. In this study, we propose a learning algorithm for the training of MLP models. …”
Get full text
Conference or Workshop Item
13
Enhancing Wearable-Based Human Activity Recognition with Binary Nature-Inspired Optimization Algorithms for Feature Selection
Published 2026. “…Conversely, for the PAMAP2 dataset, the BDE algorithm displays superior feature selection quality and the BPSO algorithm maintains competitive performance and adaptability. …”
Get full text
Article
14
A speech recognition system based on structure equivalent fuzzy neural network trained by firefly algorithm
Published 2012
Get full text
Working Paper
15
A Modified Particle Swarm Optimization for Efficient Maximum Power Point Tracking Under Partial Shading Condition
Published 2024. “…Finally, a modified local search method using Perturb and Observe with adaptive step size (P&O-ASM) is proposed to refine the near-optimal duty cycle and track the GMPP with negligible oscillations. …”
Article
16
Deep Learning-Driven Mobility And Utility-Based Resource Management In Mm-Wave Enable Ultradense Heterogeneous Networks
Published 2025
Doctoral Thesis
17
Enhancing hyperparameters of LSTM network models through genetic algorithm for virtual learning environment prediction
Published 2025. “…Adaptive gradient-based algorithms, including ADAM, NADAM, ADADELTA, ADAGRAD, and ADAMAX, exhibited superior performance. …”
Get full text
Article
18
Cat chaotic genetic algorithm-based technique and hardware prototype for short-term electrical load forecasting
Published 2017. “…In the hybrid scheme, the initial parameters of the modified BP neural network are optimized by using the global search ability of genetic algorithm, improved by cat chaotic mapping to enrich its optimization capability. …”
Get full text
Thesis
19
Heart disease prediction using artificial neural network with ADAM optimization and harmony search algorithm
Published 2025. “…The ADAM optimizer effectively tackles challenges in continuous parameter optimization by dynamically updating the model's weights and biases, adapting the learning rate for each parameter based on accumulated historical gradient information to achieve more efficient minimization of the loss function during training. …”
Get full text
Article
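The excerpt for this record describes ADAM's core mechanism: a per-parameter learning rate adapted from accumulated gradient statistics. A minimal, self-contained sketch of the standard ADAM update rule, run on a toy objective f(x) = x² (an illustration of the generic optimizer only, not the paper's heart-disease model; the harmony search component is omitted and all hyperparameter defaults are assumed):

```python
import math

def adam_step(theta, grad, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """One ADAM update for a single scalar parameter theta."""
    m = b1 * m + (1 - b1) * grad          # running mean of gradients
    v = b2 * v + (1 - b2) * grad ** 2     # running mean of squared gradients
    m_hat = m / (1 - b1 ** t)             # bias correction for early steps
    v_hat = v / (1 - b2 ** t)
    theta = theta - lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

# Minimize f(x) = x^2 (gradient 2x) starting from x = 1.0
x, m, v = 1.0, 0.0, 0.0
for t in range(1, 2001):
    x, m, v = adam_step(x, 2 * x, m, v, t, lr=0.01)
```

Each parameter keeps its own running moments m and v, so the effective step shrinks where past gradients have been large — the per-parameter adaptation the excerpt refers to.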
20
Particle swarm optimization-based model-free adaptive control for time-varying batch processes
Published 2024. “…Further, the adopted model-free adaptive control involves seven control parameters, such as the cognitive scaling factor (φ1), social scaling factor (φ2), inertia weight (φ3), learning rate (η), control parameter update rate, exploration rate, and learning rate for MFAC, obtained by a particle swarm optimization (PSO) algorithm in combination with a criterion function performance index. …”
Get full text
Article
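This record names PSO's cognitive scaling factor (φ1), social scaling factor (φ2), and inertia weight among the tuned quantities. A minimal sketch of the standard global-best PSO velocity/position update those factors belong to, minimizing a toy sphere function (an illustration of the generic algorithm, not the paper's MFAC tuning loop; the function name and all defaults are assumptions):

```python
import random

def pso_minimize(f, dim=2, n_particles=20, iters=200,
                 w=0.7, phi1=1.5, phi2=1.5, lo=-5.0, hi=5.0, seed=0):
    """Global-best PSO: w = inertia weight, phi1/phi2 = cognitive/social factors."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm's best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + phi1 * r1 * (pbest[i][d] - pos[i][d])   # cognitive pull
                             + phi2 * r2 * (gbest[d] - pos[i][d]))     # social pull
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, best_val = pso_minimize(lambda p: sum(x * x for x in p))
```

The inertia weight damps each particle's previous velocity, while φ1 and φ2 weight the pull toward the particle's own best and the swarm's best — the sensitivity of the search to these values is what motivates tuning them, as in the entry above.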
