Search Results - (( using computer training algorithm ) OR ( using optimization method algorithm ))
1
Optimization and discretization of dragonfly algorithm for solving continuous and discrete optimization problems
Published 2024 “…Based on the experimental results, the optimized DA algorithm is a much better training algorithm for ANNs as compared to the usual gradient-descent backpropagation algorithm, since the resultant ANNs trained by the optimized DA achieve higher accuracy. …”
Get full text
Thesis -
2
Wavelet neural networks based solutions for elliptic partial differential equations with improved butterfly optimization algorithm training
Published 2020 “…To evaluate the performance of the proposed IBOA training method, the obtained results are compared to the results of the momentum backpropagation (MBP), the particle swarm optimization (PSO) and the standard butterfly optimization algorithm (BOA) training methods. …”
Get full text
Article -
3
Mussels wandering optimization algorithm based training of artificial neural networks for pattern classification
Published 2013 “…Traditional training algorithms have drawbacks such as local minima and slow convergence. Therefore, evolutionary algorithms are utilized to train neural networks to overcome these issues. This research tackles ANN training by adapting the Mussels Wandering Optimization (MWO) algorithm. The proposed method was tested and verified by training an ANN on well-known benchmark problems. Two criteria were used to evaluate the proposed method: overall training time and classification accuracy. The obtained results indicate that the MWO algorithm is on par with or better than existing methods in terms of classification accuracy and convergence training time.…”
Get full text
Conference or Workshop Item -
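The record above describes replacing gradient-based ANN training with a population-based metaheuristic that searches directly over the weight vector. As a minimal sketch of that general idea (a generic elitist Gaussian perturbation search, not the actual MWO update rules; network shape and data are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny one-hidden-layer network; weights are packed into one flat vector so a
# population-based optimizer can treat training as black-box minimization.
def forward(w, X):
    W1 = w[:8].reshape(2, 4)    # 2 inputs -> 4 hidden units
    W2 = w[8:12].reshape(4, 1)  # 4 hidden units -> 1 output
    return np.tanh(np.tanh(X @ W1) @ W2)

def mse(w, X, y):
    return float(np.mean((forward(w, X).ravel() - y) ** 2))

# XOR-style toy classification data.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

# Generic elitist population search: candidates wander around the best
# solution found so far by Gaussian perturbation; the best is always kept.
pop = rng.normal(size=(20, 12))
best = min(pop, key=lambda w: mse(w, X, y))
for _ in range(300):
    pop = best + rng.normal(scale=0.3, size=pop.shape)
    cand = min(pop, key=lambda w: mse(w, X, y))
    if mse(cand, X, y) < mse(best, X, y):
        best = cand

print(mse(best, X, y))  # final training loss
```

Unlike backpropagation, this needs no gradients, which is what lets such methods sidestep the local-minima and slowness issues the abstract mentions, at the cost of many more loss evaluations.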
4
Training functional link neural network with ant lion optimizer
Published 2020 “…The Ant Lion Optimizer (ALO) is a metaheuristic optimization algorithm that mimics the hunting mechanism of antlions in nature. …”
Get full text
Conference or Workshop Item -
5
-
6
A review of training methods of ANFIS for applications in business and economics
Published 2016 “…Therefore, many researchers have trained ANFIS parameters using metaheuristic algorithms; however, very few have considered optimizing the ANFIS rule base. …”
Get full text
Article -
7
Accelerated mine blast algorithm for ANFIS training for solving classification problems
Published 2016 “…Keeping in view the drawbacks of gradient-based learning of ANFIS using gradient descent and least-squares methods in the two-pass learning algorithm, many have trained ANFIS using metaheuristic algorithms. …”
Get full text
Article -
8
A review of training methods of ANFIS for applications in business and economics
Published 2016 “…Therefore, many researchers have trained ANFIS parameters using metaheuristic algorithms; however, very few have considered optimizing the ANFIS rule base. …”
Get full text
Article -
9
Autoreclosure in Extra High Voltage Lines using Taguchi's Method and Optimized Neural Networks
Published 2009 “…The fault identification prior to reclosing is based on an optimized artificial neural network associated with the standard Error Back-Propagation, Levenberg-Marquardt, and Resilient Back-Propagation training algorithms, together with Taguchi’s Method. …”
Get full text
Conference or Workshop Item -
10
Optimising neural network training efficiency through spectral parameter-based multiple adaptive learning rates
Published 2024 “…The process of training neural networks heavily involves solving optimization problems. Most optimization algorithms use a …”
Get full text
Conference or Workshop Item -
11
Static code analysis of permission-based features for android malware classification using apriori algorithm with particle swarm optimization
Published 2015 “…The algorithm is improved with Particle Swarm Optimization that trains three different supervised classifiers. …”
Get full text
Article -
12
Multi objective genetic algorithm for training three term backpropagation network
Published 2013 “…Multi-Objective Evolutionary Algorithms have been applied to the learning problem in Artificial Neural Networks to improve generalization on training and unseen test data. This paper proposes a simultaneous optimization method for training the Three Term Back Propagation Network (TTBPN) using a Multi-Objective Genetic Algorithm. The Non-dominated Sorting Genetic Algorithm II is applied to optimize the TTBPN structure by simultaneously reducing the error and the complexity, in terms of the number of hidden nodes of the network, for better accuracy in classification problems. This methodology is applied to two multiclass data sets obtained from the University of California at Irvine repository. The results obtained for training and testing on the datasets show lower network error and better classification accuracy, besides a simpler architecture for the TTBPN.…”
Get full text
Conference or Workshop Item -
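The record above trades off two objectives (network error vs. number of hidden nodes) using non-dominated sorting. A minimal sketch of the underlying Pareto-dominance test and the first non-dominated front (textbook definitions, not the paper's NSGA-II implementation; the candidate values are hypothetical):

```python
# Each candidate network is scored on two objectives to minimize:
# (classification error, number of hidden nodes).
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def first_front(points):
    """Keep only candidates not dominated by any other candidate."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# (error, hidden_nodes) pairs for hypothetical candidate networks.
cands = [(0.10, 8), (0.05, 12), (0.10, 6), (0.20, 4), (0.05, 14)]
print(first_front(cands))  # the Pareto front: no candidate beats these on both axes
```

NSGA-II repeatedly peels off such fronts to rank a population, which is how it keeps both low-error and low-complexity networks alive during the search.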
13
Weight Optimization in Recurrent Neural Networks with Hybrid Metaheuristic Cuckoo Search Techniques for Data Classification
Published 2015 “…The proposed CSERN and CSBPERN algorithms are compared with the artificial bee colony using the BP algorithm and other hybrid variant algorithms. …”
Get full text
Article -
14
Evolution Performance of Symbolic Radial Basis Function Neural Network by Using Evolutionary Algorithms
Published 2023 “…With the use of SRBFNN-2SAT, a training method based on these algorithms has been presented; training was then compared among the algorithms, implemented in Microsoft Visual C++, using multiple performance metrics, including Mean Absolute Relative Error (MARE), Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE), Mean Bias Error (MBE), Systematic Error (SD), Schwarz Bayesian Criterion (SBC), and Central Processing Unit time (CPU time). …”
Get full text
Article -
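Several of the error metrics listed in the record above have standard textbook definitions. A hedged sketch under those usual definitions (the paper's exact formulas may differ; the data is illustrative):

```python
import numpy as np

def metrics(y_true, y_pred):
    """Compute a few common regression error metrics on the residuals."""
    e = y_pred - y_true
    return {
        "RMSE": float(np.sqrt(np.mean(e ** 2))),               # root mean square error
        "MAPE": float(np.mean(np.abs(e / y_true)) * 100.0),    # mean abs. percentage error
        "MBE":  float(np.mean(e)),                             # mean bias error (signed)
        "MARE": float(np.mean(np.abs(e) / np.abs(y_true))),    # mean abs. relative error
    }

y_true = np.array([2.0, 4.0, 5.0])
y_pred = np.array([2.5, 3.5, 5.0])
print(metrics(y_true, y_pred))
```

Note that MBE keeps the sign of the residuals, so over- and under-prediction can cancel; RMSE and MAPE cannot, which is why such comparisons usually report several metrics together.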
15
PROPOSED METHODOLOGY FOR OPTIMIZING THE TRAINING PARAMETERS OF A MULTILAYER FEED-FORWARD ARTIFICIAL NEURAL NETWORKS USING A GENETIC ALGORITHM
Published 2011 “…To overcome these limitations, there have been attempts to use a genetic algorithm (GA) to optimize some of these parameters. …”
Get full text
Thesis -
16
Hybrid honey badger algorithm with artificial neural network (HBA-ANN) for website phishing detection
Published 2024 “…HBA, as a metaheuristic algorithm, is used to optimize the network training process of the ANN to improve its performance. …”
Get full text
Article -
17
A new approach for forecasting OPEC petroleum consumption based on neural network train by using flower pollination algorithm
Published 2016 “…Many meta-heuristic algorithms have been proposed in the literature for the optimization of Neural Networks (NN) to build a forecasting model. …”
Get full text
Article -
18
Integrating genetic algorithms and fuzzy c-means for anomaly detection
Published 2005 “…Fuzzy c-Means allows objects to belong to several clusters simultaneously, with different degrees of membership. Genetic Algorithms (GA) are applied to the problem of selecting optimized feature subsets, to reduce the error caused by using hand-selected features. …”
Get full text
Conference or Workshop Item -
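The record above relies on fuzzy c-means' defining property: each object belongs to every cluster with a degree of membership. A minimal sketch of the standard membership update rule (textbook form with fuzzifier m; not the paper's GA-integrated implementation, and the points and centers are illustrative):

```python
import numpy as np

def memberships(X, centers, m=2.0):
    """Standard fuzzy c-means membership degrees: one row per point,
    one column per cluster; each row sums to 1."""
    # Pairwise point-to-center distances, shape (n_points, n_clusters).
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    d = np.fmax(d, 1e-12)  # avoid divide-by-zero when a point sits on a center
    # u[i, c] = 1 / sum_k (d[i, c] / d[i, k]) ** (2 / (m - 1))
    ratio = (d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))
    return 1.0 / ratio.sum(axis=2)

X = np.array([[0.0, 0.0], [1.0, 1.0], [0.1, 0.0]])
C = np.array([[0.0, 0.0], [1.0, 1.0]])  # two cluster centers
U = memberships(X, C)
print(np.round(U, 3))
```

The point midway between clusters gets split membership while points on a center get degree ≈ 1, which is the "belong to several clusters simultaneously" behavior the abstract describes.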
19
Evolution Performance of Symbolic Radial Basis Function Neural Network by Using Evolutionary Algorithms
Published 2023 “…With the use of SRBFNN-2SAT, a training method based on these algorithms has been presented; training was then compared among the algorithms, implemented in Microsoft Visual C++, using multiple performance metrics, including Mean Absolute Relative Error (MARE), Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE), Mean Bias Error (MBE), Systematic Error (SD), Schwarz Bayesian Criterion (SBC), and Central Processing Unit time (CPU time). …”
Get full text
Article -
20
Evolution Performance of Symbolic Radial Basis Function Neural Network by Using Evolutionary Algorithms
Published 2023 “…With the use of SRBFNN-2SAT, a training method based on these algorithms has been presented; training was then compared among the algorithms, implemented in Microsoft Visual C++, using multiple performance metrics, including Mean Absolute Relative Error (MARE), Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE), Mean Bias Error (MBE), Systematic Error (SD), Schwarz Bayesian Criterion (SBC), and Central Processing Unit time (CPU time). …”
Get full text
Article
