Search Results - (( using computer training algorithm ) OR ( using optimization method algorithm ))

  1.

    Optimization and discretization of dragonfly algorithm for solving continuous and discrete optimization problems by Emambocus, Bibi Amirah Shafaa

    Published 2024
    “…Based on the experimental results, the optimized DA algorithm is a much better training algorithm for ANNs as compared to the usual gradient-descent backpropagation algorithm since the resultant ANNs trained by the optimized DA achieve higher accuracy. …”
    Get full text
    Thesis
  2.

    Wavelet neural networks based solutions for elliptic partial differential equations with improved butterfly optimization algorithm training by Lee, Sen Tan, Zainuddin, Zarita, Ong, Pauline

    Published 2020
    “…To evaluate the performance of the proposed IBOA training method, the obtained results are compared to the results of the momentum backpropagation (MBP), the particle swarm optimization (PSO) and the standard butterfly optimization algorithm (BOA) training methods. …”
    Get full text
    Article
  3.

    Mussels wandering optimization algorithm based training of artificial neural networks for pattern classification by Abusnaina, Ahmed A., Abdullah, Rosni

    Published 2013
    “…Traditional training algorithms have some drawbacks such as local minima and slowness. Therefore, evolutionary algorithms are utilized to train neural networks to overcome these issues. This research tackles ANN training by adapting the Mussels Wandering Optimization (MWO) algorithm. The proposed method was tested and verified by training an ANN on well-known benchmarking problems. Two criteria used to evaluate the proposed method were overall training time and classification accuracy. The obtained results indicate that the MWO algorithm is on par or better in terms of classification accuracy and convergence training time. …”
    Get full text
    Conference or Workshop Item
  4.

    Training functional link neural network with ant lion optimizer by Mohmad Hassim, Yana Mazwin, Ghazali, Rozaida

    Published 2020
    “…The Ant Lion Optimizer (ALO) is the metaheuristic optimization algorithm that mimics the hunting mechanism of antlions in nature. …”
    Get full text
    Conference or Workshop Item
  6.

    A review of training methods of ANFIS for applications in business and economic by Mohd Salleh, Mohd Najib, Hussain, Kashif

    Published 2016
    “…Therefore many researchers have trained ANFIS parameters using metaheuristic algorithms however very few have considered optimizing the ANFIS rule-base. …”
    Get full text
    Article
  7.

    Accelerated mine blast algorithm for ANFIS training for solving classification problems by Mohd Salleh, Mohd Najib, Hussain, Kashif

    Published 2016
    “…Keeping in view the drawbacks of gradient-based learning of ANFIS using gradient descent and least square methods in the two-pass learning algorithm, many have trained ANFIS using metaheuristic algorithms. …”
    Get full text
    Article
  8.

    A review of training methods of ANFIS for applications in business and economics by Mohd Salleh, Mohd Najib, Hussain, Kashif

    Published 2016
    “…Therefore many researchers have trained ANFIS parameters using metaheuristic algorithms however very few have considered optimizing the ANFIS rule-base. …”
    Get full text
    Article
  9.

    Autoreclosure in Extra High Voltage Lines using Taguchi's Method and Optimized Neural Networks by Desta, Zahlay F., K.S., Rama Rao

    Published 2009
    “…The fault identification prior to reclosing is based on optimized artificial neural network associated with standard Error Back-Propagation, Levenberg Marquardt Algorithm and Resilient Back-Propagation training algorithms together with Taguchi’s Method. …”
    Get full text
    Conference or Workshop Item
  10.

    Optimising neural network training efficiency through spectral parameter-based multiple adaptive learning rates by Yeong, Lin Koay, Hong, Seng Sim, Yong, Kheng Goh, Sing, Yee Chua, Wah, June Leong

    Published 2024
    “…The process of training neural networks heavily involves solving optimization problems. Most optimization algorithms use a …”
    Get full text
    Conference or Workshop Item
  12.

    Multi objective genetic algorithm for training three term backpropagation network by Osman Ibrahim, Ashraf, Shamsuddin, Siti Mariyam, Ahmad, Nor Bahiah, Qasem, Sultan Noman

    Published 2013
    “…Multi Objective Evolutionary Algorithms have been applied to the learning problem in Artificial Neural Networks to improve generalization on training data and unseen testing data. This paper proposes a simultaneous optimization method for training the Three Term Back Propagation Network (TTBPN) using a Multi Objective Genetic Algorithm. The Non-dominated Sorting Genetic Algorithm II is applied to optimize the TTBPN structure by simultaneously reducing the error and the complexity, in terms of the number of hidden nodes of the network, for better accuracy in classification problems. This methodology is applied to two multiclass data sets obtained from the University of California at Irvine repository. The results obtained for training and testing on the datasets illustrate less network error and better classification accuracy, besides a simpler architecture for the TTBPN. …”
    Get full text
    Conference or Workshop Item
  13.

    Weight Optimization in Recurrent Neural Networks with Hybrid Metaheuristic Cuckoo Search Techniques for Data Classification by Nawi, N.M., Khan, A., Rehman, M.Z., Chiroma, H., Herawan, T.

    Published 2015
    “…The proposed CSERN and CSBPERN algorithms are compared with the artificial bee colony using the BP algorithm and other hybrid variant algorithms. …”
    Get full text
    Article
  14.

    Evolution Performance of Symbolic Radial Basis Function Neural Network by Using Evolutionary Algorithms by Alzaeemi, Shehab Abdulhabib, Tay, Kim Gaik, Huong, Audrey, Sathasivam, Saratha, Majahar Ali, Majid Khan

    Published 2023
    “…With the use of SRBFNN-2SAT, a training method based on these algorithms has been presented, and training has been compared among the algorithms, which were implemented in Microsoft Visual C++ software, using multiple performance metrics, including Mean Absolute Relative Error (MARE), Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE), Mean Bias Error (MBE), Systematic Error (SD), Schwarz Bayesian Criterion (SBC), and Central Processing Unit time (CPU time). …”
    Get full text
    Article
  15.

    PROPOSED METHODOLOGY FOR OPTIMIZING THE TRAINING PARAMETERS OF A MULTILAYER FEED-FORWARD ARTIFICIAL NEURAL NETWORKS USING A GENETIC ALGORITHM by ABDALLA, OSMAN AHMED

    Published 2011
    “…To overcome these limitations, there have been attempts to use genetic algorithm (GA) to optimize some of these parameters. …”
    Get full text
    Thesis
  16.

    Hybrid honey badger algorithm with artificial neural network (HBA-ANN) for website phishing detection by Muhammad Arif, Mohamad, Muhammad Aliif, Ahmad, Zuriani, Mustaffa

    Published 2024
    “…HBA, as a metaheuristic algorithm, is used to optimize the network training process of the ANN to improve its performance. …”
    Get full text
    Article
  18.

    Integrating genetic algorithms and fuzzy c-means for anomaly detection by Chimphlee, Witcha, Abdullah, Abdul Hanan, Sap, Noor Md., Chimphlee, Siriporn, Srinoy, Surat

    Published 2005
    “…Fuzzy c-Means allows objects to belong to several clusters simultaneously, with different degrees of membership. Genetic Algorithms (GA) are applied to the problem of selecting optimized feature subsets to reduce the error caused by using land-selected features. …”
    Get full text
    Conference or Workshop Item
  19.

    Evolution Performance of Symbolic Radial Basis Function Neural Network by Using Evolutionary Algorithms by Shehab Abdulhabib Alzaeemi, Kim Gaik Tay, Audrey Huong, Saratha Sathasivam, Majahar Ali, Majid Khan

    Published 2023
    “…With the use of SRBFNN-2SAT, a training method based on these algorithms has been presented, and training has been compared among the algorithms, which were implemented in Microsoft Visual C++ software, using multiple performance metrics, including Mean Absolute Relative Error (MARE), Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE), Mean Bias Error (MBE), Systematic Error (SD), Schwarz Bayesian Criterion (SBC), and Central Processing Unit time (CPU time). …”
    Get full text
    Article
  20.

    Evolution Performance of Symbolic Radial Basis Function Neural Network by Using Evolutionary Algorithms by Shehab Abdulhabib Alzaeemi, Kim Gaik Tay, Audrey Huong, Saratha Sathasivam, Majid Khan bin Majahar Ali

    Published 2023
    “…With the use of SRBFNN-2SAT, a training method based on these algorithms has been presented, and training has been compared among the algorithms, which were implemented in Microsoft Visual C++ software, using multiple performance metrics, including Mean Absolute Relative Error (MARE), Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE), Mean Bias Error (MBE), Systematic Error (SD), Schwarz Bayesian Criterion (SBC), and Central Processing Unit time (CPU time). …”
    Get full text
    Article