Search Results - (( square optimization method algorithm ) OR ( using function method algorithm ))

  1.

    A Novel Polytope Algorithm based on Nelder-Mead method for localization in wireless sensor network by Gumaida, Bassam, Abubakar, Adamu

    Published 2024
    “…Methods: It is suggested that the objective function to be optimized using NMM is the mean squared error of the ranges of all neighboring anchor nodes installed in the studied WSNs. …”
    Get full text
    Article
  2.

    Analysis of toothbrush rig parameter estimation using different model orders in Real-Coded Genetic Algorithm (RCGA) by Ainul, H. M. Y., Salleh, S. M., Halib, N., Taib, H., Fathi, M. S.

    Published 2018
    “…Real-coded genetic algorithm (RCGA) as a stochastic global search method was applied for optimization. …”
    Get full text
    Article
  3.
  4.

    Analysis of Toothbrush Rig Parameter Estimation Using Different Model Orders in Real Coded Genetic Algorithm (RCGA) by Ainul, H. M. Y., Salleh, S. M., Halib, N., Taib, H., Fathi, M. S.

    Published 2024
    “…Real-coded genetic algorithm (RCGA) as a stochastic global search method was applied for optimization. …”
    Article
  5.

    Evolution Performance of Symbolic Radial Basis Function Neural Network by Using Evolutionary Algorithms by Alzaeemi, Shehab Abdulhabib, Tay, Kim Gaik, Huong, Audrey, Sathasivam, Saratha, Majahar Ali, Majid Khan

    Published 2023
    “…With the use of SRBFNN-2SAT, a training method based on these algorithms has been presented; training was then compared among the algorithms, which were implemented in Microsoft Visual C++, using multiple performance metrics, including Mean Absolute Relative Error (MARE), Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE), Mean Bias Error (MBE), Systematic Error (SD), Schwarz Bayesian Criterion (SBC), and Central Processing Unit (CPU) time. …”
    Get full text
    Article
  6.
  7.

    Combined heat and power (CHP) economic dispatch solved using Lagrangian relaxation with surrogate subgradient multiplier updates by Sashirekha A., Pasupuleti J., Moin N.H., Tan C.S.

    Published 2023
    “…It is also seen that simple step size rules such as the 'square summable but not summable' and 'constant step size' could be used easily and lead the method to convergence. …”
    Article
  8.

    An Optimized PID Parameters for LFC in Interconnected Power Systems Using MLSL Optimization Algorithm by Najeeb, Mushtaq, Shahooth, Mohammed, Mohaisen, Arrak, Ramdan, Razali, Hamdan, Daniyal

    Published 2016
    “…Integral Square Error (ISE) is considered as the objective function for both algorithms to determine their performance index values for the same interconnected power system. …”
    Get full text
    Article
  9.

    Detecting problematic vibration on unmanned aerial vehicles via genetic-algorithm methods by Mohd Sharif, Zakaria, Mohammad Fadhil, Abas, Fatimah, Dg Jamil, Norhafidzah, Mohd Saad, Addie, Irawan, Pebrianti, Dwi

    Published 2024
    “…The fitness function with the Genetic Algorithm (GA) optimization method is tested and evaluated based on Root Mean Squared Error (RMSE), Mean Absolute Percentage Error (MAPE), and detection time. Fifty-one sets of data were collected using software-in-the-loop (SITL) methods and are used to determine the effectiveness of the proposed fitness function and GA. …”
    Get full text
    Conference or Workshop Item
  10.

    Performance of particle swarm optimization under different range of direct current motor's moment of inertia / Mohd Azri Abdul Aziz by Abdul Aziz, Mohd Azri

    Published 2018
    “…Five fitness functions were used to compare the performance of the PSO algorithm: Integral Absolute Error (IAE), Integral Time-weighted Absolute Error (ITAE), Integral Squared Error (ISE), Integral Time-weighted Squared Error (ITSE), and WTRI. …”
    Get full text
    Thesis
  11.

    Parameter identification of thermoelectric modules using enhanced slime mould algorithm (ESMA) by Ponnalagu, Dharswini, Mohd Ashraf, Ahmad, Jui, Julakha Jahan

    Published 2024
    “…Acquired results, which demonstrate lower values of RMSE and parameter deviation index than the standard SMA and other preceding algorithms such as particle swarm optimization, sine cosine algorithm, moth flame optimizer, and ant lion optimizer, ultimately verified ESMA’s efficacy as an effective approach for accurate model identification.…”
    Get full text
    Article
  12.
  13.

    Harmony search-based robust optimal controller with prior defined structure by Rafieishahemabadi, Ali

    Published 2013
    “…In this approach, a combination of two interacting levels of the HS optimization algorithm is presented. In the first level, a new method for the analytical formulation of the integral square error cost function in terms of the controller variables is elaborated, for performance evaluation purposes, by the proposed optimization algorithm. …”
    Get full text
    Thesis
  14.

    Evolution Performance of Symbolic Radial Basis Function Neural Network by Using Evolutionary Algorithms by Shehab Abdulhabib Alzaeemi, Kim Gaik Tay, Audrey Huong, Saratha Sathasivam, Majahar Ali, Majid Khan

    Published 2023
    “…With the use of SRBFNN-2SAT, a training method based on these algorithms has been presented; training was then compared among the algorithms, which were implemented in Microsoft Visual C++, using multiple performance metrics, including Mean Absolute Relative Error (MARE), Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE), Mean Bias Error (MBE), Systematic Error (SD), Schwarz Bayesian Criterion (SBC), and Central Processing Unit (CPU) time. …”
    Get full text
    Article
  15.

    Evolution Performance of Symbolic Radial Basis Function Neural Network by Using Evolutionary Algorithms by Shehab Abdulhabib Alzaeemi, Kim Gaik Tay, Audrey Huong, Saratha Sathasivam, Majid Khan bin Majahar Ali

    Published 2023
    “…With the use of SRBFNN-2SAT, a training method based on these algorithms has been presented; training was then compared among the algorithms, which were implemented in Microsoft Visual C++, using multiple performance metrics, including Mean Absolute Relative Error (MARE), Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE), Mean Bias Error (MBE), Systematic Error (SD), Schwarz Bayesian Criterion (SBC), and Central Processing Unit (CPU) time. …”
    Get full text
    Article
  16.

    Evolution Performance of Symbolic Radial Basis Function Neural Network by Using Evolutionary Algorithms by Shehab Abdulhabib Alzaeemi, Kim Gaik Tay, Audrey Huong, Saratha Sathasivam, Majahar Ali, Majid Khan

    Published 2023
    “…With the use of SRBFNN-2SAT, a training method based on these algorithms has been presented; training was then compared among the algorithms, which were implemented in Microsoft Visual C++, using multiple performance metrics, including Mean Absolute Relative Error (MARE), Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE), Mean Bias Error (MBE), Systematic Error (SD), Schwarz Bayesian Criterion (SBC), and Central Processing Unit (CPU) time. …”
    Get full text
    Article
  17.

    Evolution Performance of Symbolic Radial Basis Function Neural Network by Using Evolutionary Algorithms by Shehab Abdulhabib Alzaeemi, Kim Gaik Tay, Audrey Huong, Saratha Sathasivam, Majahar Ali, Majid Khan

    Published 2023
    “…With the use of SRBFNN-2SAT, a training method based on these algorithms has been presented; training was then compared among the algorithms, which were implemented in Microsoft Visual C++, using multiple performance metrics, including Mean Absolute Relative Error (MARE), Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE), Mean Bias Error (MBE), Systematic Error (SD), Schwarz Bayesian Criterion (SBC), and Central Processing Unit (CPU) time. …”
    Get full text
    Article
  18.

    Evolution Performance of Symbolic Radial Basis Function Neural Network by Using Evolutionary Algorithms by Alzaeemi, Shehab Abdulhabib, Kim Gaik Tay, Audrey Huong, Saratha Sathasivam, Majahar Ali, Majid Khan

    Published 2023
    “…With the use of SRBFNN-2SAT, a training method based on these algorithms has been presented; training was then compared among the algorithms, which were implemented in Microsoft Visual C++, using multiple performance metrics, including Mean Absolute Relative Error (MARE), Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE), Mean Bias Error (MBE), Systematic Error (SD), Schwarz Bayesian Criterion (SBC), and Central Processing Unit (CPU) time. …”
    Get full text
    Article
  19.

    Logic Programming In Radial Basis Function Neural Networks by Hamadneh, Nawaf

    Published 2013
    “…I used different types of optimization algorithms to improve the performance of the neural networks. …”
    Get full text
    Thesis
  20.

    Automatic database of robust neural network forecasting / Saadi Ahmad Kamaruddin, Nor Azura Md. Ghani and Norazan Mohamed Ramli by Ahmad Kamaruddin, Saadi, Md. Ghani, Nor Azura, Mohamed Ramli, Norazan

    Published 2014
    “…The backpropagation algorithm is one of the most famous algorithms for training neural networks based on the mean square error (MSE) of ordinary least squares (OLS). …”
    Get full text
    Book Section