Search Results - (( using function method algorithm ) OR ( set iteration method algorithm ))

  1.

    Improved stochastic gradient descent algorithm with mean-gradient adaptive stepsize for solving large-scale optimization problems by Zulkifli, Munierah, Abd Rahmin, Nor Aliza, Wah, June Leong

    Published 2023
    “…It is an iterative algorithm with descent properties that reduces computational cost by using derivatives of random data points. …”
    Get full text
    Article
  2.

    Augmentation of basic-line-search and quick-simplex-method algorithms to enhance linear programming computational performance by Nor Azlan, Nor Asmaa Alyaa

    Published 2021
    “…The LP’s application needs to be further computed with a technique, and the Simplex algorithm is the one most commonly used. The Simplex algorithm has three stages of computation, namely initialization, iterative calculation and termination. …”
    Get full text
    Thesis
  3.

    Evolution Performance of Symbolic Radial Basis Function Neural Network by Using Evolutionary Algorithms by Alzaeemi, Shehab Abdulhabib, Tay, Kim Gaik, Huong, Audrey, Sathasivam, Saratha, Majahar Ali, Majid Khan

    Published 2023
    “…It has been confirmed that the EP algorithm is quite effective in training and obtaining the best output weight, accompanied by the slightest iteration error, which minimizes the objective function of SRBFNN-2SAT.…”
    Get full text
    Article
  4.

    Evolution Performance of Symbolic Radial Basis Function Neural Network by Using Evolutionary Algorithms by Shehab Abdulhabib Alzaeemi, Kim Gaik Tay, Audrey Huong, Saratha Sathasivam, Majid Khan bin Majahar Ali

    Published 2023
    “…It has been confirmed that the EP algorithm is quite effective in training and obtaining the best output weight, accompanied by the slightest iteration error, which minimizes the objective function of SRBFNN-2SAT.…”
    Get full text
    Article
  5.

    Evolution Performance of Symbolic Radial Basis Function Neural Network by Using Evolutionary Algorithms by Shehab Abdulhabib Alzaeemi, Kim Gaik Tay, Audrey Huong, Saratha Sathasivam, Majid Khan bin Majahar Ali

    Published 2023
    “…It has been confirmed that the EP algorithm is quite effective in training and obtaining the best output weight, accompanied by the slightest iteration error, which minimizes the objective function of SRBFNN-2SAT.…”
    Get full text
    Article
  6.

    Evolution Performance of Symbolic Radial Basis Function Neural Network by Using Evolutionary Algorithms by Shehab Abdulhabib Alzaeemi, Kim Gaik Tay, Audrey Huong, Saratha Sathasivam, Majid Khan bin Majahar Ali

    Published 2023
    “…It has been confirmed that the EP algorithm is quite effective in training and obtaining the best output weight, accompanied by the slightest iteration error, which minimizes the objective function of SRBFNN-2SAT.…”
    Get full text
    Article
  7.

    Evolution Performance of Symbolic Radial Basis Function Neural Network by Using Evolutionary Algorithms by Shehab Abdulhabib Alzaeemi, Kim Gaik Tay, Audrey Huong, Saratha Sathasivam, Majid Khan bin Majahar Ali

    Published 2023
    “…It has been confirmed that the EP algorithm is quite effective in training and obtaining the best output weight, accompanied by the slightest iteration error, which minimizes the objective function of SRBFNN-2SAT.…”
    Get full text
    Article
  8.

    Evolution Performance of Symbolic Radial Basis Function Neural Network by Using Evolutionary Algorithms by Alzaeemi, Shehab Abdulhabib, Kim Gaik Tay, Audrey Huong, Saratha Sathasivam, Majid Khan bin Majahar Ali

    Published 2023
    “…It has been confirmed that the EP algorithm is quite effective in training and obtaining the best output weight, accompanied by the slightest iteration error, which minimizes the objective function of SRBFNN-2SAT.…”
    Get full text
    Article
  9.

    Logistic regression methods for classification of imbalanced data sets by Santi Puteri Rahayu

    Published 2012
    “…These results can be seen as further explanation on the success of Truncated Newton method in TR-KLR and TR Iteratively Re-weighted Least Square (TR-IRLS) algorithm respectively, because of the equivalence of iterative method used by these algorithms. …”
    Get full text
    Thesis
  10.

    Quasi-Newton type method via weak secant equations for unconstrained optimization by Lim, Keat Hee

    Published 2021
    “…Overall, numerical results prove that these proposed methods are superior in terms of number of iterations and function evaluations. …”
    Get full text
    Thesis
  11.
  12.

    Optimization Of PID Controller Using Grey Wolf Optimizer And Dragonfly Algorithm by Nik Mohamed Hazli, Nik Muhammad Aiman

    Published 2018
    “…However, to fully utilize the algorithm, the parameters of the algorithm need to be set properly. …”
    Get full text
    Monograph
  13.
  14.

    Modelling of multi-robot system for search and rescue by Poy, Yi Ler

    Published 2023
    “…Moreover, to cope with dynamic environments, a combination of global and local path planning methods is introduced. The PSO algorithm functions as a global path planner, determining the complete path for each robot, whereas a sensor-based obstacle avoidance algorithm serves as a local planner to avoid collision with dynamic obstacles during navigation. …”
    Get full text
    Final Year Project / Dissertation / Thesis
  15.
  16.

    A Rough-Apriori Technique in Mining Linguistic Association Rules by Choo, Yun Huoy, Abu Bakar, Azuraliza, Hamdan, Abdul Razak

    Published 2008
    “…Five UCI datasets were tested in the 10-fold cross validation experiment settings. The frequent itemsets discovery in the Apriori algorithm was constrained to five iterations, compared to the maximum number of iterations. …”
    Get full text
    Book Chapter
  17.

    Comparative analysis of line search methods in the Steepest Descent algorithm for unconstrained optimization problems / Ahmad Zikri Shukeri, Puteri Qurratu Ain Megat Sulzamzamendi... by Shukeri, Ahmad Zikri, Megat Sulzamzamendi, Puteri Qurratu Ain, Ibrahim, Suhaida

    Published 2024
    “…The result from this study is choosing the FMRI algorithm using exact line search to get a faster convergence rate, which means that the algorithm achieves a high level of accuracy in fewer iterations compared to other algorithms and inexact line search.…”
    Get full text
    Student Project
  18.

    The effect of adaptive parameters on the performance of back propagation by Abdul Hamid, Norhamreeza

    Published 2012
    “…The Back Propagation algorithm or its variation on Multilayered Feedforward Networks is widely used in many applications. …”
    Get full text
    Thesis
  19.

    A modified technique in RFID networking planning and optimization by Nawawi, Azli

    Published 2015
    “…In this research, the PSO algorithm was used in the optimization process as it was considered a very useful, efficient and well-known algorithm. …”
    Get full text
    Thesis
  20.

    Memoryless modified symmetric rank-one method for large-scale unconstrained optimization by Modarres, Farzin, Abu Hassan, Malik, Leong, Wah June

    Published 2009
    “…Computational results, for a test set consisting of 73 unconstrained optimization problems, show that the proposed algorithm is very encouraging. …”
    Get full text
    Article