Search Results - (( using function search algorithm ) OR ( parameter classification problem algorithm ))

  1.

    Incremental continuous ant colony optimization for tuning support vector machine’s parameters by Alwan, Hiba Basim, Ku-Mahamud, Ku Ruhana

    Published 2013
    “…Support Vector Machines are considered to be excellent pattern classification techniques. Classifying a pattern with high accuracy depends mainly on tuning the Support Vector Machine parameters, which are the generalization error parameter and the kernel function parameter. Tuning these parameters is a complex process, and Ant Colony Optimization can be used to overcome the difficulty. …”
    Get full text
    Article
  2.

    Adaptive differential evolution algorithm with fitness based selection of parameters and mutation strategies by Al-Dabbagh, Rawaa Dawoud Hassan

    Published 2015
    “…The ARDE algorithm makes use of the JADE strategy and the MDE_pBX parameter-adaptive schemes as frameworks. …”
    Get full text
    Thesis
  3.

    Optimizing support vector machine parameters using continuous ant colony optimization by Alwan, Hiba Basim, Ku-Mahamud, Ku Ruhana

    Published 2012
    “…Hence, in applying Ant Colony Optimization to optimize Support Vector Machine parameters, which are continuous, there is a need to discretize the continuous values into discrete values. This discretization process results in the loss of some information and hence affects the classification accuracy and seek time. This study proposes an algorithm to optimize Support Vector Machine parameters using continuous Ant Colony Optimization without the need to discretize the continuous values of the Support Vector Machine parameters. Seven datasets from UCI were used to evaluate the performance of the proposed hybrid algorithm. The proposed algorithm demonstrates credibility in terms of classification accuracy when compared to grid search techniques. Experimental results of the proposed algorithm also show promising performance in terms of computational speed. …”
    Get full text
    Conference or Workshop Item
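The abstract above turns SVM tuning into a continuous optimization over the generalization error parameter C and the kernel parameter gamma. A minimal sketch of ACOR-style continuous ant colony sampling (Gaussian sampling around a ranked archive of solutions) is given below; the objective is a hypothetical smooth surrogate for cross-validation error (a real run would train an SVM there), and the bounds, defaults, and helper names are illustrative assumptions, not the authors' implementation.

```python
import math
import random

def toy_objective(params):
    # Hypothetical stand-in for SVM cross-validation error over (C, gamma);
    # a real run would train and validate an SVM at these settings.
    c, gamma = params
    return (c - 1.0) ** 2 + (gamma - 0.1) ** 2

def aco_r(objective, bounds, n_ants=10, archive_size=10, q=0.1, xi=0.85,
          iterations=100, seed=0):
    """Minimal ACOR sketch: keep an archive of good solutions, pick a guide
    solution with rank-based Gaussian weights, and sample each dimension
    from a Gaussian centred on the guide."""
    rng = random.Random(seed)
    # Initialise the archive with random solutions, sorted best-first.
    archive = sorted(
        ([rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(archive_size)),
        key=objective)
    # Rank-based weights: better-ranked solutions are chosen more often.
    weights = [math.exp(-(r ** 2) / (2 * (q * archive_size) ** 2)) /
               (q * archive_size * math.sqrt(2 * math.pi))
               for r in range(archive_size)]
    for _ in range(iterations):
        new_solutions = []
        for _ in range(n_ants):
            guide = rng.choices(archive, weights=weights)[0]
            sol = []
            for d, (lo, hi) in enumerate(bounds):
                # Std dev = xi * mean distance to the other archive members,
                # so sampling narrows as the archive converges.
                sigma = xi * sum(abs(s[d] - guide[d]) for s in archive) / (archive_size - 1)
                sol.append(min(hi, max(lo, rng.gauss(guide[d], sigma))))
            new_solutions.append(sol)
        # Keep only the best archive_size solutions for the next iteration.
        archive = sorted(archive + new_solutions, key=objective)[:archive_size]
    return archive[0]

best = aco_r(toy_objective, bounds=[(0.01, 100.0), (0.0001, 1.0)])
```

Because every sampled value is continuous, no discretization step is needed, which is the point the abstract contrasts with grid search.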
  4.

    Tree-based contrast subspace mining method by Florence Sia Fui Sze

    Published 2020
    “…Genetic algorithms have been widely used to find global solutions to optimization and search problems. …”
    Get full text
    Thesis
  5.

    Algorithmic design issues in adaptive differential evolution schemes: Review and taxonomy by Al-Dabbagh, Rawaa Dawoud, Neri, Ferrante, Idris, Norisma, Baba, Mohd Sapiyan

    Published 2018
    “…The performance of most metaheuristic algorithms depends on parameters whose settings essentially serve as a key function in determining the quality of the solution and the efficiency of the search. …”
    Get full text
    Article
  6.

    RMIL/AG: A new class of nonlinear conjugate gradient for training back propagation algorithm by Basri, S.M.M., Nawi, N.M., Mamat, M., Hamid, N.A.

    Published 2018
    “…The efficiency of the proposed method is verified by means of simulation on four classification problems. The results show that the computational efficiency of the proposed method was better than the conventional BP algorithm. …”
    Get full text
    Conference or Workshop Item
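The entry above trains back propagation with a nonlinear conjugate gradient. The RMIL coefficient is commonly written as beta_k = g_k · (g_k − g_{k−1}) / ||d_{k−1}||²; the sketch below applies it to a toy quadratic with an exact line search as a stand-in for a network's error surface. The objective, step rule, and non-negativity restart are assumptions for illustration, not the paper's RMIL/AG method.

```python
def grad(x):
    # Gradient of the toy quadratic f(x, y) = (x - 3)^2 + 2*(y + 1)^2,
    # a stand-in for the BP network's error surface.
    return [2.0 * (x[0] - 3.0), 4.0 * (x[1] + 1.0)]

def hess_vec(d):
    # Hessian-vector product for the same quadratic (H = diag(2, 4)),
    # used only for the exact line search on this toy problem.
    return [2.0 * d[0], 4.0 * d[1]]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def cg_rmil(x, iterations=50):
    """Nonlinear CG with an RMIL-style coefficient
    beta_k = g_k . (g_k - g_{k-1}) / ||d_{k-1}||^2."""
    g = grad(x)
    d = [-gi for gi in g]          # first direction: steepest descent
    for _ in range(iterations):
        if dot(g, g) < 1e-18:      # gradient vanished: converged
            break
        # Exact line search for a quadratic: alpha = -g.d / d.H.d
        alpha = -dot(g, d) / dot(d, hess_vec(d))
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        diff = [a - b for a, b in zip(g_new, g)]
        # RMIL-style beta, clamped at zero (restart with steepest descent).
        beta = max(dot(g_new, diff) / max(dot(d, d), 1e-18), 0.0)
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

x_min = cg_rmil([0.0, 0.0])
```

On a network, the gradient would come from back propagation and the line search would be inexact, but the direction update is the same shape.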
  7.
  8.

    The effect of pre-processing techniques and optimal parameters on BPNN for data classification by HUSSEIN, AMEER SALEH

    Published 2015
    “…In this research, a performance analysis based on different activation functions, and on gradient descent and gradient descent with momentum for training the BP algorithm with pre-processing techniques, was carried out. …”
    Get full text
    Thesis
  9.

    Hybrid ACO and SVM algorithm for pattern classification by Alwan, Hiba Basim

    Published 2013
    “…This study presents four algorithms for tuning the SVM parameters and selecting the feature subset, which improved SVM classification accuracy with a smaller feature subset. …”
    Get full text
    Thesis
  10.

    Fuzzy modeling using Bat Algorithm optimization for classification by Noor Amidah, Ahmad Sultan

    Published 2018
    “…Many problems arise in the process of fuzzy modeling when creating the parameters. …”
    Get full text
    Undergraduates Project Papers
  11.

    Integrated ACOR/IACOMV-R-SVM Algorithm by Alwan, Hiba Basim, Ku-Mahamud, Ku Ruhana

    Published 2017
    “…The first algorithm, ACOR-SVM, will tune SVM parameters, while the second IACOMV-R-SVM algorithm will simultaneously tune SVM parameters and select the feature subset. …”
    Get full text
    Article
  12.

    Gravitational search – bat algorithm for solving single and bi-objective of non-linear functions by Abbas, Iraq Tareq

    Published 2018
    “…The second technique is to solve bi-objective functions by using the BOBAT algorithm. The third technique is an integration of BOGSA with BOBAT to produce a BOGSBAT algorithm. …”
    Get full text
    Thesis
  13.

    Feature selection and model selection algorithm using incremental mixed variable ant colony optimization for support vector machine classifier by Alwan, Hiba Basim, Ku-Mahamud, Ku Ruhana

    Published 2013
    “…In order to enhance SVM performance, these problems must be solved simultaneously, because error produced in the feature subset selection phase will affect the values of the SVM parameters and result in low classification accuracy. Most approaches to solving the SVM model selection problem discretize the continuous values of the SVM parameters, which influences performance. Incremental Mixed Variable Ant Colony Optimization (IACOMV) has the ability to solve the SVM model selection problem without discretising the continuous values, and to solve the two problems simultaneously. This paper presents an algorithm that integrates IACOMV and SVM. Ten datasets from UCI were used to evaluate the performance of the proposed algorithm. Results showed that the proposed algorithm can enhance classification accuracy with a small number of features. …”
    Get full text
    Article
  14.

    Inversed Control Parameter in Whale Optimization Algorithm and Grey Wolf Optimizer for Wrapper-Based Feature Selection: A Comparative Study by Li Yu Yab, Wahid, Noorhaniza, A Hamid, Rahayu

    Published 2023
    “…Whale Optimization Algorithm (WOA) and Grey Wolf Optimizer (GWO) are well-performing metaheuristic algorithms used by various researchers in solving feature selection problems. …”
    Get full text
    Article
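The study above inverts the control parameter a shared by WOA and GWO, which normally decays linearly from 2 to 0 over the run so the search shifts from exploration to exploitation. The snippet does not give the exact inversed schedule, so the sketch below assumes a plain linear increase from 0 to 2 as the inverted counterpart; this is an illustrative reconstruction, not the paper's formula.

```python
def standard_a(t, t_max):
    # Standard WOA/GWO control parameter: decreases linearly from 2 to 0,
    # so later iterations favour exploitation around good solutions.
    return 2.0 - 2.0 * t / t_max

def inversed_a(t, t_max):
    # Assumed inversed schedule: increases linearly from 0 to 2,
    # pushing exploration toward the end of the run instead.
    return 2.0 * t / t_max

t_max = 100
std = [standard_a(t, t_max) for t in range(t_max + 1)]
inv = [inversed_a(t, t_max) for t in range(t_max + 1)]
```

In both algorithms, a scales the coefficient that decides whether an agent moves toward or away from the current best, which is why its schedule controls the exploration/exploitation balance.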
  16.

    Mixed-variable ant colony optimisation algorithm for feature subset selection and tuning support vector machine parameter by Alwan, Hiba Basim, Ku-Mahamud, Ku Ruhana

    Published 2017
    “…The ACOMV-SVM algorithm is able to simultaneously tune SVM parameters and select the feature subset. …”
    Get full text
    Article
  17.
  18.

    Classification of breast cancer disease using bagging fuzzy-id3 algorithm based on fuzzydbd by Nur Farahaina, Idris

    Published 2022
    “…One of the most powerful machine learning methods to handle classification problems is the decision tree. There are various decision tree algorithms, but the most commonly used are Iterative Dichotomiser 3 (ID3), CART, and C4.5. …”
    Get full text
    Thesis
  19.
  20.

    Solving SVM model selection problem using ACOR and IACOR by Alwan, Hiba Basim, Ku-Mahamud, Ku Ruhana

    Published 2013
    “…In applying ACO to optimize SVM parameters, which are continuous variables, there is a need to discretize the continuous values into discrete values. This discretization process would result in the loss of some information and hence affect the classification accuracy. In order to enhance SVM performance and solve the discretization problem, this study proposes two algorithms to optimize SVM parameters using Continuous ACO (ACOR) and Incremental Continuous Ant Colony Optimization (IACOR), without the need to discretize continuous values for the SVM parameters. Eight datasets from UCI were used to evaluate the credibility of the proposed integrated algorithms in terms of classification accuracy and size of the feature subset. Promising results were obtained when compared to the grid search technique, GA with feature chromosome-SVM, PSO-SVM, and GA-SVM. …”
    Get full text
    Article
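The abstract above benchmarks the continuous-ACO tuners against a grid search baseline. A minimal sketch of that baseline over an exponentially spaced (C, gamma) grid follows; the surrogate error function is a hypothetical stand-in (an actual grid search would run SVM cross-validation at each grid point), and the grid ranges are the conventional powers-of-two choice, not values taken from the paper.

```python
import itertools
import math

def surrogate_error(c, gamma):
    # Hypothetical stand-in for SVM cross-validation error over (C, gamma);
    # a real grid search would train and validate an SVM at each point.
    return (math.log10(c) - 1.0) ** 2 + (math.log10(gamma) + 2.0) ** 2

# Exponentially spaced grid, the usual shape for SVM parameter search:
# only these fixed points are ever evaluated, which is the information
# loss the continuous-ACO papers above argue against.
c_grid = [2.0 ** k for k in range(-5, 16, 2)]       # 2^-5 ... 2^15
gamma_grid = [2.0 ** k for k in range(-15, 4, 2)]   # 2^-15 ... 2^3

best_c, best_gamma = min(itertools.product(c_grid, gamma_grid),
                         key=lambda cg: surrogate_error(*cg))
```

The cost is one evaluation per grid cell, so refining the grid squares the work; a continuous optimizer instead concentrates its evaluations near promising regions.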