Search Results - (( variable optimization svm algorithm ) OR ( evolution classification learning algorithm ))

  1.

    Hybrid ACO and SVM algorithm for pattern classification by Alwan, Hiba Basim

    Published 2013
    “…The first two algorithms, ACOR-SVM and IACOR-SVM, tune the SVM parameters, while the last two, ACOMV-R-SVM and IACOMV-R-SVM, tune the SVM parameters and select the feature subset simultaneously. …”
    Get full text
    Thesis
  2.

    Formulating new enhanced pattern classification algorithms based on ACO-SVM by Alwan, Hiba Basim, Ku-Mahamud, Ku Ruhana

    Published 2013
    “…This paper presents two algorithms that integrate new Ant Colony Optimization (ACO) variants, namely Incremental Continuous Ant Colony Optimization (IACOR) and Incremental Mixed Variable Ant Colony Optimization (IACOMV), with Support Vector Machine (SVM) to enhance the performance of SVM. The first algorithm aims to solve the SVM model selection problem. …”
    Get full text
    Article
  3.

    Feature selection and model selection algorithm using incremental mixed variable ant colony optimization for support vector machine classifier by Alwan, Hiba Basim, Ku-Mahamud, Ku Ruhana

    Published 2013
    “…In order to enhance SVM performance, these problems must be solved simultaneously, because errors produced in the feature subset selection phase will affect the values of the SVM parameters and result in low classification accuracy. Most approaches to solving the SVM model selection problem discretize the continuous values of the SVM parameters, which influences performance. Incremental Mixed Variable Ant Colony Optimization (IACOMV) has the ability to solve the SVM model selection problem without discretising the continuous values and to solve the two problems simultaneously. This paper presents an algorithm that integrates IACOMV and SVM. Ten datasets from UCI were used to evaluate the performance of the proposed algorithm. Results showed that the proposed algorithm can enhance classification accuracy with a small number of features.…”
    Get full text
    Article
  4.

    Mixed variable ant colony optimization technique for feature subset selection and model selection by Alwan, Hiba Basim, Ku-Mahamud, Ku Ruhana

    Published 2013
    “…This paper presents the integration of Mixed Variable Ant Colony Optimization and Support Vector Machine (SVM) to enhance the performance of SVM through simultaneously tuning its parameters and selecting a small number of features. The process of selecting a suitable feature subset and optimizing the SVM parameters must occur simultaneously, because these processes affect each other and, in turn, the SVM performance; treating them separately thus produces unacceptable classification accuracy. Five datasets from UCI were used to evaluate the proposed algorithm. Results showed that the proposed algorithm can enhance classification accuracy with a small feature subset.…”
    Get full text
    Conference or Workshop Item
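
The simultaneous tuning argued for in the abstract above, continuous SVM parameters together with a binary feature mask, can be illustrated by scoring mixed candidates against cross-validated accuracy. This is a minimal random-search sketch, not the authors' IACOMV; scikit-learn's SVC and the iris data are assumed stand-ins for the papers' experimental setup:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)

def evaluate(log_c, log_gamma, mask):
    """Cross-validated accuracy of an RBF SVM restricted to the masked features."""
    if not mask.any():
        return 0.0  # an empty feature subset cannot be classified
    clf = SVC(C=10.0 ** log_c, gamma=10.0 ** log_gamma)
    return cross_val_score(clf, X[:, mask], y, cv=5).mean()

best_score, best = 0.0, None
for _ in range(100):
    # One mixed-variable candidate: two continuous parameters plus a binary mask.
    log_c = rng.uniform(-2, 3)        # C in [1e-2, 1e3]
    log_gamma = rng.uniform(-4, 1)    # gamma in [1e-4, 1e1]
    mask = rng.random(X.shape[1]) < 0.5
    score = evaluate(log_c, log_gamma, mask)
    if score > best_score:
        best_score, best = score, (log_c, log_gamma, mask)

print(f"best CV accuracy {best_score:.3f} using {int(best[2].sum())} of {X.shape[1]} features")
```

A mixed-variable ACO would replace the independent uniform sampling with sampling guided by an archive of good candidates, but the fitness function, cross-validated accuracy of the masked, parameterized SVM, is the same.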
  5.

    Solving SVM model selection problem using ACOR and IACOR by Alwan, Hiba Basim, Ku-Mahamud, Ku Ruhana

    Published 2013
    “…In applying ACO to optimize the SVM parameters, which are continuous variables, the continuous values must be discretized. This discretization results in a loss of information and hence affects the classification accuracy. In order to enhance SVM performance and avoid the discretization problem, this study proposes two algorithms that optimize the SVM parameters using Continuous ACO (ACOR) and Incremental Continuous Ant Colony Optimization (IACOR) without the need to discretize the continuous values of the SVM parameters. Eight datasets from UCI were used to evaluate the credibility of the proposed integrated algorithms in terms of classification accuracy and size of feature subset. Promising results were obtained when compared to the grid search technique, GA with feature chromosome-SVM, PSO-SVM, and GA-SVM. …”
    Get full text
    Article
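
The approach summarized in the record above, searching the continuous space of SVM parameters directly instead of discretizing it, can be sketched with any generic continuous optimizer. As stated assumptions rather than the paper's method, SciPy's differential evolution stands in for ACOR/IACOR and scikit-learn's wine data for the UCI benchmarks:

```python
from scipy.optimize import differential_evolution
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_wine(return_X_y=True)

def objective(params):
    """Negative cross-validated accuracy; C and gamma are searched on a log10 scale."""
    log_c, log_gamma = params
    clf = make_pipeline(StandardScaler(), SVC(C=10.0 ** log_c, gamma=10.0 ** log_gamma))
    return -cross_val_score(clf, X, y, cv=3).mean()

# Bounds: C in [1e-2, 1e3], gamma in [1e-4, 1e1]; polish=False skips the
# gradient-based refinement step, which is of little use on a noisy CV objective.
result = differential_evolution(objective, bounds=[(-2.0, 3.0), (-4.0, 1.0)],
                                maxiter=10, popsize=8, seed=0, polish=False)
log_c, log_gamma = result.x
print(f"C={10.0 ** log_c:.3g} gamma={10.0 ** log_gamma:.3g} accuracy={-result.fun:.3f}")
```

Continuous ACO variants such as ACOR replace the mutation and crossover steps with sampling from a pheromone-weighted archive of solutions, but the interface is identical: a real-valued parameter vector scored by cross-validated accuracy, with no discretization step.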
  6.

    Intelligent classification algorithms in enhancing the performance of support vector machine by Alwan, Hiba Basim, Ku-Mahamud, Ku Ruhana

    Published 2019
    “…This paper presents two intelligent algorithms that hybridized between ant colony optimization (ACO) and SVM for tuning SVM parameters and selecting feature subset without having to discretize the continuous values. …”
    Get full text
    Article
  7.

    Integrated ACOR/IACOMV-R-SVM Algorithm by Alwan, Hiba Basim, Ku-Mahamud, Ku Ruhana

    Published 2017
    “…The first algorithm, ACOR-SVM, will tune the SVM parameters, while the second, IACOMV-R-SVM, will simultaneously tune the SVM parameters and select the feature subset. …”
    Get full text
    Article
  8.

    Differential evolution for neural networks learning enhancement by Ismail Wdaa, Abdul Sttar

    Published 2008
    “…Three programs were developed, Differential Evolution Neural Network (DENN), Genetic Algorithm Neural Network (GANN) and Particle Swarm Optimization with Neural Network (PSONN), to probe the impact of these methods on ANN learning using various datasets. …”
    Get full text
    Thesis
  9.

    Solving Support Vector Machine Model Selection Problem Using Continuous Ant Colony Optimization by Alwan, Hiba Basim, Ku-Mahamud, Ku Ruhana

    Published 2013
    “…Ant Colony Optimization has been used to solve the Support Vector Machine model selection problem. Ant Colony Optimization originally deals with discrete optimization problems. In applying Ant Colony Optimization to optimize the Support Vector Machine parameters, which are continuous variables, the continuous values must be discretized. This discretization results in a loss of information and hence affects the classification accuracy and seeking time. This study proposes an algorithm that can optimize the Support Vector Machine parameters using Continuous Ant Colony Optimization without the need to discretize the continuous values of the Support Vector Machine parameters. Eight datasets from UCI were used to evaluate the credibility of the proposed hybrid algorithm in terms of classification accuracy and size of feature subset. Promising results were obtained when compared to the grid search technique, GA with feature chromosome-SVM, PSO-SVM, and GA-SVM.…”
    Get full text
    Article
  10.

    Multi-Objective Hybrid Algorithm For The Classification Of Imbalanced Datasets by Saeed, Sana

    Published 2019
    “…The proposed algorithm is grounded in two famous metaheuristic algorithms: cuckoo search (CS) and covariance matrix adaptation evolution strategy (CMA-ES). …”
    Get full text
    Thesis
  11.

    Email spam classification based on deep learning methods: A review by Tusher, Ekramul Haque, Mohd Arfian, Ismail, Anis Farihan, Mat Raffei

    Published 2025
    “…Email spam is a significant issue confronting both email consumers and providers. The evolution of spam filtering has progressed considerably, transitioning from basic rule-based filters to more sophisticated machine learning algorithms. …”
    Get full text
    Article
  12.

    Genetic ensemble biased ARTMAP method of ECG-Based emotion classification by Loo, C.K., Liew, W.S., Sayeed, M.S.

    Published 2012
    “…Individual emotional states are highly variable and are subject to evolution from personal experiences. For this reason, the above system is designed to be able to perform learning and classification in real-time to account for inter-individual and intra-individual emotional drift over time. …”
    Get full text
    Conference or Workshop Item
  13.

    Artificial fish swarm optimization for multilayer network learning in classification problems by Hasan, Shafaatunnur, Tan, Swee Quo, Shamsuddin, Siti Mariyam

    Published 2012
    “…Nature-Inspired Computing (NIC) has always been a promising tool to enhance neural network learning. Artificial Fish Swarm Algorithm (AFSA), as one of the NIC methods, is widely used for optimizing the global searching of ANN. In this study, we applied the AFSA method to improve Multilayer Perceptron (MLP) learning for promising accuracy in various classification problems. The parameters of AFSA: AFSA prey, AFSA swarm and AFSA follow are implemented on the MLP network for improving the accuracy of various classification datasets from UCI machine learning. …”
    Get full text
    Article
  14.

    Artificial Fish Swarm Optimization for Multilayer Network Learning in Classification Problems by Hasan, Shafaatunnur, Tan, Swee Quo, Shamsuddin, Siti Mariyam, Sallehuddin, Roselina

    Published 2012
    “…In this study, we applied the AFSA method to improve the Multilayer Perceptron (MLP) learning for promising accuracy in various classification problems. …”
    Get full text
    Article
  15.

    Artificial neural network learning enhancement using Artificial Fish Swarm Algorithm by Hasan, Shafaatunnur, Tan, Swee Quo, Shamsuddin, Siti Mariyam, Sallehuddin, Roselina

    Published 2011
    “…Artificial Neural Network (ANN) is an information processing system with a large quantity of highly interconnected neurons or processing elements working in parallel to solve problems. Recently, an evolutionary computation technique, Artificial Fish Swarm Algorithm (AFSA), was chosen to optimize the global searching of ANN. In the optimization process, each Artificial Fish (AF) represents a neural network whose output is a fitness value. The AFSA is used in this study to analyze its effectiveness in enhancing Multilayer Perceptron (MLP) learning compared to Particle Swarm Optimization (PSO) and Differential Evolution (DE) for classification problems. The comparative results demonstrate the efficiency, effectiveness and stability of AFSA in MLP learning.…”
    Get full text
    Conference or Workshop Item
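
The setup described in the abstract above, where each artificial fish encodes a complete MLP weight vector and the swarm searches weight space by classification fitness, can be sketched with a deliberately simplified elitist swarm update. The actual AFSA prey, swarm and follow behaviours are not reproduced; NumPy and the iris data are assumptions standing in for the study's setup:

```python
import numpy as np
from sklearn.datasets import load_iris

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)
X = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize inputs

n_in, n_hidden, n_out = X.shape[1], 8, 3

def unpack(w):
    """Split a flat weight vector into the two MLP layers."""
    i = n_in * n_hidden
    W1, b1 = w[:i].reshape(n_in, n_hidden), w[i:i + n_hidden]
    j = i + n_hidden
    W2 = w[j:j + n_hidden * n_out].reshape(n_hidden, n_out)
    b2 = w[j + n_hidden * n_out:]
    return W1, b1, W2, b2

def accuracy(w):
    """Training accuracy of the MLP encoded by the flat vector w."""
    W1, b1, W2, b2 = unpack(w)
    hidden = np.tanh(X @ W1 + b1)
    pred = (hidden @ W2 + b2).argmax(axis=1)
    return (pred == y).mean()

dim = n_in * n_hidden + n_hidden + n_hidden * n_out + n_out
# Population of candidate weight vectors; each "fish" drifts toward the best one.
pop = rng.normal(scale=0.5, size=(20, dim))
for _ in range(200):
    fitness = np.array([accuracy(w) for w in pop])
    best = pop[fitness.argmax()]
    # Swarm step: move toward the best individual plus random exploration.
    pop = pop + 0.3 * (best - pop) + rng.normal(scale=0.1, size=pop.shape)
    pop[0] = best  # elitism keeps the best solution unperturbed
print(f"training accuracy {accuracy(best):.3f}")
```

The encoding is the key idea shared with the paper: fitness is computed by running the network forward, so any population-based optimizer, AFSA included, can train the MLP without gradients.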
  16.

    Developing flood mapping procedure through optimized machine learning techniques. Case study: Prahova river basin, Romania by Diaconu D.C., Costache R., Towfiqul Islam A.R.M., Pandey M., Pal S.C., Mishra A.P., Pande C.B.

    Published 2025
    “…We used 158 flood locations as dependent variables in the training of four hybrid models: Deep Learning Neural Network-Statistical Index (DLNN-SI), Particle Swarm Optimization-Deep Learning Neural Network-Statistical Index (PSO-DLNN-SI), Support Vector Machine-Statistical Index (SVM-SI), and Particle Swarm Optimization-Support Vector Machine-Statistical Index (PSO-SVM-SI). …”
    Article
  17.

    A New Quadratic Binary Harris Hawk Optimization For Feature Selection by Abdullah, Abdul Rahim, Too, Jing Wei, Mohd Saad, Norhashimah

    Published 2019
    “…A comparative study is conducted to compare the effectiveness of QBHHO with other feature selection algorithms such as binary differential evolution (BDE), genetic algorithm (GA), binary multi-verse optimizer (BMVO), binary flower pollination algorithm (BFPA), and binary salp swarm algorithm (BSSA). …”
    Get full text
    Article
  18.

    Improvement on rooftop classification of worldview-3 imagery using object-based image analysis by Norman, Masayu

    Published 2019
    “…The accuracy of each algorithm was evaluated using LibSVM, Bayes network, and Adaboost classifier. …”
    Get full text
    Thesis
  19.

    Feature selection optimization using hybrid relief-f with self-adaptive differential evolution by Zainudin, Muhammad Noorazlan Shah, Sulaiman, Md. Nasir, Mustapha, Norwati, Perumal, Thinagaran, Ahmad Nazri, Azree Shahrel, Mohamed, Raihani, Abd Manaf, Syaifulnizam

    Published 2017
    “…Hence, feature selection is embedded to select the most meaningful features based on their rank. Differential evolution (DE) is one of the evolutionary algorithms that are widely used in various classification domains. …”
    Get full text
    Article
  20.

    Improved whale optimization algorithm for feature selection in Arabic sentiment analysis by Tubishat, Mohammad, Abushariah, Mohammad A.M., Idris, Norisma, Aljarah, Ibrahim

    Published 2019
    “…To verify our proposed approach, four Arabic benchmark datasets for sentiment analysis are used since there are only a few studies in sentiment analysis conducted for Arabic language as compared to English. The proposed algorithm is compared with six well-known optimization algorithms and two deep learning algorithms. …”
    Get full text
    Article