Search Results - (( evolution classification problems algorithm ) OR ( variable optimization svm algorithm ))
1
Hybrid ACO and SVM algorithm for pattern classification
Published 2013 “…The first two algorithms, ACOR-SVM and IACOR-SVM, tune the SVM parameters, while the second two algorithms, ACOMV-R-SVM and IACOMV-R-SVM, tune the SVM parameters and select the feature subset simultaneously. …”
Get full text
Thesis -
2
Formulating new enhanced pattern classification algorithms based on ACO-SVM
Published 2013 “…This paper presents two algorithms that integrate new Ant Colony Optimization (ACO) variants, Incremental Continuous Ant Colony Optimization (IACOR) and Incremental Mixed Variable Ant Colony Optimization (IACOMV), with Support Vector Machine (SVM) to enhance the performance of SVM. The first algorithm aims to solve the SVM model selection problem. …”
Get full text
Article -
3
Feature selection and model selection algorithm using incremental mixed variable ant colony optimization for support vector machine classifier
Published 2013 “…In order to enhance SVM performance, these problems must be solved simultaneously, because error produced in the feature subset selection phase affects the values of the SVM parameters and results in low classification accuracy. Most approaches to the SVM model selection problem discretize the continuous values of the SVM parameters, which degrades performance. Incremental Mixed Variable Ant Colony Optimization (IACOMV) can solve the SVM model selection problem without discretizing the continuous values and can solve the two problems simultaneously. This paper presents an algorithm that integrates IACOMV and SVM. Ten datasets from UCI were used to evaluate the performance of the proposed algorithm. Results showed that the proposed algorithm can enhance classification accuracy with a small number of features. …”
Get full text
Article -
4
Mixed variable ant colony optimization technique for feature subset selection and model selection
Published 2013 “…This paper presents the integration of Mixed Variable Ant Colony Optimization and Support Vector Machine (SVM) to enhance the performance of SVM by simultaneously tuning its parameters and selecting a small number of features. The processes of selecting a suitable feature subset and optimizing the SVM parameters must occur simultaneously, because these processes affect each other and in turn the SVM performance, otherwise producing unacceptable classification accuracy. Five datasets from UCI were used to evaluate the proposed algorithm. Results showed that the proposed algorithm can enhance classification accuracy with a small feature subset. …”
Get full text
Conference or Workshop Item -
5
Solving SVM model selection problem using ACOR and IACOR
Published 2013 “…In applying ACO to optimize SVM parameters, which are continuous variables, the continuous values must be discretized. This discretization results in the loss of some information and hence affects classification accuracy. In order to enhance SVM performance and avoid the discretization problem, this study proposes two algorithms that optimize SVM parameters using Continuous ACO (ACOR) and Incremental Continuous Ant Colony Optimization (IACOR) without the need to discretize the continuous SVM parameter values. Eight datasets from UCI were used to evaluate the credibility of the proposed integrated algorithms in terms of classification accuracy and feature subset size. Promising results were obtained when compared to the grid search technique, GA with feature chromosome-SVM, PSO-SVM, and GA-SVM. …”
Get full text
Article -
6
Intelligent classification algorithms in enhancing the performance of support vector machine
Published 2019 “…This paper presents two intelligent algorithms that hybridize ant colony optimization (ACO) and SVM to tune SVM parameters and select a feature subset without having to discretize the continuous values. …”
Get full text
Article -
7
Integrated ACOR/IACOMV-R-SVM Algorithm
Published 2017 “…The first algorithm, ACOR-SVM, tunes the SVM parameters, while the second, IACOMV-R-SVM, simultaneously tunes the SVM parameters and selects the feature subset. …”
Get full text
Article -
8
Solving Support Vector Machine Model Selection Problem Using Continuous Ant Colony Optimization
Published 2013 “…Ant Colony Optimization has been used to solve the Support Vector Machine model selection problem. Ant Colony Optimization originally deals with discrete optimization problems. In applying Ant Colony Optimization to optimize Support Vector Machine parameters, which are continuous variables, the continuous values must be discretized. This discretization results in the loss of some information and hence affects classification accuracy and search time. This study proposes an algorithm that can optimize Support Vector Machine parameters using Continuous Ant Colony Optimization without the need to discretize the continuous parameter values. Eight datasets from UCI were used to evaluate the credibility of the proposed hybrid algorithm in terms of classification accuracy and feature subset size. Promising results were obtained when compared to the grid search technique, GA with feature chromosome-SVM, PSO-SVM, and GA-SVM. …”
Get full text
Article -
9
Differential evolution for neural networks learning enhancement
Published 2008 “…Evolutionary computation is the name given to a collection of algorithms based on the evolution of a population toward a solution of a certain problem. …”
Get full text
Thesis -
10
A New Quadratic Binary Harris Hawk Optimization For Feature Selection
Published 2019 “…In this study, twenty-two datasets collected from the UCI machine learning repository are used to validate the performance of the proposed algorithms. A comparative study is conducted to compare the effectiveness of QBHHO with other feature selection algorithms such as binary differential evolution (BDE), genetic algorithm (GA), binary multi-verse optimizer (BMVO), binary flower pollination algorithm (BFPA), and binary salp swarm algorithm (BSSA). …”
Get full text
Article -
11
Algorithmic design issues in adaptive differential evolution schemes: Review and taxonomy
Published 2018 “…A trend that has emerged recently is to make the algorithm parameters automatically adapt to different problems during optimization, thereby liberating the user from the tedious and time-consuming task of manual setting. …”
Get full text
Article -
12
Feature selection optimization using hybrid relief-f with self-adaptive differential evolution
Published 2017 “…Hence, feature selection is embedded to select the most meaningful features based on their rank. Differential evolution (DE) is one of the evolutionary algorithms that are widely used in various classification domains. …”
Get full text
Article -
13
Artificial fish swarm optimization for multilayer network learning in classification problems
Published 2012 “…Artificial Fish Swarm Algorithm (AFSA), one of the NIC methods, is widely used for optimizing the global search of ANN. In this study, we applied the AFSA method to improve Multilayer Perceptron (MLP) learning for promising accuracy in various classification problems. The AFSA parameters (prey, swarm, and follow) are implemented on the MLP network to improve accuracy on various classification datasets from the UCI machine learning repository. …”
Get full text
Article -
14
Artificial Fish Swarm Optimization for Multilayer Network Learning in Classification Problems
Published 2012 “…In this study, we applied the AFSA method to improve Multilayer Perceptron (MLP) learning for promising accuracy in various classification problems. The AFSA parameters (prey, swarm, and follow) are implemented on the MLP network to improve accuracy on various classification datasets from the UCI machine learning repository. …”
Get full text
Article -
15
Developing flood mapping procedure through optimized machine learning techniques. Case study: Prahova river basin, Romania
Published 2025 “…We used 158 flood locations as dependent variables in the training of four hybrid models: Deep Learning Neural Network-Statistical Index (DLNN-SI), Particle Swarm Optimization-Deep Learning Neural Network-Statistical Index (PSO-DLNN-SI), Support Vector Machine-Statistical Index (SVM-SI), and Particle Swarm Optimization-Support Vector Machine-Statistical Index (PSO-SVM-SI). …”
Article -
16
Improvement on rooftop classification of worldview-3 imagery using object-based image analysis
Published 2019 “…The accuracy of each algorithm was evaluated using the LibSVM, Bayes network, and AdaBoost classifiers. …”
Get full text
Thesis -
17
Design Of Feature Selection Methods For Hand Movement Classification Based On Electromyography Signals
Published 2020 “…Therefore, this thesis aims to solve the feature selection problem in EMG signals classification and improve the classification performance of EMG pattern recognition system. …”
Get full text
Thesis -
18
Adaptive differential evolution algorithm with fitness based selection of parameters and mutation strategies / Rawaa Dawoud Hassan Al-Dabbagh
Published 2015 “…Differential evolution (DE) is a simple yet powerful evolutionary algorithm (EA). …”
Get full text
Thesis -
20
Artificial neural network learning enhancement using Artificial Fish Swarm Algorithm
Published 2011 “…Artificial Neural Network (ANN) is an information processing system with a large quantity of highly interconnected neurons or processing elements working in parallel to solve problems. Recently, the evolutionary computation technique Artificial Fish Swarm Algorithm (AFSA) has been chosen to optimize the global search of ANN. In the optimization process, each Artificial Fish (AF) represents a neural network whose output is a fitness value. The AFSA is used in this study to analyze its effectiveness in enhancing Multilayer Perceptron (MLP) learning compared to Particle Swarm Optimization (PSO) and Differential Evolution (DE) for classification problems. The comparative results demonstrate that AFSA is efficient, effective and stable in MLP learning. …”
Get full text
Conference or Workshop Item
