Search Results - (( parallel optimization method algorithm ) OR ( parameter training learning algorithm ))

  1.

    Probabilistic ensemble fuzzy ARTMAP optimization using hierarchical parallel genetic algorithms by Loo, C.K., Liew, W.S., Seera, M., Lim, Einly

    Published 2015
    “…The issues addressed are the sequence of training data for supervised learning and optimum parameter tuning for parameters such as baseline vigilance. …”
    Get full text
    Article
  2.

    Voting algorithms for large scale fault-tolerant systems by Karimi, Abbas

    Published 2011
    “…To solve this problem and gain the benefits of this algorithm, we employed a parallel algorithm technique and, by using an optimal number of processors, proposed optimal algorithms known as Parallel Average Voting and Parallel Weighted Average Voting, both of which have optimal time complexity and lower calculation cost. …”
    Get full text
    Thesis
  3.

    Enhancement processing time and accuracy training via significant parameters in the batch BP algorithm by Fatma Susilawati, Mohamad, Mumtazimah, Mohamad, Sarhan, AlDuais

    Published 2020
    “…The learning rate and momentum factor are the most significant parameters for increasing the efficiency of the BBP algorithm. …”
    Get full text
    Article
  4.

    Topology-aware hypergraph based approach to optimize scheduling of parallel applications onto distributed parallel architectures by Koohi, Sina Zangbari

    Published 2020
    “…Any optimization algorithm is suitable for only a specific domain of optimization problems. …”
    Get full text
    Thesis
  5.

    Investigation on the dynamic of computation of semi autonomous evolutionary computation for syntactic optimization of a set of programming codes by Mohammad Sigit Arifianto, Tze, Kenneth Kin Teo, Liau, Chung Fan, Liawas Barukang, Zaturrawiah Ali Omar

    Published 2007
    “…The Genetic Algorithm, as one of the Evolutionary Computation methods, improves the execution of parallel programming codes by optimizing the number of processors and the distribution of data. …”
    Get full text
    Research Report
  6.

    Improved black-winged kite algorithm and finite element analysis for robot parallel gripper design by Haohao, Ma, As’arry, Azizan, Yanwei, Feng, Lulu, Cheng, Delgoshaei, Aidin, Ismail, Mohd Idris Shah, Ramli, Hafiz Rashidi

    Published 2024
    “…This paper presents a comprehensive study on the design optimization of a robotic gripper, focusing on both the gripper modeling and the optimization of its parallel mechanism structure. …”
    Get full text
    Article
  7.

    Parameter Estimation of Lorenz Attractor: A Combined Deep Neural Network and K-Means Clustering Approach by Nurnajmin Qasrina Ann, ., Pebrianti, Dwi, Mohamad Fadhil, Abas, Bayuaji, Luhur

    Published 2022
    “…Therefore, it is crucial to assess the parameters of chaotic systems. To solve the issue of parameter estimation for a chaotic system, deep learning is utilized. …”
    Get full text
    Conference or Workshop Item
  8.

    Sequential and parallel multiple tabu search algorithm for multiobjective urban transit scheduling problems by Uvaraja, Vikneswary

    Published 2018
    “…Additionally, the MTS algorithm is also implemented in parallel computing to produce parallel MTS for generating comparable solutions in shorter computational times. …”
    Get full text
    Thesis
  9.

    Analysis of evolutionary computing performance via MapReduce parallel processing architecture / Ahmad Firdaus Ahmad Fadzil by Ahmad, Ahmad Firdaus

    Published 2014
    “…Examples of EC such as the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO) are prevalent due to their efficiency and effectiveness. …”
    Get full text
    Thesis
  10.
  11.

    Distributed parallel deep learning with a hybrid backpropagation-particle swarm optimization for community detection in large complex networks by Shing, Chiang Ta, Mohammed Al-Andoli, Mohammed Nasser, Wooi, Ping Cheah

    Published 2022
    “…Next, the method is integrated with two optimization algorithms: (1) backpropagation (BP), which optimizes deep learning locally within each local chunk of the CN; (2) particle swarm optimization (PSO), which is used to improve the BP optimization involving all CN chunks. …”
    Get full text
    Article
  12.
  13.

    Accelerated mine blast algorithm for ANFIS training for solving classification problems by Mohd Salleh, Mohd Najib, Hussain, Kashif

    Published 2016
    “…ANFIS accuracy depends on the parameters it is trained with. Given the drawbacks of gradient-based learning of ANFIS using gradient descent and least-squares methods in the two-pass learning algorithm, many have trained ANFIS using metaheuristic algorithms. …”
    Get full text
    Article
  14.

    PID Parameters Improvement for AGC in Three Parallel-Connected Power Systems by Mushtaq, Najeeb, Ramdan, Razali, K. G., Mohammed, Hamdan, Daniyal, Ali, M. Humada

    Published 2016
    “…The AGC loop is used to minimize the frequency deviation and control the power exchange in order to maintain them at their scheduled values under step-load disturbances. The optimal parameters of the PID scheme optimized by the proposed MS algorithm are compared with those obtained by the GA, and the proposed method has proven more efficient. …”
    Get full text
    Conference or Workshop Item
  15.

    Design and analysis of high performance matrix filling for DNA sequence alignment accelerator using ASIC design flow: article / Nurzaima Mahmod by Mahmod, Nurzaima

    “…The scope of this paper is to optimize DNA sequence alignment in the matrix filling module by implementing a parallel method of the Smith-Waterman algorithm. …”
    Get full text
    Article
  16.

    A review of training methods of ANFIS for applications in business and economic by Mohd Salleh, Mohd Najib, Hussain, Kashif

    Published 2016
    “…Therefore, many researchers have trained ANFIS parameters using metaheuristic algorithms; however, very few have considered optimizing the ANFIS rule-base. …”
    Get full text
    Article
  17.

    A review of training methods of ANFIS for applications in business and economics by Mohd Salleh, Mohd Najib, Hussain, Kashif

    Published 2016
    “…Therefore, many researchers have trained ANFIS parameters using metaheuristic algorithms; however, very few have considered optimizing the ANFIS rule-base. …”
    Get full text
    Article
  18.

    Proposed methodology for optimizing the training parameters of a multilayer feed-forward artificial neural network using a genetic algorithm by Abdalla, Osman Ahmed

    Published 2011
    “…This research focuses on the use of a binary-encoded genetic algorithm (GA) to implement efficient search strategies for the optimal architecture and training parameters of a multilayer feed-forward ANN. …”
    Get full text
    Thesis
  19.

    Design and analysis of high performance matrix filling for DNA sequence alignment accelerator using ASIC design flow / Nurzaima Mahmod by Mahmod, Nurzaima

    Published 2010
    “…The scope of this paper is to optimize DNA sequence alignment in the matrix filling module by implementing a parallel method of the Smith-Waterman algorithm. …”
    Get full text
    Thesis
  20.

    Application Of Genetic Algorithms For Robust Parameter Optimization by Belavendram, N.

    Published 2010
    “…Genetic algorithms (GA) are fairly recent in this respect but afford a novel method of parameter optimization. …”
    Get full text
    Article