Search Results - ((developing learner optimization algorithm) OR (java applications learning algorithm))

  1.

    A conceptual multi-agent framework using ant colony optimization and fuzzy algorithms for learning style detection by Basheer G.S., Ahmad M.S., Tang A.Y.C.

    Published 2023
    “…The multi-agent system applies ant colony optimization and fuzzy logic search algorithms as tools for detecting learning styles. …”
    Conference Paper
  2.

    Meta-Heuristic Algorithms for Learning Path Recommender at MOOC by Son, N.T., Jaafar, J., Aziz, I.A., Anh, B.N.

    Published 2021
    “…We have developed metaheuristic algorithms, including the Genetic Algorithm (GA) and the Ant Colony Optimization Algorithm (ACO), to solve the proposed model. …”
    Get full text
    Article
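The snippet above mentions a Genetic Algorithm for learning-path recommendation. As an illustration only (the paper's actual encoding, objective, and parameters are not shown in the snippet), a minimal permutation-based GA can order hypothetical learning units so that consecutive difficulty jumps stay small:

```python
import random

# Toy GA for ordering learning units. The units and their difficulty
# scores below are invented for illustration, not from the paper.
DIFFICULTY = {"A": 1, "B": 4, "C": 2, "D": 8, "E": 3}

def path_cost(path):
    """Sum of absolute difficulty jumps between consecutive units."""
    return sum(abs(DIFFICULTY[a] - DIFFICULTY[b]) for a, b in zip(path, path[1:]))

def crossover(p1, p2):
    """Order crossover (OX): keep a slice of p1, fill the rest from p2."""
    i, j = sorted(random.sample(range(len(p1)), 2))
    child = p1[i:j]
    child += [u for u in p2 if u not in child]
    return child

def evolve(units, pop_size=30, generations=100, seed=0):
    random.seed(seed)
    pop = [random.sample(units, len(units)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=path_cost)
        parents = pop[: pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            c = crossover(*random.sample(parents, 2))
            if random.random() < 0.2:             # swap mutation
                a, b = random.sample(range(len(c)), 2)
                c[a], c[b] = c[b], c[a]
            children.append(c)
        pop = parents + children
    return min(pop, key=path_cost)

best = evolve(list(DIFFICULTY))
print(best, path_cost(best))
```

The same permutation encoding carries over to ACO, where pheromone trails on unit-to-unit transitions replace the crossover/mutation operators.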
  4.

    An ensemble deep learning classifier stacked with fuzzy ARTMAP for malware detection by Tan, Shing Chiang, Al-Andoli, Mohammed Nasser, Lim, Kok Swee, Goh, Pey Yun, Lim, Chee Peng

    Published 2023
    “…The stacked ensemble method uses several heterogeneous deep neural networks as the base learners. During the training and optimization process, these base learners adopt a hybrid BP and Particle Swarm Optimization algorithm to combine both local and global optimization capabilities for identifying optimal features and improving the classification performance. …”
    Get full text
    Article
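The abstract above describes a hybrid BP and Particle Swarm Optimization scheme for training the base learners. A minimal PSO sketch of the global-search half, minimizing a toy sphere function as a stand-in for a network's loss surface (dimensions, coefficients, and bounds are illustrative, not from the paper):

```python
import random

def sphere(x):
    """Toy objective: sum of squares, minimized at the origin."""
    return sum(v * v for v in x)

def pso(dim=4, swarm=20, iters=200, seed=0):
    random.seed(seed)
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]                   # per-particle best
    gbest = min(pbest, key=sphere)[:]             # swarm-wide best
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (0.7 * vel[i][d]                      # inertia
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])  # cognitive
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))    # social
                pos[i][d] += vel[i][d]
            if sphere(pos[i]) < sphere(pbest[i]):
                pbest[i] = pos[i][:]
                if sphere(pbest[i]) < sphere(gbest):
                    gbest = pbest[i][:]
    return gbest

print(sphere(pso()))
```

In the hybrid scheme the abstract describes, PSO's global exploration would be interleaved with backpropagation's gradient-based local refinement of the same weight vector.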
  5.

    New model combination meta-learner to improve accuracy prediction P2P lending with stacking ensemble learning by Muslim, Much Aziz, Nikmah, Tiara Lailatul, Pertiwi, Dwika Ananda Agustina, Subhan, Subhan, Jumanto, Jumanto, Dasril, Yosza, Iswanto, Iswanto

    Published 2023
    “…The SMOTE method is used to balance the data, the feature selection LightGBM and stacking ensemble learning (LGBFS-StackingXGBoost) to optimize machine learning accuracy. A new model of stacking ensemble learning by combining three base-learner algorithms namely KNN, SVM and Random Forest into the XGBoost meta-learner algorithm. …”
    Get full text
    Article
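The stacking design in the abstract above (KNN, SVM, and Random Forest base learners feeding an XGBoost meta-learner) maps directly onto scikit-learn's `StackingClassifier`. A minimal sketch on synthetic data, with `GradientBoostingClassifier` as a dependency-free stand-in for XGBoost; the SMOTE balancing and LightGBM feature selection steps from the paper are omitted:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import (GradientBoostingClassifier,
                              RandomForestClassifier, StackingClassifier)
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for the P2P lending data.
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

stack = StackingClassifier(
    estimators=[
        ("knn", make_pipeline(StandardScaler(), KNeighborsClassifier())),
        ("svm", make_pipeline(StandardScaler(), SVC())),
        ("rf", RandomForestClassifier(random_state=0)),
    ],
    # The paper uses XGBoost as the meta-learner; GradientBoostingClassifier
    # is substituted here to keep the sketch self-contained.
    final_estimator=GradientBoostingClassifier(random_state=0),
    cv=5,  # out-of-fold base-learner predictions train the meta-learner
)
stack.fit(X_tr, y_tr)
print(round(stack.score(X_te, y_te), 3))
```

The `cv` parameter is what makes this stacking rather than simple blending: the meta-learner only ever sees predictions the base learners made on data they were not fitted on.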
  14.

    An Educational Tool Aimed at Learning Metaheuristics by Kader, Md. Abdul, Jamaluddin, Jamal A., Zamli, Kamal Z.

    Published 2020
    “…In this paper, we introduce an education tool for learning metaheuristic algorithms that allows displaying the convergence speed of the corresponding metaheuristic upon setting/changing the dependable parameters. …”
    Get full text
    Conference or Workshop Item
  15.

    A Feature Ranking Algorithm in Pragmatic Quality Factor Model for Software Quality Assessment by Ruzita, Ahmad

    Published 2013
    “…The methodology used consists of theoretical study, design of formal framework on intelligent software quality, identification of Feature Ranking Technique (FRT), construction and evaluation of FRA algorithm. The assessment of quality attributes has been improved using FRA algorithm enriched with a formula to calculate the priority of attributes and followed by learning adaptation through Java Library for Multi Label Learning (MULAN) application. …”
    Get full text
    Thesis
  16.

    Adoption of machine learning algorithm for analysing supporters and non supporters feedback on political posts / Ogunfolajin Maruff Tunde by Ogunfolajin, Maruff Tunde

    Published 2022
    “…This thesis is based on the application of sentiment classification algorithm to tweet data with the goal of classifying messages based on the polarity of sentiment towards a particular topic (or subject matter). …”
    Get full text
    Thesis
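The thesis abstract above describes classifying messages by sentiment polarity toward a topic. A minimal sketch of such a polarity classifier, using a TF-IDF bag-of-words with logistic regression; the example texts and labels below are invented for illustration, not data from the study:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented training messages: 1 = supporter, 0 = non-supporter.
texts = [
    "great policy, full support", "we stand with this post",
    "love the new proposal", "excellent leadership on this issue",
    "terrible idea, do not support", "this post is a disaster",
    "worst proposal ever", "strongly against this policy",
]
labels = [1, 1, 1, 1, 0, 0, 0, 0]

# TF-IDF turns each message into a weighted word-frequency vector;
# logistic regression then learns a polarity decision boundary.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)
print(clf.predict(["i support this great idea", "terrible, against it"]))
```

A real tweet pipeline would add preprocessing (tokenizing hashtags and mentions, handling negation) and far more labeled data before the results mean anything.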
  18.

    AI powered asthma prediction towards treatment formulation: an android app approach by Murad, Saydul Akbar, Adhikary, Apurba, Md Muzahid, Abu Jafar, Sarker, Md Murad Hossain, Khan, Md. Ashikur Rahman, Hossain, Md. Bipul, Bairagi, Anupam Kumar, Masud, Mehedi, Kowsher, Md

    Published 2022
    “…TensorFlow is utilized to integrate machine learning with an Android application. We accomplished asthma therapy using an Android application developed in Java and running on the Android Studio platform.…”
    Get full text
    Article
  20.