A comparative performance of different convolutional neural network activation functions on image classification
Main Authors:
Format: Article
Language: English
Published: IIUM Press, 2024
Subjects:
Online Access:
http://irep.iium.edu.my/116734/7/116734_A%20comparative%20performance.pdf
http://irep.iium.edu.my/116734/
https://journals.iium.edu.my/kict/index.php/IJPCC/article/view/490/295
Abstract: Activation functions are crucial in optimising Convolutional Neural Networks (CNNs) for image classification. While CNNs excel at capturing spatial hierarchies in images, the activation functions substantially impact their effectiveness. Traditional functions, such as ReLU and Sigmoid, have drawbacks, including the "dying ReLU" problem and vanishing gradients, which can inhibit learning and efficacy. This study comprehensively analyses various activation functions across different CNN architectures to determine their impact on performance. The findings suggest that Swish and Leaky ReLU outperform other functions, with Swish particularly promising in more complex networks such as ResNet. This emphasises the relevance of activation function selection in improving CNN performance and implies that investigating alternative functions can lead to more accurate and efficient models for image classification tasks.
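For reference, the activation functions compared in the abstract have simple closed forms. The NumPy sketch below is an illustrative aside rather than code from the paper; the slope `alpha` for Leaky ReLU and the `beta` parameter for Swish are common default choices, not values reported by the authors.

```python
import numpy as np

def sigmoid(x):
    # Sigmoid: squashes inputs to (0, 1); gradients shrink for large |x| (vanishing gradients)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # ReLU: zero for negative inputs, which can leave units permanently inactive ("dying ReLU")
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: small slope alpha on negative inputs keeps a nonzero gradient
    return np.where(x > 0, x, alpha * x)

def swish(x, beta=1.0):
    # Swish: x * sigmoid(beta * x), a smooth, non-monotonic alternative to ReLU
    return x * sigmoid(beta * x)

if __name__ == "__main__":
    x = np.linspace(-5.0, 5.0, 5)
    for name, f in [("sigmoid", sigmoid), ("relu", relu),
                    ("leaky_relu", leaky_relu), ("swish", swish)]:
        print(name, np.round(f(x), 3))
```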