Improvement of contrast-distorted image quality assessment based on convolutional neural networks
Main Authors: , , ,
Format: Article
Language: English
Published: 2020
Summary: Many image quality assessment algorithms (IQAs) have been developed during the past decade. However, most of them are designed for images distorted by compression, noise and blurring. Very few IQAs are designed specifically for Contrast-Distorted Images (CDI), e.g. the Reduced-reference Image Quality Metric for Contrast-changed images (RIQMC) and NR-IQA for Contrast-Distorted Images (NR-IQA-CDI). The existing NR-IQA-CDI relies on handcrafted features, which require a considerable level of skill, domain expertise and effort to design well. Recently, machine learning has advanced greatly with the introduction of deep learning through Convolutional Neural Networks (CNNs), which enable machines to learn good features from raw images automatically, without human intervention. It is therefore tempting to explore ways to transform the existing NR-IQA-CDI from handcrafted features to machine-crafted features learned by CNNs. The results show that NR-IQA-CDI based on a non-pre-trained CNN (NR-IQA-CDI-NonPreCNN) significantly outperforms methods based on handcrafted features. In addition to delivering the best performance, NR-IQA-CDI-NonPreCNN requires zero human intervention in feature design, making it the most attractive solution for NR-IQA-CDI. Copyright © 2019 Institute of Advanced Engineering and Science. All rights reserved.
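The pipeline the summary describes (a CNN that maps a raw image to a scalar quality score, with no handcrafted features) can be sketched minimally as follows. This is an illustrative NumPy sketch, not the paper's actual architecture or weights: the kernel count, kernel size and the conv → ReLU → global-average-pooling → linear-head layout are assumptions chosen for brevity.

```python
import numpy as np

def conv2d(img, kernels):
    """Valid 2-D convolution of an (H, W) image with (K, kh, kw) kernels."""
    K, kh, kw = kernels.shape
    H, W = img.shape
    out = np.empty((K, H - kh + 1, W - kw + 1))
    for k in range(K):
        for i in range(H - kh + 1):
            for j in range(W - kw + 1):
                out[k, i, j] = np.sum(img[i:i + kh, j:j + kw] * kernels[k])
    return out

def predict_quality(img, kernels, w, b):
    """Conv -> ReLU -> global average pooling -> linear quality score."""
    feat = np.maximum(conv2d(img, kernels), 0.0)   # learned feature maps
    pooled = feat.mean(axis=(1, 2))                # one value per kernel
    return float(pooled @ w + b)                   # scalar quality score

# Toy forward pass on a random grayscale patch (weights untrained here;
# in the paper they would be learned end-to-end from quality labels).
rng = np.random.default_rng(0)
img = rng.random((16, 16))
kernels = rng.standard_normal((4, 3, 3))
w = rng.standard_normal(4)
score = predict_quality(img, kernels, w, 0.0)
```

In a real NR-IQA-CDI setting the kernels and head weights would be trained by regressing predicted scores against subjective quality ratings of contrast-distorted images; the sketch only shows the feature-learning structure that replaces handcrafted features.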