Kullback Leibler divergence for image quantitative evaluation

Bibliographic Details
Main Authors: Pheng, H. S., Shamsuddin, S. M., Leng, W. Y., Alwee, R.
Format: Conference or Workshop Item
Published: American Institute of Physics Inc. 2016
Online Access:http://eprints.utm.my/id/eprint/73207/
https://www.scopus.com/inward/record.uri?eid=2-s2.0-84984565338&doi=10.1063%2f1.4954516&partnerID=40&md5=afebb081855e03865fbb993fa456ad16
Description
Summary: Medical imaging has expanded steadily to provide diagnostic information through different types of modalities. Currently, many modalities are available in the medical and surgical fields, such as Computed Tomography (CT) scans, Magnetic Resonance Imaging (MRI), X-rays (plain radiography), Positron Emission Tomography (PET) scans and ultrasonographic diagnostics (USG). These modalities are widely used in clinical diagnosis and in research and education. In terms of image quality, qualitative analysis has typically been used to evaluate the quality of the output images from classification results. Through qualitative analysis, researchers can judge the precision of a detected lesion and hence calculate the detection accuracy over the test cases. However, qualitative analysis is sometimes subjective, and verification by more than one radiologist is needed to confirm the classification results. Therefore, quantitative analysis is also needed so that the results of a classification algorithm can be assessed objectively. In this study, we propose a pixel-based approach using Kullback-Leibler (KL) divergence to assess medical images. Unlike standard statistical analysis, evaluation using KL divergence does not require hypothesis testing or the construction of confidence intervals based on the mean and standard deviation. The proposed KL framework provides a descriptive measure for summarizing the data. First, both the original and the computed images are normalized so that the sum of all intensities equals one. Then, the probability distribution is calculated column by column using the hist function, yielding histograms H_O and H_A; each column is expressed as a data vector h_O = {h_O1, h_O2, h_O3, ..., h_Oi} and h_A = {h_A1, h_A2, h_A3, ..., h_Ai}, respectively. In computing the probability distributions, the hist function bins the elements of each data vector into 10 equally spaced containers and returns the number of elements in each container as a row vector. The results show that the proposed Kullback-Leibler divergence framework is promising for evaluating the final images quantitatively.
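
For concreteness, the following is a minimal sketch in Python/NumPy of the column-wise KL evaluation the abstract describes, not the authors' implementation. The function name column_kl_divergence, the shared bin range per column pair, and the epsilon smoothing against log(0) are our assumptions; the 10 equally spaced bins mirror the MATLAB hist behaviour mentioned in the abstract.

import numpy as np

def column_kl_divergence(original, computed, bins=10, eps=1e-12):
    # Step 1: normalize each image so that all intensities sum to one.
    O = original.astype(float) / original.sum()
    A = computed.astype(float) / computed.sum()

    kl_per_column = []
    for o_col, a_col in zip(O.T, A.T):
        # Step 2: bin each column into `bins` equally spaced containers
        # over a shared range, giving the histograms H_O and H_A.
        lo = min(o_col.min(), a_col.min())
        hi = max(o_col.max(), a_col.max())
        if hi <= lo:  # guard against a constant column (assumption: skip-safe)
            hi = lo + eps
        h_o, _ = np.histogram(o_col, bins=bins, range=(lo, hi))
        h_a, _ = np.histogram(a_col, bins=bins, range=(lo, hi))
        # Turn bin counts into probability distributions; eps avoids log(0).
        p = h_o / h_o.sum() + eps
        q = h_a / h_a.sum() + eps
        # Step 3: KL divergence D(P || Q) = sum_i p_i * log(p_i / q_i).
        kl_per_column.append(np.sum(p * np.log(p / q)))
    # Summarize with the mean divergence over all columns.
    return float(np.mean(kl_per_column))

# Example usage on synthetic data:
rng = np.random.default_rng(0)
img_original = rng.integers(0, 256, size=(64, 64))
img_computed = img_original + rng.integers(0, 8, size=(64, 64))
print(column_kl_divergence(img_original, img_computed))

Under this reading, a divergence of zero indicates identical column-wise distributions, and larger values indicate that the computed image's intensity distribution drifts further from the original's; averaging over columns is one plausible way to summarize, the paper may aggregate differently.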