A new DCT-PCM method for license plate number detection in drone images

Bibliographic Details
Main Authors: Mokayed, Hamam, Shivakumara, Palaiahnakote, Woon, Hon Hock, Kankanhalli, Mohan, Lu, Tong, Pal, Umapada
Format: Article
Published: Elsevier 2021
Subjects:
Online Access:http://eprints.um.edu.my/28202/
Description
Summary: License plate number detection in drone images is a complex problem because the images are generally captured at oblique angles and pose several challenges such as perspective distortion, non-uniform illumination, degradation, blur, occlusion, and loss of visibility. Unlike most existing methods, which focus on images captured from an orthogonal (head-on) direction, the proposed work focuses on drone text images. Inspired by the Phase Congruency Model (PCM), which is invariant to non-uniform illumination, contrast variations, geometric transformations and, to some extent, distortion, we explore the combination of DCT and PCM (DCT-PCM) for detecting license plate number text in drone images. Motivated by the strong discriminative power of deep learning models, the proposed method exploits fully connected neural networks to eliminate false positives and achieve better detection results. Furthermore, the proposed work constructs a working model that fits a real environment. To evaluate the proposed method, we use our own dataset captured by drones and the benchmark license plate dataset Medialab for experimentation. We also demonstrate the effectiveness of the proposed method on benchmark natural scene text detection datasets, namely SVT, MSRA-TD500, ICDAR 2017 MLT and Total-Text. (c) 2021 Elsevier B.V. All rights reserved.
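To make the idea of fusing DCT energy with a phase-based cue concrete, the following is a minimal conceptual sketch, not the authors' DCT-PCM pipeline: the block size, the fusion weight `alpha`, and the crude phase-only stand-in for a full phase congruency model are all illustrative assumptions, and the false-positive-removal network described in the abstract is omitted.

```python
# Hedged sketch: block-wise DCT energy fused with a rough phase-based cue to
# highlight text-like regions. This is an illustrative approximation only and
# does not reproduce the paper's actual DCT-PCM method.
import numpy as np
from scipy.fft import dctn


def dct_energy_map(gray, block=8):
    """High-frequency DCT energy per block; text regions tend to score high."""
    h, w = gray.shape
    h, w = h - h % block, w - w % block
    energy = np.zeros((h // block, w // block))
    for i in range(0, h, block):
        for j in range(0, w, block):
            coeffs = dctn(gray[i:i + block, j:j + block], norm='ortho')
            coeffs[0, 0] = 0.0  # drop the DC term, keep AC (texture) energy
            energy[i // block, j // block] = np.sum(coeffs ** 2)
    return energy / (energy.max() + 1e-8)


def phase_only_map(gray):
    """Very rough phase-based cue: a phase-only reconstruction of the image,
    which emphasizes edges/strokes. It stands in for a full phase congruency model."""
    spectrum = np.fft.fft2(gray - gray.mean())
    recon = np.fft.ifft2(spectrum / (np.abs(spectrum) + 1e-8))
    mag = np.abs(recon)
    return mag / (mag.max() + 1e-8)


def candidate_text_map(gray, block=8, alpha=0.5):
    """Fuse the two cues into a candidate text map; alpha is an assumed weight."""
    dct_map = np.kron(dct_energy_map(gray, block), np.ones((block, block)))
    pc_map = phase_only_map(gray)[:dct_map.shape[0], :dct_map.shape[1]]
    return alpha * dct_map + (1 - alpha) * pc_map
```

In a full system, such a map would only provide candidate regions; the abstract indicates that a fully connected network is then used to reject false positives before final license plate text detection.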