Progressive expansion: Cost-efficient medical image analysis model with reversed once-for-all network training paradigm



Bibliographic Details
Main Authors: Lim, Shin Wei, Chan, Chee Seng, Faizal, Erma Rahayu Mohd, Ewe, Kok Howg
Format: Article
Published: Elsevier 2024
Online Access: http://eprints.um.edu.my/45457/
https://doi.org/10.1016/j.neucom.2024.127512
Summary: Low computational cost artificial intelligence (AI) models are vital in promoting the accessibility of real-time medical services in underdeveloped areas. The recent Once-For-All (OFA) network (without retraining) can directly produce a set of sub-network designs with the Progressive Shrinking (PS) algorithm; however, this method suffers from apparent training-resource and time inefficiencies. In this paper, we propose a new OFA training algorithm, namely Progressive Expansion (ProX), to train the medical image analysis model. It is a reversed paradigm to PS: we train the OFA network from the minimum configuration and gradually expand the training to support larger configurations. Empirical results showed that the proposed paradigm can reduce training time by up to 68%, while still producing sub-networks with similar or better accuracy than those trained with OFA-PS on the ROCT (classification), BRATS, and Hippocampus (3D segmentation) public medical datasets. The code implementation for this paper is accessible at: https://github.com/shin-wl/ProX-OFA.
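The summary describes the core idea only at a high level: start training from the minimum sub-network configuration and progressively admit larger ones, reversing Progressive Shrinking. The sketch below illustrates that schedule in plain Python; the function names, the phase schedule, and the configuration fields are illustrative assumptions, not the authors' implementation (which is available at the linked repository).

```python
import random

# Illustrative sketch (not the authors' code) of a Progressive Expansion-style
# schedule: the pool of sub-network configurations eligible for sampling starts
# at the minimum and grows each phase, the reverse of Progressive Shrinking,
# which starts from the full network and shrinks.

def prox_schedule(configs, num_phases):
    """Yield, per training phase, the pool of configurations to sample from.

    `configs` is assumed to be sorted from the minimum to the maximum
    sub-network (e.g. by depth, width, or kernel size). Each phase admits a
    larger prefix of that list.
    """
    step = max(1, len(configs) // num_phases)
    for phase in range(1, num_phases + 1):
        yield configs[:min(phase * step, len(configs))]

def train_with_prox(configs, num_phases, steps_per_phase, seed=0):
    """Simulate the sampling side of the training loop.

    In real training, each sampled configuration would activate a sub-network
    inside the shared OFA supernet and update its weights; here we only record
    which configurations would have been sampled at each step.
    """
    rng = random.Random(seed)
    sampled = []
    for pool in prox_schedule(configs, num_phases):
        for _ in range(steps_per_phase):
            sampled.append(rng.choice(pool))
    return sampled

# Hypothetical configuration space, ordered from smallest to largest.
configs = [{"depth": d, "width": w} for d in (2, 3, 4) for w in (0.5, 1.0)]
history = train_with_prox(configs, num_phases=3, steps_per_phase=4)
```

Under this sketch, early phases only ever touch the smallest configurations, which is where the claimed training-time savings would come from: small sub-networks are cheap to update, and the expensive large configurations enter only in later phases.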