Please use this identifier to cite or link to this item:
http://hdl.handle.net/20.500.11861/6906
Title: | MU-Net: Multi-path upsampling convolution network for medical image segmentation |
Authors: | Chen, Jia; He, Zhiqiang; Zhu, Dayong; Hui, Bei; Li, Rita Yi Man; Yue, Xiao-Guang |
Issue Date: | 2022 |
Source: | CMES - Computer Modeling in Engineering and Sciences, 2022, vol. 130(3), pp. 73-95. |
Journal: | CMES - Computer Modeling in Engineering and Sciences |
Abstract: | Medical image segmentation plays an important role in clinical diagnosis, quantitative analysis, and the treatment process. Since 2015, U-Net-based approaches have been widely used for medical image segmentation. The purpose of the U-Net expansive path is to map low-resolution encoder feature maps to full-input-resolution feature maps. However, the consecutive deconvolution and convolution operations in the expansive path lead to the loss of some high-level information, and retaining more high-level information makes the segmentation more accurate. In this paper, we propose MU-Net, a novel multi-path upsampling convolution network that retains more high-level information. MU-Net mainly consists of three parts: a contracting path, skip connections, and multi-expansive paths. The proposed MU-Net architecture is evaluated on three different medical imaging datasets. Our experiments show that MU-Net improves the segmentation performance of U-Net-based methods on different datasets, while computational efficiency is significantly improved by reducing the number of parameters by more than half. |
Type: | Peer Reviewed Journal Article |
URI: | http://hdl.handle.net/20.500.11861/6906 |
ISSN: | 1526-1492; 1526-1506 |
DOI: | 10.32604/cmes.2022.018565 |
Appears in Collections: | Economics and Finance - Publication |
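
The abstract above describes a decoder built from multiple parallel expansive (upsampling) paths rather than U-Net's single one. The snippet below is a minimal PyTorch sketch of that general idea only, not the authors' MU-Net code: the module name, channel widths, and the choice of two hypothetical paths (a transposed-convolution path and a bilinear-upsampling path) fused by concatenation are assumptions made for illustration.

```python
# Minimal sketch of a "multi-path upsampling" decoder stage in PyTorch.
# NOT the authors' MU-Net implementation; names, channel sizes, and the
# concatenation-based fusion are illustrative assumptions only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiPathUpBlock(nn.Module):
    """Upsample an encoder feature map along two parallel paths and fuse them."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        # Path 1: learned upsampling via transposed convolution.
        self.deconv_path = nn.ConvTranspose2d(in_ch, out_ch, kernel_size=2, stride=2)
        # Path 2: parameter-free bilinear upsampling followed by a 1x1 convolution.
        self.bilinear_conv = nn.Conv2d(in_ch, out_ch, kernel_size=1)
        # Fuse the two upsampled maps: concatenate, then reduce channels.
        self.fuse = nn.Sequential(
            nn.Conv2d(2 * out_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        up1 = self.deconv_path(x)                                     # (N, out_ch, 2H, 2W)
        up2 = self.bilinear_conv(
            F.interpolate(x, scale_factor=2, mode="bilinear", align_corners=False)
        )                                                             # (N, out_ch, 2H, 2W)
        return self.fuse(torch.cat([up1, up2], dim=1))


if __name__ == "__main__":
    block = MultiPathUpBlock(in_ch=256, out_ch=128)
    feat = torch.randn(1, 256, 32, 32)   # a low-resolution encoder feature map
    print(block(feat).shape)             # torch.Size([1, 128, 64, 64])
```

In this toy version, each path preserves a different kind of information (learned filters vs. smooth interpolation) before fusion; the paper's reported parameter reduction would come from how its own paths and channel widths are arranged, which is not reproduced here.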