Paper. The following article is Open access.

CT Guided Diagnosis: Cascaded U-Net for 3D Segmentation of Liver and Tumor


Published under licence by IOP Publishing Ltd
Citation: G K Mourya et al 2021 IOP Conf. Ser.: Mater. Sci. Eng. 1128 012049. DOI 10.1088/1757-899X/1128/1/012049


Abstract

Volumetric estimation of a liver tumor is the first step in identifying critical liver disorders, and the liver-to-tumor volume ratio is a prerequisite measure for selecting the therapeutic procedure. 3D printing and virtual-reality platforms also require a segmented liver mask for pre- and post-treatment analysis. A cascaded U-Net model is proposed for automatic segmentation of the liver and tumor in CT images, using the LiTS CT data set. The images were pre-processed with a windowing technique for contrast enhancement. Two U-Net models, modified at the decoder end relative to the original U-Net, were adapted for liver and tumor segmentation and connected in a cascade: the probability map from the first U-Net is fed, together with the input image, to the second U-Net to segment the liver tumor. Eight volumetric CT subject datasets were used to test the cascaded U-Net, which achieved average Dice coefficients of 0.95 for the liver and 0.69 for the tumor. The accuracy of liver-tumor diagnosis and treatment depends on the precision of the segmentation algorithms. The designed model segmented the liver almost perfectly, while tumor segmentation achieved only limited accuracy; further modification is required for tumor segmentation because of the occurrence of false negatives.
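The pipeline described in the abstract — CT windowing for contrast enhancement, a two-stage cascade in which the first network's liver probability map is stacked with the input image and passed to the second network, and Dice-coefficient evaluation — can be sketched as below. This is a minimal NumPy illustration, not the authors' implementation: the window center/width values are illustrative soft-tissue settings, and `liver_net` / `tumor_net` stand in for the two modified U-Nets, whose architecture is not reproduced here.

```python
import numpy as np

def window_ct(hu, center=60.0, width=200.0):
    """Clip Hounsfield units to a display window and rescale to [0, 1].
    The center/width values here are illustrative abdominal soft-tissue
    settings, not necessarily the paper's exact parameters."""
    lo, hi = center - width / 2.0, center + width / 2.0
    return (np.clip(hu, lo, hi) - lo) / (hi - lo)

def dice(pred, target, eps=1e-7):
    """Dice coefficient between two binary masks (the metric reported
    in the abstract: 0.95 liver, 0.69 tumor)."""
    pred, target = pred.astype(bool), target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    return (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)

def cascaded_segment(ct_volume, liver_net, tumor_net, thresh=0.5):
    """Wire up the cascade: stage 1 predicts a liver probability map,
    which is stacked with the windowed input and fed to stage 2 for
    tumor segmentation. liver_net and tumor_net are hypothetical
    callables standing in for the two modified U-Nets."""
    x = window_ct(ct_volume)
    liver_prob = liver_net(x)                    # stage 1: liver probability map
    stacked = np.stack([x, liver_prob], axis=0)  # input image + probability map
    tumor_prob = tumor_net(stacked)              # stage 2: tumor probability map
    return liver_prob > thresh, tumor_prob > thresh
```

The key design point the abstract describes is that stage 2 sees both the raw (windowed) image and stage 1's output, so the tumor network can restrict its attention to the predicted liver region.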


Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
