
DCAN: Deep Contour-Aware Networks for Accurate Gland Segmentation


Abstract

The morphology of glands has been used routinely by pathologists to assess the malignancy degree of adenocarcinomas. Accurate segmentation of glands from histology images is a crucial step to obtain reliable morphological statistics for quantitative diagnosis. In this paper, we proposed an efficient deep contour-aware network (DCAN) to solve this challenging problem under a unified multi-task learning framework. In the proposed network, multi-level contextual features from the hierarchical architecture are explored with auxiliary supervision for accurate gland segmentation. When incorporated with multi-task regularization during the training, the discriminative capability of intermediate features can be further improved. Moreover, our network can not only output accurate probability maps of glands, but also depict clear contours simultaneously for separating clustered objects, which further boosts the gland segmentation performance. This unified framework can be efficient when applied to large-scale histopathological data without resorting to additional steps to generate contours based on low-level cues for post-separating. Our method won the 2015 MICCAI Gland Segmentation Challenge out of 13 competitive teams, surpassing all the other methods by a significant margin.
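The abstract describes a shared hierarchical feature extractor with two task-specific outputs, a gland probability map and a contour map, which are combined to split touching glands. The following is a minimal sketch of that dual-branch, multi-task idea; it is not the authors' implementation, and the layer sizes, the tiny encoder, and the thresholded fusion rule are all assumptions made for illustration.

```python
# Sketch of a contour-aware multi-task segmentation head (assumed architecture).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyDCANSketch(nn.Module):
    def __init__(self, in_ch=3, feat=32):
        super().__init__()
        # shared downsampling path (stand-in for the hierarchical encoder)
        self.enc1 = nn.Sequential(nn.Conv2d(in_ch, feat, 3, padding=1), nn.ReLU(),
                                  nn.MaxPool2d(2))
        self.enc2 = nn.Sequential(nn.Conv2d(feat, feat * 2, 3, padding=1), nn.ReLU(),
                                  nn.MaxPool2d(2))
        # two task-specific heads: gland objects and gland contours
        self.gland_head = nn.Conv2d(feat * 2, 1, 1)
        self.contour_head = nn.Conv2d(feat * 2, 1, 1)

    def forward(self, x):
        h, w = x.shape[-2:]
        f = self.enc2(self.enc1(x))
        # upsample each head's logits back to the input resolution
        gland = F.interpolate(self.gland_head(f), size=(h, w),
                              mode="bilinear", align_corners=False)
        contour = F.interpolate(self.contour_head(f), size=(h, w),
                                mode="bilinear", align_corners=False)
        return torch.sigmoid(gland), torch.sigmoid(contour)

def fuse(gland_prob, contour_prob, t_obj=0.5, t_cnt=0.5):
    # One possible fusion rule (thresholds assumed): keep pixels that look like
    # gland but not contour, so clustered glands are split along predicted boundaries.
    return (gland_prob > t_obj) & (contour_prob < t_cnt)
```

In this sketch the contour branch acts as the separator that the paper obtains jointly during training, instead of a post-processing step based on low-level cues.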

