awesome-semantic-segmentation-pytorch
This project aims at providing a concise, easy-to-use, modifiable reference implementation for semantic segmentation models using PyTorch.
```bash
# semantic-segmentation-pytorch dependencies
pip install ninja tqdm

# follow PyTorch installation in https://pytorch.org/get-started/locally/
conda install pytorch torchvision -c pytorch

# install PyTorch Segmentation
git clone https://github.com/Tramac/awesome-semantic-segmentation-pytorch.git

# the following will install the lib with symbolic links, so that you can modify
# the files if you want and won't need to re-build it
cd awesome-semantic-segmentation-pytorch/core/nn
python setup.py build develop
```
Single GPU training
```bash
# for example, train fcn32_vgg16_pascal_voc:
python train.py --model fcn32s --backbone vgg16 --dataset pascal_voc --lr 0.0001 --epochs 50
```
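For orientation, one optimization step in a training script like `train.py` boils down to per-pixel cross-entropy between the model's logits and the label mask. The sketch below illustrates this with a hypothetical 1x1-conv stand-in for a real model such as `fcn32s`; the shapes and loss are the part that carries over.

```python
import torch
from torch import nn

# Hypothetical stand-in model: a real run would build fcn32s with a vgg16
# backbone, but any module mapping (N, 3, H, W) -> (N, C, H, W) fits here.
model = nn.Conv2d(3, 21, kernel_size=1)            # 21 PASCAL VOC classes
optimizer = torch.optim.SGD(model.parameters(), lr=1e-4)

image = torch.rand(4, 3, 32, 32)                   # batch of RGB crops
mask = torch.randint(0, 21, (4, 32, 32))           # per-pixel class labels

logits = model(image)                              # (4, 21, 32, 32)
loss = nn.functional.cross_entropy(logits, mask)   # 2D cross-entropy
loss.backward()
optimizer.step()
optimizer.zero_grad()
```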
Multi-GPU training
```bash
# for example, train fcn32_vgg16_pascal_voc with 4 GPUs:
export NGPUS=4
python -m torch.distributed.launch --nproc_per_node=$NGPUS train.py --model fcn32s --backbone vgg16 --dataset pascal_voc --lr 0.0001 --epochs 50
```
Single GPU evaluating
```bash
# for example, evaluate fcn32_vgg16_pascal_voc
python eval.py --model fcn32s --backbone vgg16 --dataset pascal_voc
```
Multi-GPU evaluating
```bash
# for example, evaluate fcn32_vgg16_pascal_voc with 4 GPUs:
export NGPUS=4
python -m torch.distributed.launch --nproc_per_node=$NGPUS eval.py --model fcn32s --backbone vgg16 --dataset pascal_voc
```
```bash
cd ./scripts
python demo.py --model fcn32s_vgg16_voc --input-pic ./datasets/test.jpg
```
```
.{SEG_ROOT}
├── scripts
│   ├── demo.py
│   ├── eval.py
│   └── train.py
```
See DETAILS for supported models & backbones.
```
.{SEG_ROOT}
├── core
│   ├── models
│   │   ├── bisenet.py
│   │   ├── danet.py
│   │   ├── deeplabv3.py
│   │   ├── deeplabv3+.py
│   │   ├── denseaspp.py
│   │   ├── dunet.py
│   │   ├── encnet.py
│   │   ├── fcn.py
│   │   ├── pspnet.py
│   │   ├── icnet.py
│   │   ├── enet.py
│   │   ├── ocnet.py
│   │   ├── ccnet.py
│   │   ├── psanet.py
│   │   ├── cgnet.py
│   │   ├── espnet.py
│   │   ├── lednet.py
│   │   ├── dfanet.py
│   │   ├── ......
```
You can run a script to download a dataset, for example:
```bash
cd ./core/data/downloader
python ade20k.py --download-dir ../datasets/ade
```
Dataset | training set | validation set | testing set |
---|---|---|---|
VOC2012 | 1464 | 1449 | ✘ |
VOCAug | 11355 | 2857 | ✘ |
ADE20K | 20210 | 2000 | ✘ |
Cityscapes | 2975 | 500 | ✘ |
COCO | |||
SBU-shadow | 4085 | 638 | ✘ |
LIP (Look into Person) | 30462 | 10000 | 10000 |
```
.{SEG_ROOT}
├── core
│   ├── data
│   │   ├── dataloader
│   │   │   ├── ade.py
│   │   │   ├── cityscapes.py
│   │   │   ├── mscoco.py
│   │   │   ├── pascal_aug.py
│   │   │   ├── pascal_voc.py
│   │   │   ├── sbu_shadow.py
│   │   └── downloader
│   │       ├── ade20k.py
│   │       ├── cityscapes.py
│   │       ├── mscoco.py
│   │       ├── pascal_voc.py
│   │       └── sbu_shadow.py
```
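All of these loaders follow the standard PyTorch `Dataset` pattern: each item is an `(image, mask)` pair, with the mask holding one class index per pixel. A minimal sketch with a hypothetical in-memory dataset (the real loaders read files from disk and apply crops/transforms):

```python
import torch
from torch.utils.data import Dataset, DataLoader

class ToySegDataset(Dataset):
    """Hypothetical in-memory dataset illustrating the (image, mask)
    contract shared by the loaders under core/data/dataloader."""

    def __init__(self, length=8, size=32, num_classes=21):
        self.length, self.size, self.num_classes = length, size, num_classes

    def __len__(self):
        return self.length

    def __getitem__(self, idx):
        image = torch.rand(3, self.size, self.size)        # RGB crop in [0, 1]
        mask = torch.randint(0, self.num_classes,
                             (self.size, self.size))       # per-pixel labels
        return image, mask

loader = DataLoader(ToySegDataset(), batch_size=4)
images, masks = next(iter(loader))     # (4, 3, 32, 32) and (4, 32, 32)
```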
PASCAL VOC 2012
Methods | Backbone | TrainSet | EvalSet | crop_size | epochs | JPU | Mean IoU | pixAcc |
---|---|---|---|---|---|---|---|---|
FCN32s | vgg16 | train | val | 480 | 60 | ✘ | 47.50 | 85.39 |
FCN16s | vgg16 | train | val | 480 | 60 | ✘ | 49.16 | 85.98 |
FCN8s | vgg16 | train | val | 480 | 60 | ✘ | 48.87 | 85.02 |
FCN32s | resnet50 | train | val | 480 | 50 | ✘ | 54.60 | 88.57 |
PSPNet | resnet50 | train | val | 480 | 60 | ✘ | 63.44 | 89.78 |
DeepLabv3 | resnet50 | train | val | 480 | 60 | ✘ | 60.15 | 88.36 |
Note: lr=1e-4, batch_size=4, epochs=80
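The two metrics in the table, pixel accuracy (pixAcc) and mean IoU, are both derived from a per-class confusion matrix over all pixels. Here is a generic sketch of how they are computed (not the repo's exact utility code):

```python
import torch

def seg_metrics(pred, target, num_classes):
    """Pixel accuracy and mean IoU from an integer prediction/label pair."""
    k = num_classes
    # Confusion matrix: rows are ground-truth classes, columns predictions.
    hist = torch.bincount(k * target.flatten() + pred.flatten(),
                          minlength=k * k).reshape(k, k).float()
    pix_acc = hist.diag().sum() / hist.sum()
    # IoU per class: intersection / (pred pixels + gt pixels - intersection).
    iou = hist.diag() / (hist.sum(0) + hist.sum(1) - hist.diag())
    return pix_acc.item(), iou[~iou.isnan()].mean().item()

pred = torch.tensor([[0, 1], [1, 1]])
target = torch.tensor([[0, 1], [0, 1]])
acc, miou = seg_metrics(pred, target, num_classes=2)
# acc = 3/4; IoU: class 0 = 1/2, class 1 = 2/3, so mIoU ≈ 0.5833
```

Classes absent from both prediction and ground truth produce NaN IoU and are excluded from the mean.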
See TEST for details.
```
.{SEG_ROOT}
├── tests
│   └── test_model.py
```
- add train script
- remove syncbn
- train & evaluate
- test distributed training
- fix syncbn (Why SyncBN?)
- add distributed (How DIST?)