AOGNet-v2

2020-02-06

This project provides source code for AOGNets: Compositional Grammatical Architectures for Deep Learning (CVPR 2019) and Attentive Normalization.


Installation

  1. Create a conda environment, and install all dependencies.

conda create -n aognet-v2 python=3.7
conda activate aognet-v2
git clone https://github.com/iVMCL/AOGNet-v2
cd AOGNet-v2
pip install -r requirements.txt
  2. Install PyTorch 1.0+, following https://pytorch.org/

  3. Install apex, following https://github.com/NVIDIA/apex

ImageNet dataset preparation

  • Download the ImageNet dataset to YOUR_IMAGENET_PATH and move validation images to labeled subfolders

  • Create a datasets subfolder under your cloned AOGNet-v2 and add a symbolic link to the ImageNet dataset inside it

$ cd AOGNet-v2
$ mkdir datasets
$ ln -s PATH_TO_YOUR_IMAGENET ./datasets/
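The "move validation images to labeled subfolders" step can be sketched as below. This is a minimal sketch: the flat val/ directory, the val_map.txt mapping file, and its "<filename> <wnid>" line format are all assumptions for illustration (the commonly used valprep.sh script achieves the same result with a built-in file list); the demo files here are empty placeholders.

```shell
# Minimal sketch: sort flat validation images into per-class subfolders.
# Assumptions: a flat val/ directory and a mapping file val_map.txt with
# lines "<filename> <wnid>" -- both are placeholders for illustration.
set -e
mkdir -p val
touch val/ILSVRC2012_val_00000001.JPEG               # placeholder image
printf 'ILSVRC2012_val_00000001.JPEG n01751748\n' > val_map.txt
while read -r img wnid; do
  mkdir -p "val/$wnid"                               # one folder per class
  mv "val/$img" "val/$wnid/"
done < val_map.txt
```

After this, each class folder (e.g. val/n01751748/) contains only its own images, which is the layout PyTorch-style folder datasets expect.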

Pretrained models for evaluation

$ cd AOGNet-v2
$ mkdir pretrained_models

Download the pretrained models from the links provided in the tables below, unzip them, and place them in the pretrained_models directory.

$ ./scripts/test_fp16.sh <pretrained model directory path>
  • Models trained with the standard training setup: 120 epochs, cosine learning-rate scheduling, SGD, ... (MobileNet-v2 trained for 150 epochs). All models are trained with 8 Nvidia V100 GPUs.

| Method | #Params | FLOPs | top-1 error (%) | top-5 error (%) | Link |
| --- | --- | --- | --- | --- | --- |
| ResNet-50-BN | 25.56M | 4.09G | 23.01 | 6.68 | Google Drive |
| ResNet-50-GN* | 25.56M | 4.09G | 23.52 | 6.85 | - |
| ResNet-50-SN* | 25.56M | - | 22.43 | 6.35 | - |
| ResNet-50-SE* | 28.09M | - | 22.37 | 6.36 | - |
| ResNet-50-AN (w/ BN) | 25.76M | 4.09G | 21.59 | 5.88 | Google Drive |
| ResNet-101-BN | 44.57M | 8.12G | 21.33 | 5.85 | Google Drive |
| ResNet-101-AN (w/ BN) | 45.00M | 8.12G | 20.61 | 5.41 | Google Drive |
| MobileNet-v2-BN | 3.50M | 0.34G | 28.69 | 9.33 | Google Drive |
| MobileNet-v2-AN (w/ BN) | 3.56M | 0.34G | 26.67 | 8.56 | Google Drive |
| AOGNet-12M-BN | 12.26M | 2.19G | 22.22 | 6.06 | Google Drive |
| AOGNet-12M-AN (w/ BN) | 12.37M | 2.19G | 21.28 | 5.76 | Google Drive |
| AOGNet-40M-BN | 40.15M | 7.51G | 19.84 | 4.94 | Google Drive |
| AOGNet-40M-AN (w/ BN) | 40.39M | 7.51G | 19.33 | 4.72 | Google Drive |

* Results from the original papers

  • Models trained with the advanced training setup: 200 epochs, label smoothing 0.1, mixup 0.2

| Method | #Params | FLOPs | top-1 error (%) | top-5 error (%) | Link |
| --- | --- | --- | --- | --- | --- |
| ResNet-50-BN | 25.56M | 4.09G | 21.08 | 5.56 | Google Drive |
| ResNet-50-AN (w/ BN) | 25.76M | 4.09G | 19.92 | 5.04 | Google Drive |
| ResNet-101-BN | 44.57M | 8.12G | 19.71 | 4.89 | Google Drive |
| ResNet-101-AN (w/ BN) | 45.00M | 8.12G | 18.85 | 4.63 | Google Drive |
| AOGNet-12M-BN | 12.26M | 2.19G | 21.63 | 5.60 | Google Drive |
| AOGNet-12M-AN (w/ BN) | 12.37M | 2.19G | 20.57 | 5.38 | Google Drive |
| AOGNet-40M-BN | 40.15M | 7.51G | 18.70 | 4.47 | Google Drive |
| AOGNet-40M-AN (w/ BN) | 40.39M | 7.51G | 18.13 | 4.26 | Google Drive |
  • Remarks: the accuracy of the released pretrained models may differ slightly from the accuracy logged during training (which is what our papers report). The reason is still unclear.

Perform training on ImageNet dataset

$ cd AOGNet-v2
$ ./scripts/train_fp16.sh <config_file>

e.g., to train AOGNet-12M with AN:

$ ./scripts/train_fp16.sh configs/aognet-12m-an-imagenet.yaml

See configs for more configuration files, and change the GPU settings in scripts/train_fp16.sh as needed.
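A common way to control which GPUs a training script sees is the CUDA_VISIBLE_DEVICES environment variable; whether scripts/train_fp16.sh honors this variable or hard-codes its own GPU list inside the script is an assumption you should verify against the script itself.

```shell
# Hypothetical usage: expose only GPUs 0-3 to the process.
# (How scripts/train_fp16.sh selects GPUs internally may differ.)
export CUDA_VISIBLE_DEVICES=0,1,2,3
echo "visible GPUs: $CUDA_VISIBLE_DEVICES"
```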

Object Detection and Instance Segmentation on COCO

We performed the object detection and instance segmentation tasks on COCO with our ImageNet-pretrained models. Our implementation is based on the mmdetection framework; the code is released at https://github.com/iVMCL/AttentiveNorm_Detection/tree/master/configs/attn_norm.

Citations

Please consider citing the AOGNets or Attentive Normalization papers in your publications if they help your research.

@inproceedings{li2019aognets,
  title={AOGNets: Compositional Grammatical Architectures for Deep Learning},
  author={Li, Xilai and Song, Xi and Wu, Tianfu},
  booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition},
  pages={6220--6230},
  year={2019}
}

@article{li2019attentive,
  title={Attentive Normalization},
  author={Li, Xilai and Sun, Wei and Wu, Tianfu},
  journal={arXiv preprint arXiv:1908.01259},
  year={2019}
}

Contact

Please feel free to report issues and any related problems to Xilai Li (xli47 at ncsu dot edu) and Tianfu Wu (twu19 at ncsu dot edu).

License

The AOGNets-related code is released under a RESEARCH ONLY LICENSE.

