
ImageNet pre-trained models with batch normalization

2019-09-11

CNN models pre-trained on 1000 ImageNet categories. Currently contains:

* AlexNet and VGG16 with batch normalization added
* Residual Networks with 10 layers (ResNet-10) and 50 layers (ResNet-50)

These models improve on previously released pre-trained models and, in particular, reproduce the ImageNet results of ResNet-50 in Caffe. The repository includes the ResNet generation script, training code, and log files.
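The batch-normalization step that these models add after convolutional layers can be sketched as follows. This is a minimal NumPy illustration of the normalization computation only, not the repository's actual Caffe code; the function name and the flattened `(N, C)` shape are assumptions for the sake of a short example.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize activations per channel over the batch, then scale and shift.

    x:     activations of shape (N, C) -- N samples, C channels.
    gamma: learnable per-channel scale, shape (C,).
    beta:  learnable per-channel shift, shape (C,).
    """
    mean = x.mean(axis=0)                    # per-channel batch mean
    var = x.var(axis=0)                      # per-channel batch variance
    x_hat = (x - mean) / np.sqrt(var + eps)  # zero mean, unit variance
    return gamma * x_hat + beta              # restore representational freedom

# With gamma=1, beta=0 the output has ~zero mean and unit variance per channel,
# regardless of the input's original scale and offset.
x = np.random.randn(64, 8) * 3.0 + 2.0
y = batch_norm(x, gamma=np.ones(8), beta=np.zeros(8))
```

At inference time, frameworks such as Caffe replace the batch statistics with running averages collected during training, so the layer becomes a fixed per-channel affine transform.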

@article{simon2016cnnmodels,
  Author = {Simon, Marcel and Rodner, Erik and Denzler, Joachim},
  Journal = {arXiv preprint arXiv:1612.01452},
  Title = {ImageNet pre-trained models with batch normalization},
  Year = {2016}
}

[[Models](https://github.com/cvjena/cnn-models)] [[Paper](https://arxiv.org/abs/1612.01452)] [Website]


