
ImageNet pre-trained models with batch normalization

2019-09-11

CNN models pre-trained on 1000 ImageNet categories. Currently contains:

* AlexNet and VGG16 with batch normalization added
* Residual Networks with 50 layers (ResNet-50) and 10 layers (ResNet-10)
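For context, the batch normalization added to AlexNet and VGG16 normalizes each layer's activations using per-channel statistics together with a learned scale and shift. A minimal NumPy sketch of the inference-time transform (the function and variable names here are illustrative, not taken from the release):

```python
import numpy as np

def batch_norm_inference(x, running_mean, running_var, gamma, beta, eps=1e-5):
    """Inference-time batch normalization for an (N, C) activation matrix.

    Uses running statistics collected during training, as a deployed
    model would; gamma and beta are the learned scale and shift.
    """
    x_hat = (x - running_mean) / np.sqrt(running_var + eps)
    return gamma * x_hat + beta

# Illustrative values: two samples, three channels.
x = np.array([[1.0, 2.0, 3.0],
              [3.0, 4.0, 5.0]])
mean = np.array([2.0, 3.0, 4.0])   # per-channel running mean
var = np.array([1.0, 1.0, 1.0])    # per-channel running variance
y = batch_norm_inference(x, mean, var, gamma=np.ones(3), beta=np.zeros(3))
```

With unit variance and zero shift, each channel is simply centered, so the two samples map to roughly -1 and +1 per channel.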

These models improve over previously released pre-trained models and, in particular, reproduce the published ImageNet results of ResNet-50 using Caffe. The release includes the ResNet generation script, training code, and log files.
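The ResNets in the release are built from residual blocks, in which the output of a small stack of layers is added back onto the block's input before the final activation. A toy sketch of that skip connection (hypothetical helper functions, not the repository's code):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, w1, w2):
    """Identity-shortcut residual block: y = relu(f(x) + x).

    f is a two-layer transform here; real ResNet blocks use
    convolutions with batch normalization instead of plain matmuls.
    """
    f = relu(x @ w1) @ w2
    return relu(f + x)

# With zero weights the transform f vanishes and the block reduces
# to relu(x): the identity path is what keeps gradients flowing.
x = np.array([[1.0, -2.0, 3.0]])
w = np.zeros((3, 3))
y = residual_block(x, w, w)
```

The design choice this illustrates is that a block only has to learn a residual correction to the identity, which is what makes 50-layer networks trainable.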

    @article{simon2016cnnmodels,
      Author = {Simon, Marcel and Rodner, Erik and Denzler, Joachim},
      Journal = {arXiv preprint arXiv:1612.01452},
      Title = {ImageNet pre-trained models with batch normalization},
      Year = {2016}
    }

[[Models](https://github.com/cvjena/cnn-models)] [[Paper](https://arxiv.org/abs/1612.01452)]


