implement-ResNeXt-with-keras
ResNeXt is an important milestone in the evolution of convolutional networks, especially of Residual Networks.
Here is a proposal to implement ResNeXt with Keras. Our implementation is inspired by ResNeXt-Tensorflow and Residual Networks.
Python 3.x
Tensorflow 1.x
Keras 2.2.4
(a) The original building block of ResNeXt: aggregated residual transformations
(b) First reformulation, implemented as early concatenation
(c) Second reformulation, which includes grouped convolution
We implemented the first reformulation (b), which consists of split, transform+merge, and transition operations.
Split the input channels into groups. The total number of groups equals the cardinality, which is c = 4 in our case.
```python
from keras.layers import Lambda

def split(inputs, cardinality):
    # Split the input channels into `cardinality` equal groups (NHWC layout)
    inputs_channels = inputs.shape[3]
    group_size = inputs_channels // cardinality
    groups = list()
    for number in range(1, cardinality + 1):
        begin = int((number - 1) * group_size)
        end = int(number * group_size)
        # Bind begin/end as default arguments: a plain closure would be
        # evaluated lazily by Keras, so every Lambda would slice the last group
        block = Lambda(lambda x, b=begin, e=end: x[:, :, :, b:e])(inputs)
        groups.append(block)
    return groups
```
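The slicing logic can be illustrated outside of Keras. Below is a small NumPy sketch (the function name `split_channels` is ours, not from the repo) that mirrors the channel grouping, assuming an NHWC tensor whose channel count is divisible by the cardinality:

```python
import numpy as np

# Illustrative stand-in for the Keras split(): slice the channel axis
# into `cardinality` equal groups (assumption: NHWC layout).
def split_channels(x, cardinality):
    channels = x.shape[3]
    group_size = channels // cardinality
    return [x[:, :, :, g * group_size:(g + 1) * group_size]
            for g in range(cardinality)]

x = np.zeros((1, 8, 8, 16))          # dummy NHWC tensor with 16 channels
groups = split_channels(x, 4)
print(len(groups), groups[0].shape)  # → 4 (1, 8, 8, 4)
```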
Perform two consecutive convolution operations on each group: 1x1 and 3x3 (Transform) followed by concatenation (Merge)
```python
from keras.layers import Conv2D, BatchNormalization, Activation, Concatenate
from keras.initializers import glorot_uniform

def transform(groups, filters, strides, stage, block):
    f1, f2 = filters
    conv_name = "conv2d-{stage}{block}-branch".format(stage=str(stage), block=str(block))
    bn_name = "batchnorm-{stage}{block}-branch".format(stage=str(stage), block=str(block))
    transformed_tensor = list()
    i = 1
    for inputs in groups:
        # First conv (1x1) of the transformation phase
        x = Conv2D(filters=f1, kernel_size=(1, 1), strides=strides, padding="valid",
                   name=conv_name + '1a_split' + str(i),
                   kernel_initializer=glorot_uniform(seed=0))(inputs)
        x = BatchNormalization(axis=3, name=bn_name + '1a_split' + str(i))(x)
        x = Activation('relu')(x)
        # Second conv (3x3) of the transformation phase
        x = Conv2D(filters=f2, kernel_size=(3, 3), strides=(1, 1), padding="same",
                   name=conv_name + '1b_split' + str(i),
                   kernel_initializer=glorot_uniform(seed=0))(x)
        x = BatchNormalization(axis=3, name=bn_name + '1b_split' + str(i))(x)
        x = Activation('relu')(x)
        # Add x to the list of transformed tensors
        transformed_tensor.append(x)
        i += 1
    # Concatenate the tensors from all groups (Merge)
    x = Concatenate(name='concat' + str(stage) + block)(transformed_tensor)
    return x
```
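The channel arithmetic of the Merge step can be checked in plain NumPy. A minimal sketch, assuming NHWC layout with cardinality c = 4 and f2 = 4 filters per group (the numbers are illustrative, not from the repo):

```python
import numpy as np

# Each transformed group carries f2 = 4 channels; with cardinality c = 4,
# concatenating along the channel axis yields c * f2 = 16 output channels.
transformed = [np.zeros((1, 8, 8, 4)) for _ in range(4)]
merged = np.concatenate(transformed, axis=3)
print(merged.shape)  # → (1, 8, 8, 16)
```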
The transition operation is the last 1x1 conv layer of the building block in (b).
```python
from keras.layers import Conv2D, BatchNormalization, Activation
from keras.initializers import glorot_uniform

def transition(inputs, filters, stage, block):
    # Final 1x1 conv that restores the block's output channel count
    x = Conv2D(filters=filters, kernel_size=(1, 1), strides=(1, 1), padding="valid",
               name='conv2d-trans' + str(stage) + block,
               kernel_initializer=glorot_uniform(seed=0))(inputs)
    x = BatchNormalization(axis=3, name='batchnorm-trans' + str(stage) + block)(x)
    x = Activation('relu')(x)
    return x
```
The whole architecture is as follows:
Identity Block: building blocks contained in the same stage; they have the same input and output dimensions.
Downsampling Block: transition between two building blocks where the output dimension of the first block differs from the input dimension of the next.
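The distinction between the two block types comes down to output-size arithmetic. A minimal sketch using the standard convolution output-size formula (the helper `conv_out_size` is ours, not from the repo):

```python
# Standard output-size formula for a convolution layer.
def conv_out_size(size, kernel, stride, pad):
    return (size - kernel + 2 * pad) // stride + 1

# Identity block: stride 1 preserves spatial dimensions, so the input
# can be added directly to the block's output.
print(conv_out_size(56, 1, 1, 0))  # → 56

# Downsampling block: stride 2 halves the spatial dimensions, so the
# shortcut must also be projected (e.g. a strided 1x1 conv) before the add.
print(conv_out_size(56, 1, 2, 0))  # → 28
```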
resnext50-with-keras.ipynb: details on how the model and its functions work
resnext50.py: defines the ResNeXt50 class
training-on-cifar10.py: training script on the CIFAR-10 dataset
Xie et al. (2017) Aggregated Residual Transformations for Deep Neural Networks
He et al. (2015) Deep Residual Learning for Image Recognition
Marco Peixeiro GitHub repository : Residual Networks
GitHub Repository: ResNeXt-Tensorflow
Keras Documentation : Trains a ResNet on the CIFAR10 dataset
Carmel WENGA, R&D Engineer | Data Scientist - Nzhinusoft