
UntrimmedNets


UntrimmedNet for Action Recognition and Detection

We provide the code and models for our CVPR paper (arXiv preprint):

  UntrimmedNets for Weakly Supervised Action Recognition and Detection
  Limin Wang, Yuanjun Xiong, Dahua Lin, and Luc Van Gool
  in IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2017

Updates

  • September 19th, 2017

    • Release the learned models on the THUMOS14 and ActivityNet1.2 datasets.

  • August 20th, 2017

    • Release the model protos.

Guide

The training of UntrimmedNet is composed of three steps:

  • Step 1: extract action proposals (or shot boundaries) for each untrimmed video. We provide a sample of detected shot boundaries on ActivityNet (v1.2) under the folders data/anet1.2/anet_1.2_train_window_shot/ and data/anet1.2/anet_1.2_val_window_shot/.

  • Step 2: construct file lists for training and validation. There are two file lists: one containing the frame path, number of frames, and label; the other containing the shot file path and number of frames (examples are in the folder data/anet1.2/). A sketch of this step follows the list below.

  • Step 3: train UntrimmedNet using our modified Caffe: https://github.com/yjxiong/caffe/tree/untrimmednet
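For Step 2, here is a minimal sketch of how the two file lists could be generated. The directory layout (FRAME_ROOT, SHOT_ROOT as used below), the shot-file naming, and the build_file_lists helper are hypothetical illustrations, not part of the repository; adapt them to your own frame-extraction output.

```python
import os

# Hypothetical layout: one directory of extracted RGB frames per untrimmed
# video, plus a precomputed shot file per video. Adjust to your own setup.
FRAME_ROOT = "data/anet1.2/frames"                     # assumed, not a repo path
SHOT_ROOT = "data/anet1.2/anet_1.2_train_window_shot"  # sample shot files from the repo

def build_file_lists(video_labels, video_list_path, shot_list_path):
    """Write the two training file lists described in Step 2:
    (1) <frame dir> <num frames> <label>
    (2) <shot file> <num frames>
    """
    with open(video_list_path, "w") as vf, open(shot_list_path, "w") as sf:
        for video_id, label in video_labels.items():
            frame_dir = os.path.join(FRAME_ROOT, video_id)
            num_frames = len(os.listdir(frame_dir))   # one image file per frame
            vf.write(f"{frame_dir} {num_frames} {label}\n")
            shot_file = os.path.join(SHOT_ROOT, video_id + ".txt")
            sf.write(f"{shot_file} {num_frames}\n")

# Example: build_file_lists({"v_abc123": 7}, "train_list.txt", "train_shot_list.txt")
```

Compare the generated lines against the example lists shipped in data/anet1.2/ to confirm the expected field order before training.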

The testing of UntrimmedNet for action recognition is based on temporal sliding windows and top-k pooling.
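For intuition, the sketch below shows top-k pooling over per-window classification scores: the video-level score for each class is the average of its k highest window responses. The topk_pool_scores helper and the k_ratio value are illustrative assumptions, not the repository's implementation (which runs inside the modified Caffe).

```python
import numpy as np

def topk_pool_scores(window_scores, k_ratio=0.125):
    """Aggregate per-window scores (num_windows x num_classes) into
    video-level class scores by averaging the top-k responses per class.
    k_ratio (fraction of windows kept) is an assumed setting."""
    scores = np.asarray(window_scores)
    k = max(1, int(len(scores) * k_ratio))
    topk = np.sort(scores, axis=0)[-k:]   # k highest scores for each class
    return topk.mean(axis=0)

# Example: 40 sliding windows over a video, 101 classes (THUMOS14-sized).
video_scores = topk_pool_scores(np.random.rand(40, 101))
predicted_class = int(video_scores.argmax())
```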

The testing of UntrimmedNet for action detection is based on a simple baseline (see the code in matlab/).
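The actual baseline lives in matlab/; purely to illustrate the idea of thresholding sliding-window scores, a hypothetical rendering might look like the following. The detect_actions helper and the threshold value are assumptions, not a port of the MATLAB code.

```python
import numpy as np

def detect_actions(window_scores, windows, threshold=0.5):
    """Toy detection baseline: keep each sliding window whose best class
    score exceeds a threshold, reporting (start, end, class, score)."""
    detections = []
    for (start, end), scores in zip(windows, window_scores):
        cls = int(np.argmax(scores))
        if scores[cls] > threshold:
            detections.append((start, end, cls, float(scores[cls])))
    return detections

# Example: two windows over a 3-class problem; only the second fires.
dets = detect_actions([[0.2, 0.3, 0.1], [0.1, 0.8, 0.1]],
                      [(0.0, 2.0), (2.0, 4.0)])
```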

Downloads

You can download our trained models on the THUMOS14 and ActivityNet datasets using the scripts scripts/get_reference_model_thumos.sh and scripts/get_reference_model_anet.sh.

