
pruned_lightweight_openpose


A Pruned Version for Lightweight OpenPose

This repository contains channel-pruned models for Lightweight OpenPose (Real-time 2D Multi-Person Pose Estimation on CPU: Lightweight OpenPose) and mainly follows the work of Daniil Osokin. The channel-pruning code is based on our ICCV 2019 submission and will be open-sourced after acceptance.
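
The pruning code itself is not yet released, so the sketch below only illustrates the general idea of channel pruning: drop the output channels of a convolution whose filters have the smallest L1 norms (the criterion of Li et al.'s filter pruning, not necessarily the one used in the submission). All names here are ours, and a real pruner must also adjust the adjacent layers:

```python
import torch
import torch.nn as nn

def prune_conv_out_channels(conv: nn.Conv2d, keep_ratio: float) -> nn.Conv2d:
    """Keep the output channels whose filters have the largest L1 norm.

    Illustrative only (assumes groups == 1): a complete pruner must also
    shrink the following BatchNorm layer and the input channels of the
    next convolution.
    """
    n_keep = max(1, int(conv.out_channels * keep_ratio))
    # L1 norm of each output filter: shape (out_channels,)
    norms = conv.weight.detach().abs().view(conv.out_channels, -1).sum(dim=1)
    _, order = norms.sort(descending=True)
    keep = order[:n_keep]

    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    pruned.weight.data = conv.weight.data[keep].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[keep].clone()
    return pruned

print(prune_conv_out_channels(nn.Conv2d(64, 128, 3, padding=1), 0.7))
```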


Table of Contents

  • Requirements

  • Prerequisites

  • Training

  • Validation

  • Demo

  • Pruned model

  • Fine-tuned model

  • Unpruned pre-trained model

Requirements

  • Ubuntu 16.04

  • Python 3.6

  • PyTorch 0.4.1 (should also work with 1.0, but not tested)

Prerequisites

  1. Download the COCO 2017 dataset: http://cocodataset.org/#download (train, val, annotations) and unpack it to the <COCO_HOME> folder.

  2. Install the requirements: pip install -r requirements.txt

Training

  1. Fine-tune the pruned model: run CUDA_VISIBLE_DEVICES=<DEVICES_ID> python train_prune.py --train-images-folder ./coco/train2017/ --prepared-train-labels prepared_train_annotation.pkl --val-labels val_subset.json --val-images-folder ./coco/val2017/ --checkpoint-path ./pruned_models/<CHECKPOINT> --num-refinement-stages 3 --experiment-name <NAME> --weights-only (a sketch of the weights-only loading pattern follows this step).
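
In the upstream lightweight OpenPose code, --weights-only restores just the network parameters from a checkpoint, so fine-tuning starts with a fresh optimizer and epoch counter. A minimal sketch of that loading pattern (the 'state_dict' key follows the upstream checkpoint convention; the usage lines are placeholders):

```python
import torch
import torch.nn as nn

def load_weights_only(net: nn.Module, checkpoint_path: str) -> None:
    """Restore only the network parameters from a checkpoint.

    Mimics the upstream --weights-only behaviour: any optimizer state
    or epoch counter stored in the file is ignored, so fine-tuning
    restarts with a fresh optimizer and learning-rate schedule.
    """
    checkpoint = torch.load(checkpoint_path, map_location='cpu')
    # Upstream checkpoints keep the weights under a 'state_dict' key.
    state_dict = checkpoint.get('state_dict', checkpoint)
    net.load_state_dict(state_dict)

# Usage (model class and hyperparameters are placeholders):
# net = PoseEstimationWithMobileNet(num_refinement_stages=3)
# load_weights_only(net, './pruned_models/0.3.pth.tar')
# optimizer = torch.optim.Adam(net.parameters(), lr=4e-5)
```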

Validation

  1. For synchronous validation during training, run CUDA_VISIBLE_DEVICES=<DEVICES_ID> python val_per_epoch.py

  2. To validate a specific checkpoint, run python val_prune_oneepoch.py --labels <COCO_HOME>/annotations/person_keypoints_val2017.json --images-folder <COCO_HOME>/val2017 --checkpoint-path <CHECKPOINT> (a sketch of the underlying COCO evaluation follows this list).
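
Validation ultimately scores a JSON file of predicted keypoints against the ground truth with pycocotools (the standard COCO keypoint evaluation). A minimal sketch, assuming the predictions have already been written in COCO results format (the detections file name is a placeholder):

```python
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

# Ground-truth keypoint annotations shipped with COCO 2017.
coco_gt = COCO('<COCO_HOME>/annotations/person_keypoints_val2017.json')
# Predictions in COCO results format (placeholder file name),
# e.g. as written out by the validation script.
coco_dt = coco_gt.loadRes('detections.json')

evaluator = COCOeval(coco_gt, coco_dt, iouType='keypoints')
evaluator.evaluate()
evaluator.accumulate()
evaluator.summarize()  # prints AP/AR, including the AP figures quoted below
```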

Demo

  1. For a simple demo, run python demo.py --checkpoint-path ./fine-tuned_models/<CHECKPOINT> --images <YOUR_IMAGE>

Pruned model

We provide two pruned models with different compression rates: ./pruned_models/0.3.pth.tar (15.92% FLOPs reduction) and ./pruned_models/0.8.pth.tar (25.6% FLOPs reduction).
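
The FLOP counts behind these percentages can be reproduced approximately by attaching forward hooks to every convolution. A rough sketch that counts convolution FLOPs only (the default input resolution is an assumption, not a value taken from this repository):

```python
import torch
import torch.nn as nn

def count_conv_flops(net: nn.Module, input_size=(1, 3, 256, 456)):
    """Approximate total FLOPs of all Conv2d layers via forward hooks.

    Counts 2 FLOPs per multiply-accumulate; the 256x456 default input
    resolution is only an example.
    """
    total = [0]

    def hook(module, inputs, output):
        kh, kw = module.kernel_size
        # MACs per output element = (in_channels / groups) * kH * kW
        macs_per_out = module.in_channels // module.groups * kh * kw
        total[0] += 2 * macs_per_out * output.numel()

    handles = [m.register_forward_hook(hook)
               for m in net.modules() if isinstance(m, nn.Conv2d)]
    with torch.no_grad():
        net(torch.zeros(*input_size))
    for h in handles:
        h.remove()
    return total[0]

# Usage: compare count_conv_flops(pruned_net) against the unpruned model.
```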

Fine-tuned model

The model fine-tuned from the pruned model ./pruned_models/0.3.pth.tar is available in ./fine-tuned_models/.

Unpruned pre-trained model

The model expects a normalized image (mean=[128, 128, 128], scale=[1/256, 1/256, 1/256]) in planar BGR format. The COCO pre-trained model is available at ./pre-trained_models/checkpoint_iter_370000.pth.tar; it reaches 40% AP on the COCO validation set (38.6% AP on the val subset).
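
Concretely, assuming the image is loaded with OpenCV (which returns BGR channel order), the expected input can be built as follows (the file name is a placeholder):

```python
import cv2
import numpy as np
import torch

img = cv2.imread('input.jpg')                          # HWC, BGR, uint8
normalized = (img.astype(np.float32) - 128.0) / 256.0  # mean 128, scale 1/256
planar = normalized.transpose(2, 0, 1)                 # HWC -> CHW, i.e. planar BGR
tensor = torch.from_numpy(planar).unsqueeze(0)         # add batch dimension: NCHW
```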

