
sdm_face_alignment

2020-03-25

Matlab Implementation of Supervised Descent Method

A simple Matlab implementation of Supervised Descent Method (SDM) for Face Alignment.

I provide both training and testing modules, along with one model trained on the LFPW subset of the 300-W dataset.

The original paper this implementation is based on:

X. Xiong and F. De la Torre, Supervised Descent Method and its Applications to Face Alignment, CVPR 2013.
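The core of SDM is a cascade of learned linear regressors: starting from an initial shape (e.g. the mean shape placed inside the detected face box), each stage extracts features around the current landmarks and applies a learned linear update. A minimal sketch of the test-time cascade follows; the variable and function names here are illustrative, not this repository's actual API:

```matlab
% Sketch of the SDM test-time cascade (illustrative names only).
% x       : current shape estimate, a 2N-by-1 vector of landmark coordinates
% R{k}, b{k} : linear regressor learned at cascade stage k during training
% extractFeaturesAtLandmarks : hypothetical feature extractor, e.g. HOG
%                              descriptors sampled around each landmark
x = x0;                                       % e.g. mean shape in face box
for k = 1:numel(R)
    phi = extractFeaturesAtLandmarks(im, x);  % features at current landmarks
    x = x + R{k} * phi + b{k};                % supervised descent update
end
```

At training time, each R{k} and b{k} is fit by linear regression from features to the ground-truth shape increments, which is why no Jacobian or Hessian of the feature function is ever needed.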

===========================================================================

Dependencies: libLinear, Vlfeat, mexopencv (see step 2 below).

Datasets in use:

[300-W] http://ibug.doc.ic.ac.uk/resources/facial-point-annotations/

How to use:

  1. Download the 300-W data (e.g. LFPW) from the link above and put it into the "./data" folder, then correct the dataset paths in setup.m to point at your dataset folder.

    mkdir -p data

    For example:

    options.trainingImageDataPath = './data/lfpw/trainset/';

    options.trainingTruthDataPath = './data/lfpw/trainset/';

    options.testingImageDataPath = './data/lfpw/testset/';

    options.testingTruthDataPath = './data/lfpw/testset/';

  2. Download and install the dependencies (libLinear, Vlfeat, mexopencv), put them into the "./lib" folder, and compile them if necessary. Make sure you have already addpath(...)'d all the folders in Matlab. Check and correct the library paths in setup.m.

    mkdir -p lib

    Vlfeat:

    • cd lib/vlfeat/ && make

    • cd ./toolbox in the Matlab editor.

    • Run vl_setup

    • Compile the mex Hog functions:

      cd misc
      mex -L../../bin/glnx86 -lvl -I../ -I../../ vl_hog.c

    • Set up the libvl.so path. Assuming your libvl.so is located at <vlfeat_folder>/bin/glnx86, create a soft link:

      ln -s <vlfeat_folder>/bin/glnx86/libvl.so /usr/local/libvl.so

      Check whether libvl.so is ready to use:

      ldd vl_hog.mexglx

      If libvl.so is still not found, add /usr/local/lib to /etc/ld.so.conf (sudo), then:

      sudo ldconfig
      ldconfig -p | grep libvl.so

      Check again:

      ldd vl_hog.mexglx

    libLinear:

    • Open Matlab.

    • Go to e.g. lib/liblinear-1.96/matlab/ in the Matlab editor.

    • Run make.m to compile the *.mex files.

  3. If this is your first run, set the following parameters so the program learns the mean shape and shape variation. For subsequent runs, reset them to 0.

    options.learningShape = 1;
    options.learningVariation = 1;

  4. Do training:

    run_training();

  5. Do testing:

    do_testing();
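Putting the steps above together, a typical first-run session at the Matlab prompt might look like the following (a sketch: it assumes setup.m configures the dataset and library paths as shown in step 1, and uses the entry points named in this README):

```matlab
% Hypothetical end-to-end session for a first run.
setup;                           % configure dataset and library paths
options.learningShape = 1;       % first run only: learn the mean shape
options.learningVariation = 1;   % first run only: learn shape variation
run_training();                  % train the cascade of linear regressors
do_testing();                    % evaluate on the configured test set
```

On later runs, set both learning flags back to 0 so the cached shape model is reused.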

Note: we provide trained models of LFPW (68 landmarks) in the "./model" folder. The program is not optimized for speed or memory during training, so memory problems may occur if you train on too much data.

