
Yahoo Open NSFW

Detect Inappropriate Content In Your Images

Use Yahoo Open NSFW to estimate the probability that an image is not suitable for work (pornographic content). The model returns a score ranging from 0 to 100%.
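
For concreteness, here is a minimal Python scoring sketch using the Caffe API. The file paths and the 'data'/'prob' blob names follow the yahoo/open_nsfw repository layout; treat them as assumptions if your copy differs.

    import caffe
    import numpy as np

    # Load the pretrained open_nsfw network (paths assume the yahoo/open_nsfw repo layout)
    net = caffe.Net('nsfw_model/deploy.prototxt',
                    'nsfw_model/resnet_50_1by2_nsfw.caffemodel',
                    caffe.TEST)

    # Standard Caffe preprocessing: CHW layout, BGR channel order, mean subtraction
    transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})
    transformer.set_transpose('data', (2, 0, 1))             # HWC -> CHW
    transformer.set_mean('data', np.array([104, 117, 123]))  # per-channel BGR mean
    transformer.set_raw_scale('data', 255)                   # [0, 1] -> [0, 255]
    transformer.set_channel_swap('data', (2, 1, 0))          # RGB -> BGR

    image = caffe.io.load_image('input.jpg')  # float RGB image in [0, 1]
    net.blobs['data'].data[...] = transformer.preprocess('data', image)
    output = net.forward()

    # The final softmax has two classes: index 0 = SFW, index 1 = NSFW
    nsfw_score = float(output['prob'][0][1])
    print('NSFW probability: {:.1%}'.format(nsfw_score))

The repository also ships a classify_nsfw.py script that wraps essentially this logic behind a command line.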

Possible Use Cases:

  • Automatically flag inappropriate content uploaded to your online community (see the sketch after this list).

  • Block inappropriate images from being loaded for a child-safe environment.

  • Continue fine-tuning this prediction model on your own dataset to detect a broader set of NSFW content.
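
For the first use case above, a moderation hook might look like the following sketch. Here score_nsfw is a hypothetical helper wrapping the scoring code shown earlier, and the threshold value is illustrative only; the right cutoff depends on your community.

    # Hypothetical moderation hook: hold uploads whose NSFW score exceeds a
    # site-specific threshold for human review instead of publishing them.
    FLAG_THRESHOLD = 0.8  # illustrative value, not a recommendation

    def moderate_upload(image_path, review_queue):
        score = score_nsfw(image_path)  # assumed helper wrapping the model above
        if score >= FLAG_THRESHOLD:
            review_queue.append((image_path, score))
            return False  # withheld pending human review
        return True  # safe to publish

    review_queue = []
    if not moderate_upload('upload.jpg', review_queue):
        print('flagged for review:', review_queue[-1])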

Prediction Examples:

[Example images and their predicted NSFW probabilities: 0%, 15.2% (boy with deer), 0.09%]

How Good Is This Model?

As far as we can tell, this is the only openly available ML solution for detecting not-suitable-for-work (NSFW) content. However, since NSFW content is highly subjective, we recommend testing the model on your own images, which you can do easily by running the notebook tutorial below. Different contexts will require different cutoffs for what probability constitutes NSFW content.
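
One way to pick a cutoff is sketched below, under the assumption that you have a set of images you already know are safe; my_safe_images and score_nsfw are placeholders, not part of the released code. The idea is to measure how often each candidate cutoff would falsely flag known-safe images and choose the strictest value you can tolerate.

    # Estimate the false-flag rate on known-safe images at several candidate cutoffs.
    def false_flag_rate(safe_scores, cutoff):
        flagged = sum(1 for s in safe_scores if s >= cutoff)
        return flagged / len(safe_scores)

    safe_scores = [score_nsfw(path) for path in my_safe_images]  # placeholder inputs
    for cutoff in (0.2, 0.5, 0.8):
        print('cutoff {:.1f}: {:.1%} of safe images flagged'.format(
            cutoff, false_flag_rate(safe_scores, cutoff)))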

Yahoo reports that the model incorrectly flags about 7% of images as potentially NSFW at an undisclosed cutoff used in their testing.

Misc

Code is licensed under the BSD 2-Clause license. See Source Code for more details.

The model was pretrained on the ImageNet 1000-class dataset and then fine-tuned on a proprietary NSFW dataset that Yahoo has not released.

