This project uses Yahoo's Open NSFW (Not Safe For Work) model to detect images that contain pornographic content. Open NSFW is a pretrained Caffe neural network model with a high reported accuracy.
I have run into this problem on multiple projects, especially where there is user-generated content or content from an unreliable source that cannot be easily monitored. My own solutions were never very good, so I thought I'd give this a go.
Since I found Caffe difficult to install, I built this project on top of the Caffe Docker image to run Yahoo Open NSFW. I have also modified the Yahoo script to accept remote URLs instead of only local images (see the sketch below).
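For illustration only, here is a minimal sketch of how remote-URL support could work, assuming the image is downloaded to a temporary file before being handed to the existing local-file classification path (the actual change in classify_nsfw.py may differ):

```python
import tempfile
import urllib.request

def resolve_image(path_or_url):
    """Return a local file path, downloading the image first if it is a URL."""
    if path_or_url.startswith(("http://", "https://")):
        # Fetch the remote image into a temporary file that the
        # classifier can read like any local image.
        with urllib.request.urlopen(path_or_url) as response:
            with tempfile.NamedTemporaryFile(delete=False) as tmp:
                tmp.write(response.read())
                return tmp.name
    return path_or_url
```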
You can use it from the command line or start the built-in server. The output is a float between 0 and 1: scores above 0.8 are very likely NSFW, and scores below 0.2 are very likely clean.
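As an illustration, a caller might bucket the score using the 0.2 and 0.8 guideline thresholds above (these cut-offs are conventions, not values enforced by the model; tune them for your own use case):

```python
def interpret(score):
    """Map an Open NSFW score in [0, 1] to a moderation decision."""
    if score > 0.8:
        return "nsfw"    # very likely pornographic
    if score < 0.2:
        return "safe"    # very likely clean
    return "review"      # ambiguous; consider human moderation

print(interpret(0.0505))  # -> safe
```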
To install:

- Clone the project.
- Install Docker (Ubuntu instructions here).
- Run `sudo ./build_docker.sh` (this will take some time).
There are two ways to run nsfw-docker.

Command line:

    sudo docker run -ti caffe:cpu python ./classify_nsfw.py [url|localfile]

For example:

    sudo docker run -ti caffe:cpu python ./classify_nsfw.py http://www.personal.psu.edu/jul229/mini.jpg

As a web service (run `./run_server.sh` as root):

    sudo docker run -ti -p 7981:7981 caffe:cpu python server.py 7981
Then, to use the service, append the image link after the final slash:

    http://127.0.0.1:7981/[url]

For example: http://127.0.0.1:7981/http://www.personal.psu.edu/jul229/mini.jpg

The response is the raw score as plain text, for example:

    0.00936
    0.0505
    0.1126
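If you want to call the service from code rather than a browser, a minimal Python client sketch might look like the following. This assumes the third-party requests library and that the server returns the raw score as the response body; adjust the parsing if your version returns anything else:

```python
import requests

SERVER = "http://127.0.0.1:7981"  # replace with your Docker host IP if needed

def nsfw_score(image_url):
    """Query the NSFW web service and return the score as a float."""
    response = requests.get("%s/%s" % (SERVER, image_url), timeout=30)
    response.raise_for_status()
    return float(response.text.strip())

print(nsfw_score("http://www.personal.psu.edu/jul229/mini.jpg"))
```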
If you are running Docker through Docker Machine (for example with Docker Toolbox on Windows or macOS), the web service is not reachable at 127.0.0.1. To find the correct IP, run the following in the Docker terminal:

    docker-machine ip default

This usually prints something like:

    192.168.99.100

so the service can be accessed from http://192.168.99.100:7981/[url].