
# NSFW Classifier


The NSFW Classifier distinguishes Not Safe For Work (NSFW) images from safe images. NSFW images include porn and sexy images; everything else is classified as safe.
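A minimal sketch of how a single image could be classified with a trained model. The model filename `nsfw_classifier.h5`, the input size, and the class order are assumptions for illustration; adjust them to match your own training setup.

```python
# Sketch: classify one image with a trained Keras model.
# `nsfw_classifier.h5` and the class order below are assumptions.
import numpy as np
from tensorflow.keras.models import load_model
from tensorflow.keras.preprocessing import image

CLASS_NAMES = ['porn', 'sexy', 'safe']  # hypothetical order; check your data generator

def classify_image(img_path, model, target_size=(224, 224)):
    img = image.load_img(img_path, target_size=target_size)
    x = image.img_to_array(img) / 255.0   # scale pixels to [0, 1]
    x = np.expand_dims(x, axis=0)         # add batch dimension
    probs = model.predict(x)[0]
    label = CLASS_NAMES[int(np.argmax(probs))]
    return label, probs

if __name__ == '__main__':
    model = load_model('nsfw_classifier.h5')  # path is an assumption
    label, probs = classify_image('example.jpg', model)
    print(label, probs)
```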

Take a look at an example:

*(example prediction on a single image)*

Take a look at an example that predicts several images within a single large image:

*(example prediction on a grid of images)*

As shown in the Frame by Frame file, we can implement a function that tells whether a video is safe for kids and take action accordingly; a sketch is shown below.
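A minimal sketch of such a check, assuming OpenCV for frame extraction and the `CLASS_NAMES` and model from the sketch above. The sampling rate and the decision rule (any NSFW frame marks the video unsafe) are illustrative choices, not the repository's exact logic.

```python
# Sketch: sample video frames and flag the video if any frame looks NSFW.
import cv2
import numpy as np

def is_video_safe(video_path, model, sample_every=30):
    cap = cv2.VideoCapture(video_path)
    frame_idx = 0
    safe = True
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_idx % sample_every == 0:
            # OpenCV gives BGR frames; convert and resize to the model's input size
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            rgb = cv2.resize(rgb, (224, 224)) / 255.0
            probs = model.predict(np.expand_dims(rgb, axis=0))[0]
            label = CLASS_NAMES[int(np.argmax(probs))]
            if label in ('porn', 'sexy'):
                safe = False
                break
        frame_idx += 1
    cap.release()
    return safe
```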

The MobileNet architecture is used for classification because it is fast and has fewer parameters to train.
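A minimal sketch of a MobileNet-based classifier built with Keras transfer learning. The input size, class count, and frozen base are assumptions for illustration, not the repository's exact training script.

```python
# Sketch: MobileNet backbone with a small classification head on top.
from tensorflow.keras.applications import MobileNet
from tensorflow.keras.layers import GlobalAveragePooling2D, Dense
from tensorflow.keras.models import Model

def build_model(num_classes=3, input_shape=(224, 224, 3)):
    base = MobileNet(weights='imagenet', include_top=False, input_shape=input_shape)
    base.trainable = False  # freeze pretrained features, train only the head
    x = GlobalAveragePooling2D()(base.output)
    outputs = Dense(num_classes, activation='softmax')(x)
    model = Model(inputs=base.input, outputs=outputs)
    model.compile(optimizer='adam',
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])
    return model
```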

This project is inspired by https://www.freecodecamp.org/news/how-to-set-up-nsfw-content-detection-with-machine-learning-229a9725829c/ and https://github.com/GantMan/nsfw_model. The data for this project was collected using the scripts at https://github.com/alex000kim/nsfw_data_scraper.