NSFWDetector is a small (17 kB) CoreML model that scans images for nudity. It was trained using CreateML to distinguish between porn/nudity and appropriate pictures, with the main focus on telling Instagram-model-like pictures apart from porn.
Usage
guard #available(iOS 12.0, *), let detector = NSFWDetector.shared else {
    return
}

detector.check(image: image, completion: { result in
    switch result {
    case let .success(nsfwConfidence: confidence):
        if confidence > 0.9 {
            // treat the image as NSFW, e.g. block or blur it
        } else {
            // the image is most likely fine
        }
    default:
        break
    }
})
If you want to enforce stricter boundaries for your platform, just apply a lower threshold for the confidence.
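For example, a minimal sketch of a stricter check; the 0.5 value is purely illustrative, not a recommended threshold:

detector.check(image: image, completion: { result in
    if case let .success(nsfwConfidence: confidence) = result, confidence > 0.5 {
        // treat the image as NSFW under the stricter policy
    }
})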
Installation
NSFWDetector is available through CocoaPods. To install it, simply add the following line to your Podfile:
pod 'NSFWDetector'
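A minimal Podfile sketch; the target name YourApp is just a placeholder:

platform :ios, '12.0'

target 'YourApp' do
  use_frameworks!
  pod 'NSFWDetector'
end

Then run pod install.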
App Size
The machine learning model is only 17 kB in size, so app size won't be noticeably affected, compared to other libraries using the Yahoo model.
Using just the Model
If you don't want to use the Detection Code, you can also just download the MLModel file directly from the latest Release.
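A minimal sketch of driving the raw model through Vision, assuming the downloaded model compiles to a class named NSFW and reports a classification label called "NSFW" — both names are assumptions here, so check them against the model file you actually use:

import CoreML
import UIKit
import Vision

// Assumption: the model class is `NSFW` and it classifies images with a
// label called "NSFW". Adjust both names to match the downloaded model.
func checkNSFW(in image: UIImage, completion: @escaping (Double?) -> Void) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: NSFW(configuration: MLModelConfiguration()).model) else {
        completion(nil)
        return
    }

    let request = VNCoreMLRequest(model: model) { request, _ in
        // Report the confidence of the "NSFW" classification, if present.
        let observations = request.results as? [VNClassificationObservation]
        let confidence = observations?.first(where: { $0.identifier == "NSFW" })?.confidence
        completion(confidence.map { Double($0) })
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}

You can then apply the same confidence threshold logic shown above to the returned value.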
Feedback
If you recognize issues with certain kinds of pictures, feel free to reach out via Mail or Twitter.
Author
Michael Berg, [email protected]
License
NSFWDetector is available under the BSD license. See the LICENSE file for more info.