
Use Machine Learning for detecting porn images

Today at the Macoun conference, I attended a session about CoreML and building a nudity scanner. It included a pointer to the following CoreML model for detecting nudity in pictures. That's great if you receive pictures from users and want to know whether any of them are porn.

You can download the model here: Yahoo's Open NSFW detector with Core ML, published as part of an open source project. In the download, please locate the OpenNSFW.mlmodel file and put its path into our CoreML.fmp12 example FileMaker database or the CoreML Xojo project that comes with the MBS Plugins.
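
If you want to try the model directly in Swift, a minimal sketch for loading it could look like the following. Core ML expects a compiled model, so we compile the .mlmodel file at runtime first; the file path is just an assumption, point it to wherever you saved the download.

import CoreML
import Foundation

// Hypothetical path: adjust to where you placed the downloaded file.
let modelURL = URL(fileURLWithPath: "/path/to/OpenNSFW.mlmodel")

do {
    // Core ML runs compiled models (.mlmodelc); compileModel(at:)
    // produces one in a temporary location.
    let compiledURL = try MLModel.compileModel(at: modelURL)
    let model = try MLModel(contentsOf: compiledURL)
    print("Model loaded: \(model.modelDescription)")
} catch {
    print("Failed to load model: \(error)")
}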

When you run this model with a porn image, you may get an output like this:

{
  "classLabel" : "NSFW",
  "prob" : {
    "SFW" : 0.34505009651184082,
    "NSFW" : 0.65494996309280396
  }
}

For a normal picture of something else, you may get this output:

{
  "classLabel" : "SFW",
  "prob" : {
    "SFW" : 0.97758579254150391,
    "NSFW" : 0.02241421677172184
  }
}

As you can see, this model has only two classes to categorize images into: SFW (safe for work) and NSFW (not safe for work). We get a probability for each class, and classLabel names the most likely one.
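
If you build this in plain Swift yourself, a sketch using the Vision framework could look like this; classify is a made-up helper name, and it assumes the MLModel loaded earlier:

import CoreML
import Vision
import Foundation

// Runs the classifier on an image file and collects the per-class
// probabilities, e.g. ["SFW": 0.97, "NSFW": 0.03].
func classify(imageURL: URL, model: MLModel) throws -> [String: Float] {
    let vnModel = try VNCoreMLModel(for: model)
    var probabilities: [String: Float] = [:]
    let request = VNCoreMLRequest(model: vnModel) { request, _ in
        guard let results = request.results as? [VNClassificationObservation] else { return }
        for observation in results {
            // identifier is the class name, confidence its probability.
            probabilities[observation.identifier] = observation.confidence
        }
    }
    // perform() is synchronous, so the handler runs before we return.
    try VNImageRequestHandler(url: imageURL, options: [:]).perform([request])
    return probabilities
}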

This can be used in various projects in both FileMaker and Xojo. Whatever images users post to your social platform, add to your image database or upload to your CMS, you can pre-check them. Whether you only warn the user, mark the images as porn or decline them is your choice.
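
As a sketch of such a pre-check, you could map the NSFW probability to a decision. The thresholds below are arbitrary examples for illustration, not recommendations; tune them to your tolerance for false positives:

enum ModerationResult {
    case accept   // looks safe, let it through
    case warn     // borderline, flag for human review
    case decline  // very likely porn, reject
}

// Assumes the probabilities dictionary from the classify sketch above.
func moderate(probabilities: [String: Float]) -> ModerationResult {
    let nsfw = probabilities["NSFW"] ?? 0
    switch nsfw {
    case ..<0.5:
        return .accept
    case ..<0.9:
        return .warn
    default:
        return .decline
    }
}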

We are looking forward to the improvements in macOS 10.14 Mojave and iOS 12. It looks like CreateML will let you create new models really easily.
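
For example, training a two-class image classifier with CreateML on macOS 10.14 could be as short as this sketch. The folder layout (one subfolder per class, e.g. SFW and NSFW, filled with training images) and the paths are assumptions:

import CreateML
import Foundation

// Expects /path/to/training-images to contain one subfolder per class.
let trainingDir = URL(fileURLWithPath: "/path/to/training-images")
let classifier = try MLImageClassifier(trainingData: .labeledDirectories(at: trainingDir))

// Save the trained model so Core ML can use it later.
try classifier.write(to: URL(fileURLWithPath: "/path/to/MyClassifier.mlmodel"))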

For other models, please check the awesome list of models for Core ML here: github.com/likedan/Awesome-CoreML-Models. There are some really cool models, like predicting the location where a picture was taken, recognizing what type of food is in a picture, or classifying gender from names.
16 09 18 - 13:20