With TweetDimmer™ Premium, you get the power of computer vision and advanced machine learning models to help you see only the images you want to see. Every image is scanned and given a set of scores. The scoring is broken into four parts (a quick sketch of how the scores could be used follows the list):

  • Adult
  • Spoof
  • Medical
  • Violence
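
To make this concrete, here is a minimal sketch, in Python, of how per-image scores along these four axes could be represented and turned into a dim-or-show decision. The likelihood scale, the ImageScores shape, and the should_dim function are illustrative assumptions, not TweetDimmer's actual internals:

    from dataclasses import dataclass

    # Hypothetical likelihood buckets, ordered from least to most likely.
    LIKELIHOODS = ["VERY_UNLIKELY", "UNLIKELY", "POSSIBLE", "LIKELY", "VERY_LIKELY"]

    @dataclass
    class ImageScores:
        # One likelihood per category described in this section.
        adult: str
        spoof: str
        medical: str
        violence: str

    def should_dim(scores: ImageScores, threshold: str = "LIKELY") -> bool:
        # Dim the image if any category meets or exceeds the threshold.
        cutoff = LIKELIHOODS.index(threshold)
        return any(
            LIKELIHOODS.index(score) >= cutoff
            for score in (scores.adult, scores.spoof, scores.medical, scores.violence)
        )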

Adult

The adult content classifier scores each image based on the likelihood that it contains pornographic content, such as sexual intercourse, nudity, or adult content in cartoons (including hentai). This classifier is not trained to treat nudity in a medical, scientific, educational, or artistic context as pornographic. Images of people in swimsuits or lingerie are not considered pornographic, provided that what are commonly considered private parts are covered.

Spoof

The spoof classifier looks for memes (typically an image with bold text superimposed).

Medical

The medical classifier looks for images that contain surgeries, diseases, or body parts. Scoring is based on the likelihood of the image containing graphic depictions of open wounds, genital close-ups, or severe disease symptoms. An image that is clearly surgical in nature but contains none of the graphic depictions listed above would not be filtered.

Violence

The violence classifier looks for images that depict killing or shooting, or that otherwise contain blood and/or gore. Images of weaponry by itself, such as a photo of a lone gun, will not be classified as violent.
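
Reusing the hypothetical ImageScores and should_dim sketch from the top of this section, an image of weaponry alone should land low on the violence scale and stay visible, while a graphic image crosses the default threshold and gets dimmed (the scores below are made-up illustrations):

    weapon_only = ImageScores(adult="VERY_UNLIKELY", spoof="VERY_UNLIKELY",
                              medical="VERY_UNLIKELY", violence="UNLIKELY")
    graphic = ImageScores(adult="VERY_UNLIKELY", spoof="UNLIKELY",
                          medical="POSSIBLE", violence="VERY_LIKELY")

    print(should_dim(weapon_only))  # False: the image stays visible
    print(should_dim(graphic))      # True: the image is dimmed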