I tried a machine learning model to determine if a picture of a keyboard was SFW
it said no, because it detected the escape key as a nipple

I showed it a picture of me holding the laserdisc of Disney's The Santa Clause and it said it wasn't SFW, because of Tim Allen's breasts and my Penis Hand.

of COURSE I tried a thinkpad keyboard, and no, it didn't identify the TrackPoint as a nipple. I'm sorry.

I also tried the other way: giving it NSFW images.
I can't show those, but it did pretty abysmally. it spotted a penis in one, but an image of two people having sex shot from the side? marked SFW. a softcore BDSM shot? also SFW

I tested to see if it just wasn't programmed to detect clitorises (clitorii?), thinking maybe the thinkpad pointer is a clit and not a nipple?

NOPE! I tried turning the detection threshold way down and it managed to detect 27 overlapping pussies (their term, not mine) but zero clitorises.

So I guess it just can't detect clitorises. shame
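
(for the curious, here's roughly what the threshold fiddling looks like. this is just a sketch assuming a NudeNet-style detector — the nudenet package's NudeDetector, which returns one dict per detection with a class label, a confidence score, and a box. not necessarily my exact setup.)

```python
from collections import Counter

from nudenet import NudeDetector  # assumption: a NudeNet-style detector API

detector = NudeDetector()

def count_detections(image_path, threshold):
    # detect() returns a list of dicts shaped roughly like
    # {"class": "FEMALE_BREAST_EXPOSED", "score": 0.93, "box": [x, y, w, h]}
    detections = detector.detect(image_path)
    hits = [d for d in detections if d["score"] >= threshold]
    return Counter(d["class"] for d in hits)

# crank the threshold down and watch the overlapping false positives pile up
for threshold in (0.5, 0.2, 0.05, 0.01):
    print(threshold, count_detections("thinkpad.jpg", threshold))  # hypothetical image
```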

man, this thing loves to mark my thumb as a penis.

I tried running it on some VHS tape covers but instead of noticing the woman in a leotard on the cover, it told me it was NSFW because my thumb at the bottom of the image was DEFINITELY a penis

I tried a photo where I was wearing nitrile gloves while holding up a phone, and it didn't mark my thumb as a penis this time!

it did mark my blurry wife in the background as a sex toy, which is more offensive tbh

if I lower the confidence threshold enough it'll detect my pupils as nipples

I keep trying laserdisc covers and it keeps telling me NSFW and then points out the thumb in the corner as a penis

kinda want to run it on my entire photo library and see what it marks as NSFW.

I ran it on a photo of me holding a box of Rice-A-Roni, and it marked my hand TWICE as penises, and the "A" in Rice-A-Roni as an asshole

HAH. it failed on my USB penis
This is 100% safe, baby!

(it's a plastic USB flashdrive in the shape of a penis)

maybe it should be a game.
like it shows you pictures like this, and asks you if you can guess where the NSFW detector thinks the problem is

well, that's enough for me to confirm that I'm not really missing out on anything by not using this sort of nonsense

Someone on Bluesky asked me to test this image of the Kirby slippers.

Sadly it said SFW until I dropped the confidence threshold, but now it can see the penis, pussy, and nipple.

I'm now trying to feed the entire movie of Disney's The Santa Clause through it. Well, one frame every second.
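
(in case you want to play along at home: a sketch of the one-frame-a-second extraction using OpenCV, which I'm already using for the overlays. filenames are made up.)

```python
import os

import cv2  # opencv-python

os.makedirs("frames", exist_ok=True)

cap = cv2.VideoCapture("the_santa_clause.mkv")  # hypothetical filename
fps = cap.get(cv2.CAP_PROP_FPS)                 # ~23.976 for a film transfer
step = max(1, round(fps))                       # keep one frame per second of video

frame_index = saved = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if frame_index % step == 0:
        cv2.imwrite(f"frames/{saved:05d}.png", frame)
        saved += 1
    frame_index += 1
cap.release()
```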

So far I've got a lot of images like this:

there, that's better.
I mean, arguably.

okay I'll give you the tie.

looking through the preliminary results, it appears the only things it can detect with any reliability are Tim Allen's Big (Super)Naturals. Huh.

I might need to process the whole movie but with it set to label and mosaic anything it detects
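
(the mosaic part is the easy bit: shrink the detected region, then blow it back up with nearest-neighbor resampling so you get the classic chunky censor blocks. a Pillow sketch; the box format is an assumption.)

```python
from PIL import Image

def mosaic_region(img, box, block=16):
    # box is (left, top, right, bottom); block is the mosaic cell size in pixels
    left, top, right, bottom = box
    region = img.crop(box)
    # downscale, then upscale with NEAREST to get chunky mosaic blocks
    small = region.resize(
        (max(1, (right - left) // block), max(1, (bottom - top) // block)),
        Image.NEAREST,
    )
    img.paste(small.resize(region.size, Image.NEAREST), (left, top))
    return img

img = Image.open("frame_01234.png")      # hypothetical frame
mosaic_region(img, (100, 50, 220, 170))  # hypothetical detection box
img.save("frame_01234_censored.png")
```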

if I start running that now it'll maybe be done just in time for christmas

current processing speed is 1 frame per 1.5 seconds.
hopefully I can speed that up somewhat. I have 139872 frames to process, which at this rate works out to about 58 hours

cutting out the HTTP layers gets the speed up to about 0.6s/frame

and fixing my cuda install gets the whole thing to move to the GPU, and now we're down to 0.06s/frame, or just a little slower than realtime (about 17fps against the film's 24)
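
(the shape of that change, roughly: instead of POSTing every frame to a local web service, load the model in-process and pin it to the GPU. sketched here with the transformers pipeline API; the full model id is my guess at the truncated link further down the thread.)

```python
from transformers import pipeline

# device=0 pins the model to the first CUDA GPU; device=-1 would be CPU
classifier = pipeline(
    "image-classification",
    model="Falconsai/nsfw_image_detection",  # assumed full id of the model linked below
    device=0,
)

# one direct call per frame, no HTTP round trip
result = classifier("frames/00042.png")
print(result)  # e.g. [{"label": "nsfw", "score": ...}, {"label": "normal", "score": ...}]
```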

processed my one-frame-a-second version, and out of 5829 frames, 2537 of them contained NSFW content (at confidence threshold=0.01)

at the default confidence level of 0.2, I get 189 hits

I'm working on moving the annotation overlays from the existing opencv code to my own PIL(low)-based code, so I can better customize how it looks.
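
(the PIL version is basically ImageDraw with a rectangle and a text label per detection. a sketch; the detection format is assumed.)

```python
from PIL import ImageDraw, ImageFont

def annotate(img, detections):
    # detections assumed shaped like {"class": str, "score": float, "box": (l, t, r, b)}
    draw = ImageDraw.Draw(img)
    font = ImageFont.load_default()
    for d in detections:
        left, top, right, bottom = d["box"]
        draw.rectangle((left, top, right, bottom), outline="red", width=3)
        label = f'{d["class"]} {d["score"]:.2f}'
        draw.text((left, max(0, top - 12)), label, fill="red", font=font)
    return img
```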

then I want to make it automatically generate GIFs: play a couple seconds before the detection kicks in, then mosaic+label the censored bit, then show a couple seconds afterwards
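
(Pillow can write the GIF directly once the frames exist. a sketch of the before/censored/after assembly; the frame lists are assumed.)

```python
from PIL import Image

def make_gif(before_paths, censored_paths, after_paths, out_path, fps=10):
    paths = list(before_paths) + list(censored_paths) + list(after_paths)
    # GIF wants palettized frames; ADAPTIVE picks a 256-color palette per frame
    frames = [Image.open(p).convert("P", palette=Image.ADAPTIVE) for p in paths]
    frames[0].save(
        out_path,
        save_all=True,
        append_images=frames[1:],
        duration=int(1000 / fps),  # milliseconds per frame
        loop=0,                    # loop forever
    )
```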

Here's some progress: my first pass at annotating a video instead of an image.

I need to write some code to make it less flickery but this is definitely progress

I think I'm basically going to post-process the annotations.

if an annotation appears for only a few frames (where "few" is tunable, like 1-5?), it won't be shown
and if an annotation is on frame N-1 and frame N+1, but not frame N? interpolate it so it shows on frame N too. (rough sketch of this below)
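
(a sketch of that post-processing, treating each frame as a simple on/off flag per label — real code would also have to match boxes between frames, which this skips.)

```python
def smooth_timeline(present, min_run=3):
    # present: list of bools, one per frame, for a single annotation label
    frames = list(present)
    # step 1: fill single-frame holes (on at N-1 and N+1, off at N)
    for n in range(1, len(frames) - 1):
        if frames[n - 1] and frames[n + 1] and not frames[n]:
            frames[n] = True
    # step 2: drop blips shorter than min_run frames
    n = 0
    while n < len(frames):
        if frames[n]:
            start = n
            while n < len(frames) and frames[n]:
                n += 1
            if n - start < min_run:
                frames[start:n] = [False] * (n - start)
        else:
            n += 1
    return frames
```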

btw the model I'm using:
huggingface.co/Falconsai/nsfw_

has been downloaded 68 million times.
in the last month.

this thing is being heavily used.

the post processing is going to require a lot of refactoring because I didn't design it to work this way at all. yay

I specifically tried to program it to not constantly detect Tim Allen's Big (Super)Naturals and LOOK WHAT IT DOES ANYWAY

what's the reason for the season? I'm not sure, but I do know that SANTA CLAUS IS MADE OF FRACTAL PENISES

also some people have mumbled that this isn't a real test of NSFW detection models because I'm fiddling with the thresholds.
1. it's not supposed to be a real test. I'm trying to make an amusing GIF
2. I don't need to test these things: it's already obvious they don't work

@foone I think your GIFs are really educational. Governments want such models installed on our PCs and phones to alert the police, because think of the children. And now I can say: look at this, do you feel safer yet?
