Allow hiding certain (NSFW etc) images by default and letting users explicitly expand them
Open, Needs Triage · Public

Description

For educational purposes, Wikipedia contains images of genitalia and illustrations of sexual activities. However, not all readers are comfortable encountering such content. Therefore, I have an idea that will allow such content to remain on Wikipedia: blur the image and overlay a short warning with a display button, as Reddit does.

Reddit implementation (source: Google):


Proposed WP implementation (clicking anywhere on the image displays it):


How would images be marked?
Specific categories applied to such image pages would automatically mark the image as NSFW. Moreover, specific NSFW images could be marked via the page itself (e.g. with a template). There is also MediaWiki:Bad image list.
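As a rough illustration of the marking logic described above, here is a minimal sketch in plain JavaScript. The category names and the `badImageTitles` array are hypothetical placeholders (standing in for a parsed MediaWiki:Bad image list); this is not an actual MediaWiki API.

```javascript
// Hypothetical deny-list of categories that mark an image as hidden-by-default.
const HIDDEN_CATEGORIES = new Set([
  'Category:Hypothetical hidden category A',
  'Category:Hypothetical hidden category B'
]);

function shouldBlurByDefault(fileTitle, fileCategories, badImageTitles) {
  // Marked automatically via one of the configured categories?
  if (fileCategories.some(function (c) { return HIDDEN_CATEGORIES.has(c); })) {
    return true;
  }
  // Otherwise, marked explicitly, e.g. via something like the bad image list.
  return badImageTitles.indexOf(fileTitle) !== -1;
}
```

In a real gadget the category list and bad-image list would come from wiki configuration pages rather than being hard-coded.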


Opt-out of the warning via Preferences (for logged-in users); perhaps a checkbox for logged-out users?

Event Timeline

Batreeq triaged this task as High priority. Jul 1 2018, 2:53 AM
Batreeq updated the task description.
Batreeq awarded a token.
Batreeq renamed this task from NSFW Images to NSFW Images on WP. Jul 1 2018, 2:55 AM
Batreeq updated the task description.
Batreeq rescinded a token.

Comment by non-developer: For logged-in users, permanently disabling this feature should be as easy as clicking "show NSFW image" and then "do not censor images for me again".
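The two-click flow described above can be sketched as plain state logic, independent of any MediaWiki API; the function and method names here are hypothetical.

```javascript
// Each image starts hidden; clicking reveals that one image; choosing
// "do not censor images for me again" disables hiding globally.
function createImageFilter() {
  var revealed = {};
  var disabled = false;
  return {
    isHidden: function (imageId) {
      return !disabled && !revealed[imageId];
    },
    reveal: function (imageId) {      // first click: "show NSFW image"
      revealed[imageId] = true;
    },
    disableFiltering: function () {   // second click: opt out permanently
      disabled = true;
    }
  };
}
```

A real gadget would additionally persist the disabled flag in user preferences (or a cookie for logged-out users); this sketch only models the interaction.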

Note that on the English Wikipedia, this idea is neither new nor, so far, welcomed as a default: wikipedia:en:Help:Options_to_hide_an_image#Hide_all_images_until_click_to_view

The following script is licensed under Creative Commons Attribution-ShareAlike, just like these English Wikipedia pages. You may want to copy it and use it for your own purposes: https://en.wikipedia.org/wiki/User:Anomie/hide-images.js

Edit: You also need https://en.wikipedia.org/wiki/User:Anomie/hide-images.css

ToBeFree raised the priority of this task from High to Needs Triage. Jul 1 2018, 3:54 AM

Such a change, if it were global, would have to be discussed first; see the links at:
https://meta.wikimedia.org/wiki/Controversial_content

It was discussed before, in 2011, and for some reason was not developed further, so any renewed development would require a new poll (seven years is already quite long ago).

Note that a similar idea was discussed in 2011, see e.g. here and here.

This is not an existing Beta Feature, hence removing the tag. It sounds like something someone could write an extension for if they want such functionality on their wiki.

Aklapper renamed this task from NSFW Images on WP to Allow hiding certain (NSFW etc) images by default and letting users explicitly expand them. Jul 2 2018, 2:07 PM

I reviewed the pages; however, they were long and somewhat confusing. Can someone explain in a few sentences why this idea was discarded? Thank you!

Hi :) I will try. This is my opinion. This is not a neutral summary.

What is "controversial"? What is "bad"? Who decides what needs to be removed?

An image of the flying spaghetti monster is insulting to my religion! I want all images of flying spaghetti monsters removed from Wikipedia.
Images of nude people in articles about "Man" and "Woman" offend me! I do not want to see a man when opening the Man article!
I am a vegetarian. I do not want to see images of meat and ham. I do not want to see animal pictures if the animals are not happy! I do not want to see factory farming!
I hate vegetarians. I hate vegetables. When I see images of apples and bananas, I need to vomit. I do not want to see images of bananas and apples.
I am a 5 year old child. I do not want to see images about war. War traumatizes me. I do not want to see images/screenshots of games that are unsuitable for my age.

Who decides what gets censored?
Who decides what is unsuitable for work?
Why don't we delete these images instead of hiding them?
Wikipedia is not censored.
To change the behavior of Wikipedia, please discuss on Wikipedia. You have requested a controversial change that affects all users. If you want to see it implemented, please discuss with these users. Phabricator is not the place to do so. Phabricator should be used when the discussion is over.
If this is about the English Wikipedia, please discuss on the English Wikipedia.
Please note that the topic has already been discussed. You are not the first person who has requested this:

https://en.wikipedia.org/wiki/Wikipedia:Perennial_proposals#Censor_offensive_images

Yes, this is point 2 of a huge list, and it appears near the top. That may be for a reason. Please also have a look at the first point of the list.

This does not mean that this feature request is completely invalid. It only means that you will not be able to evade/override discussion on the English Wikipedia by opening a Phabricator ticket. :)

There is now an NSFW image classifier running on Cloud Services, and from my observations it has proven to be a very effective algorithm. It would need to be in production if we wanted to use it in an extension. That is tracked at T214201: Implement NSFW image classifier using Open NSFW, but unfortunately it has lost traction and is no longer on the roadmap. Perhaps the extension itself could include the classifier. The idea is to pre-store the scores; then, I suppose, there is a hook we could tie into to hide the relevant imagery on page load, as opposed to hiding it retroactively, which would be much too slow. Note the extension would also need to live on Commons, since that is where most images live, and I guess there could be a configuration variable or something to disable the auto-hide feature itself, since Commons specifically would not want it.
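A minimal sketch of the pre-stored-scores idea, assuming the scores are already available as a map from file name to a classifier probability. The 0.8 threshold and the data shape are assumptions for illustration, not the T214201 design.

```javascript
// Given pre-stored classifier scores, decide at render time which images
// to auto-hide; unscored images are left visible rather than blocked.
function selectImagesToHide(fileNames, scores, threshold) {
  if (threshold === undefined) {
    threshold = 0.8; // assumed cut-off; would realistically be a config variable
  }
  return fileNames.filter(function (name) {
    return scores[name] !== undefined && scores[name] >= threshold;
  });
}
```

Doing the selection from pre-stored scores keeps the page-load path cheap: no classification happens at render time, only a lookup.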

Note also there has been much debate on the name "NSFW". Some cultures do not consider the same images to be NSFW as Westerners do, for instance. Of course, each community will need to opt-in to the auto-hiding feature, and even then it would probably need to be a preference. As mentioned above, English Wikipedia in particular has continually rejected this idea.

I personally have no interest in the auto-hiding effort, specifically, since I know many/most communities won't accept it (understandably). I would however love to have the scores stored in a database so that AbuseFilter could make use of them, and along with various other heuristics we can put a stop to image vandalism. So if anyone wants to work on this proposed extension, let me know :)

I also prefer a checkbox for logged-out users.

One more NSFW filter: https://commons.wikimedia.org/wiki/MediaWiki:Gadget-NSFW.js. I generally like the idea of this filter, if NSFW is taken to mean Not Safe For Work.

Added several possible projects; remove them if they do not apply :)

It seems to me that this is also important for the Page-Previews project, because such images also appear during link hover.
F35151884 (NSFW)

> One more NSFW filter: https://commons.wikimedia.org/wiki/MediaWiki:Gadget-NSFW.js. I generally like the idea of this filter, if NSFW is taken to mean Not Safe For Work.

I agree this looks very interesting.

Unfortunately, it does not seem to work with v1.38.
At least, I added the gadget and enabled it, and nothing happens; not even a console.log('Starting up...') fires.

I am not proficient enough in MW/gadgets/JS to check why it won't work.

I find a particular problem with "I have an idea that will allow such content to remain on Wikipedia". It incorrectly implies that the content is not ALREADY allowed on Wikipedia, and that some sort of solution or permission is required for it to remain there. I believe every major Wikipedia has a NOTCENSORED policy. The Foundation tried pushing this idea in the 2011 Image filter referendum; it was abandoned due to strong opposition. There were serious discussions on the German Wikipedia about forking projects away from the Wikimedia Foundation if there were an attempt to impose content filtering, and I know the English Wikipedia takes this very seriously as well.

The outcome of the global discussion was essentially that an opt-in system could be acceptable; however, there isn't really any support for an opt-in system. Filter opponents don't want it, and filter advocates consider opt-in substantially worthless. Aside from the issue of opting in after the images have already been displayed, any opt-in system is poor for logged-out users: it's a hassle to use, it has to be cookie-based, and cookies have a bad habit of disappearing, causing the system to fail.
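The cookie fragility described above can be made concrete with a small sketch (the cookie name is hypothetical): the preference only lasts as long as the cookie does, and a missing cookie silently reverts the reader to the default.

```javascript
// Parse an opt-in preference out of a document.cookie-style string.
// When the cookie is absent (expired or cleared), we fall back to the
// default behaviour -- exactly the failure mode described above.
function readHideNsfwPreference(cookieString) {
  var match = /(?:^|;\s*)hideNsfwImages=([^;]*)/.exec(cookieString || '');
  return match ? match[1] === '1' : false;
}
```

Nothing distinguishes "user never opted in" from "cookie was lost", which is why a cookie-based opt-in fails quietly for logged-out users.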

Anyone is welcome to open a new community discussion on this subject, or even another global discussion, although I don't expect you'll get a favorable response. Until then, I suggest this idea be closed/rejected.

There is an experimental extension, MediaSpoiler, that can hide certain images/videos.