Talk:Image filter referendum/en

This is an archived version of this page, as edited by Michaeldsuarez (talk | contribs) at 14:03, 16 August 2011 (What does the referendum mean?: Fixing.). It may differ significantly from the current version.


Will the outcome be binding?

Does the wording mean that the outcome of this vote will be binding? Or is it just an exercise to gauge community opinion before the Board makes a call? If it's the latter, I suggest the word "plebiscite" instead, or some other wording that makes it clear that this will not provide a binding directive to the WMF. Craig Franklin 21:55, 29 June 2011 (UTC).Reply

I'm puzzled as to why this doesn't count as a standard 'Request for Comment'... Mike Peel 21:56, 29 June 2011 (UTC)Reply
Craig: the idea of a direct referendum is interesting; we haven't formalized the concept in the past with Wikimedia-wide decisions like the relicensing process. Unless someone makes a case for a formal referendum process that is approved as a standing part of WMF governance, no outcome will be legally binding on the WMF without a Board resolution. However, any group within the WMF can set up a community referendum and commit to being bound by its outcome... which I believe is what's going on here.
Certainly the discussion and revision process of RfCs, rather than a focus on voting, is desirable. Discussion of concerns and suggestions raised, and the outreach involved in responding to them, will be as valuable as the vote itself. I don't know how specific the referendum text will be, but I hope that the first outcome of holding one is feedback that gets incorporated into the proposed implementation. SJ talk | translate   04:02, 1 July 2011 (UTC)Reply
I'm a little confused - for whom will the outcome be binding? aleichem 01:50, 24 July 2011 (UTC)Reply

Why do we need to vote?

I'm just wondering why a vote is needed here. It seems like such a no-brainer to enable this sort of system; are there significant concerns that have been raised by others somewhere? fetchcomms 03:11, 30 June 2011 (UTC)Reply

I agree; considering it is a personal choice anyways, where's the possible damage? I doubt that anyone will come complaining over the fact that they could activate this for themselves. Ajraddatz (Talk) 03:19, 30 June 2011 (UTC)Reply
I am amazed that such a system is even being considered, let alone brought to a vote, as though there was any chance of this gaining consensus. Such a system is a blatant annihilation of Wikimedia's neutrality, placing preference towards the viewpoints of certain cultures. It is impossible for such a filter to be anywhere close to neutral, short of having commons:Category:Topics as the start point for settings. My guess is that any potential filter the foundation has in mind will satisfy the opinions of a small group of people, mostly Americans and Europeans, and ignore everyone else. --Yair rand 03:29, 30 June 2011 (UTC)Reply
I'm also thinking I'll be against this proposal, but how about we wait and see what's actually being proposed before we start engaging in any wild speculation? Craig Franklin 08:26, 30 June 2011 (UTC).Reply
Well, the comments above show why we need a discussion/vote to bring such a system into the projects. --Bencmq 08:34, 30 June 2011 (UTC)Reply
There will definitely be opposition to this proposal, myself included. I wouldn't be so quick to call the result just yet. Blurpeace 09:13, 30 June 2011 (UTC)Reply
I'm not quite sure how it affects Wikimedia's neutrality. It says right on the main page for this, "allow readers to voluntarily screen particular types of images strictly for their own account." From that, I presume anonymous readers would still get the same content they do now, and if you don't want to filter anything (or filter everything), you would be free to do so with your account. I believe this is about giving people the tools to filter the content they don't want, rather than deciding en masse that certain content should be hidden/shown. ^demon 17:24, 30 June 2011 (UTC)Reply
"giving people the tools to filter" - it's not about what you do on the receiving end. It's about how "they" tag (or not tag) files at source. There are 10 million files on commons. There are no human resources to assess them (have you seen the backlogs?). But there are well-known problem areas: I'm afraid that added censorship rights will become a POV-magnet for determined POV-warriors. Welcome to filter-tag wars. Larger wikipedias have some experience of handling conflict areas, commons is totally unprepared. Let's hope that the new widget will be as helpless and useless as their "upload wizard". But if, indeed, they roll out something working, prepare for the worse. NVO 18:04, 30 June 2011 (UTC)Reply
People will not be free to filter what they want; such a system would be pretty much impossible to make. They will be free to filter a fixed set of image groups that the WMF has received the most complaints about from our current English readership (well, actually, probably not the most complaints: the Mohammed images have received loads, but I doubt the filter is going to include an option for filtering those, simply because it would make Wikimedia look like the non-neutral group it will have become). There are endless numbers of groups that have their own filtering needs which will obviously not be satisfied by this system. --Yair rand 22:16, 30 June 2011 (UTC)Reply
Why would this be impossible to make? I don't see why everyone shouldn't be able to choose an arbitrary set of [current] Commons categories that they would prefer not to see. Would you be opposed to such a system? SJ talk | translate   03:29, 1 July 2011 (UTC)Reply
If it really gave no preference towards certain cultures (making it just as easy to hide images in Category:Tiananmen Square protests of 1989 or Category:Asparagus as images from any other category, perhaps having selection begin from Category:Topics as mentioned above), then it would not be a problem with NPOV, in my opinion. --Yair rand 04:36, 1 July 2011 (UTC)Reply
It would be better as you suggest, but we would still have the problem that we don't agree on categorization. Nemo 10:42, 1 July 2011 (UTC)Reply
Yair: thanks for clarifying. Nemo: true, but we already have that problem with applying categories. [re]categorizing media would become more popular as a result, but that might be a good thing just as it is with article text. SJ talk | translate   03:05, 3 July 2011 (UTC)Reply
I find myself agreeing with the opposition on this issue. Regardless of who decides what is to be subject to censorship, and regardless of the fact that it is opt-in, it still puts the Wikimedia Foundation in a non-neutral position. Taking an example from my own area, what if we deemed images of the Tiananmen Square protests of 1989 or images of leaders of a certain religious group banned by the Chinese government to be "controversial", and therefore added them as an option on the filter? We would rightly be labeled as non-neutral for supporting the Chinese government's view of what is and is not controversial. I don't want us ever to be in a position like that. We are a neutral provider of knowledge. Individuals already have the ultimate filter built in: they can simply not look at what they don't like.--Danaman5 23:44, 30 June 2011 (UTC)Reply
Things are controversial if some people think they are. So yes, we should add a tag to those images if some Chinese think they are offensive. Anyone should be able to come in and declare an image offensive. Only then would it be neutral. Of course, a lot of images would be deemed controversial that way. Zanaq 09:38, 1 July 2011 (UTC)Reply
Agree with Zanaq. If someone thinks a picture of a naked person is controversial, that person should be allowed to add a filter, a symbol or whatever s/he likes to change the picture or the way it's linked. You want to give work to some filter committee so they have no time to improve Commons, Wikipedia or whatever project? Not a brilliant idea, it seems to me. Cheers, 79.53.129.80 11:28, 1 July 2011 (UTC)Reply
Hello, may I add a thought on it?

The thing is, any encyclopedia that hosts pornographic images is misguided and betrays trust in the first place; otherwise it would be merely an adult site, not a genuine encyclopedia of merit. Wikipedia requires morality regardless of the wide spectrum of user opinions or other filters. Users who post anything biased should expect to have their item debated in the sandbox. Regarding China: a country should be allowed to choose what not to see if an individual can. The people there did not get to vote for any other party, so that is their choice. Ready-made choices could be made for the whole of WP and its images, with awareness of nations whose governments need an option to block things. Where WP is obviously not allowed and would be banned because of its open content, WP ought to offer a strong way to securely select a PG, China-friendly, or other government-style filter, set in stone and compulsorily mandated in the WP architecture as a ___location-based filter, including for anonymous users. Otherwise there are whole countries of people who get none of the WP articles because of an all-or-nothing preponderance. A one-size-fits-all WP policy means WP may deny people the right to see what is allowable in the place they live; simply because of inadequate filters, people must wait. Alongside supplementary personal filters there could even be a Saintly or an Atheist filter, although those might remove most of the articles on human history. A China-tickable filter or a Vegan-friendly locked-down WP-made filter could be popular selections and bring new users. Basically this might allow people to create a gallery of custom filters, rather like Mozilla add-ons for Firefox. There is also the possibility of improving WP article usage by widgetising several search terms, for example, so a search follows the right learnt link rather than fanning out everywhere. Marsupiens 06:12, 16 August 2011 (UTC)Reply

This has been talked about for years. The idea that has emerged is that there would be multiple, separate filters, so you could pick and choose. The most commonly requested ones are for pornographic images (requested by basically everyone in the world except white males), religiously sensitive images (notably but not exclusively images of the Prophet Mohammed), and violent images (e.g., the picture of a man in the act of committing suicide).
I suspect that most of us would not choose to have any of them turned on, but the opposition's story to this point has generally been that if people don't want to see these images, then they should write their own third-party software (maybe a browser plug-in) because WMF should never-ever-ever offer options even remotely like Google's "safe search". In this story, enabling non-adepts to voluntarily customize their experience (e.g., by permitting a person with PTSD to suppress images that might trigger symptoms) is pure, evil censorship. WhatamIdoing 15:44, 1 July 2011 (UTC)Reply
I'm a white male but would happily support allowing people to opt in to various filters, including grades of porn. But we need the classification to be transparent and the filtering to be genuinely a matter of individual choice - otherwise we would be accused of bias. Some people would define porn very narrowly, others would include bare arms, legs or even female faces. I suspect it would be impractical to create a filter for every individual request, though we could go a long way towards that if any category could be filtered out by any editor. I know someone who would opt to filter out images that contained spiders. I think we should also allow groups of Wikimedians to create custom filters that people could choose, as opposed to going through a ream of options. So presumably a moderate Shia filter would set the porn threshold at bathing suit, ban the cartoons of Mohammed, and perhaps images of pigs and pork? Whilst a "western cosmopolitan child-friendly filter" would enable you to filter out graphic violence and set the porn filter at bathing trunks. The difficult questions we have to resolve before the referendum include:
  1. What do we do if particular governments say they would be willing to unblock Wikimedia if we allow them to filter out what they consider to be porn or religiously offensive? (I'm assuming that we won't cooperate with countries that want to filter out images that might be politically controversial.)
    Suggestion: say No, just as we do today.
  2. What happens in discussions on Wikipedia if some people want to illustrate an article with images that others can't see because they have opted to filter them out? My preference is that we enable an "alternate image for those who've filtered out the main image".
    I believe the current proposal is to have the image visibly hidden, the way sections are rolled up; so it is clear that an image is there but not being shown, in case the reader wants to see it.
    If it is planned to work that way, I wonder how it will work with anything besides the mere use of images for illustrating articles. In the Portuguese wikipedia we use a painting of a naked woman as the symbol for our Art portal, embedded in a template and displayed at every artwork article. I'm sure the more puritan would like it to be hidden from view as well.--- Darwin Ahoy! 12:16, 3 July 2011 (UTC)Reply
  3. What do we do when people disagree as to whether a particular image shows enough breast to count as topless whilst others point out that there appears to be a diaphanous top? Do we give filters the right to fork?
    This is already an issue that comes up in the categorization of images. Policies tend to be vague at present; if anyone can create their own filters then you could define a combination of descriptive categories, each of them relatively neutral, that fit what you don't want to see. (There could still be disagreements about how many different similar categories are appropriate for Commons. For instance, different people with w:ailurophobia will draw different lines around what counts as 'a cat', from the perspective of their reflexive fear.)
  4. Do we use this as an opportunity for outreach to various communities, and if so are we prepared to set boundaries on what we will allow filters to be set on? For example, I don't see that a porn filter is incompatible with being an educational charity, but I would be horrified if a bunch of creationists tried to block images of dinosaurs or Neanderthals, especially if they did so as part of a "Christian" filter (I suppose I could accept individuals knowingly blocking categories such as dinosaur). WereSpielChequers 09:54, 3 July 2011 (UTC)Reply
    I can't think of a neutral way to set such boundaries. Personally, I would be fine with any arbitrary personal choices people make in what images they want to see -- dislike SVGs? fine! -- just as in what pages they watchlist. I see this as a tool to support friendlier browsing, not an occasion to set new official boundaries. SJ talk | translate   11:42, 3 July 2011 (UTC)Reply
    The reason we need to vote is because many people, myself included, are against any violation of the basic core principle of NOT CENSORED. There are three legitimate ways to avoid images: 1/ turn them off in one's browser 2/ fork the encyclopedia to a version that omits NOT CENSORED 3/ devise an entirely external feature making use of our ordinary metadata. What is not acceptable is for the WMF to interfere with the content in this manner. The fundamental reason for not censored is the general commitment to freedom of information as an absolute good in itself--freedom means not just nonremoval of information, but also not hindering it -- the practical reason is because no two people will agree on what should be censored. DGG 13:53, 16 August 2011 (UTC)Reply

What is the question being asked?

In referendums, there is normally a single yes/no answer for those being polled to answer. The announcement doesn't give any hints as to the phrasing of this question - is there a draft available, and/or will the draft question be posted before the poll for analysis/comments/discussion? Or will there be multiple questions or multiple answers so that a diverse range of voter viewpoints can be sampled? Thanks. Mike Peel 20:51, 4 July 2011 (UTC)Reply

I also believe that there should be a suggested implementation (or set of implementations) set out prior to this referendum, given the range of different possibilities available for such a filter. Is this the plan, or will the referendum be on the general issue rather than the specific implementation? Mike Peel 20:59, 4 July 2011 (UTC)Reply
The referendum text will be published in advance. It will likely be a yes/no question (perhaps with an option for "I don't feel that I have enough information", as with the licensing migration). The referendum is on the general concept, as much as is humanly possible. Philippe (WMF) 01:25, 5 July 2011 (UTC)Reply
I appreciate the idea that we are allowed to see the referendum text before the actual casting of votes. However, in my humble opinion, we should also see the text - in its final form, or (if N/A) as a draft - early enough to be able to ponder and discuss it. Therefore, I hope that "In advance" means "Asap", and, more particularly, at least four weeks before the start of the referendum.
Among others, it may take some time to translate it to the main languages; and, when it comes to finer nuances, most non-native speakers of English would benefit by having access to a good translation. JoergenB 19:21, 5 July 2011 (UTC)Reply

Alternative ideas

low-res thumbnails

I've commented on this before, but will repeat: I think it would be more generally beneficial to allow users a setting to override page settings about the size of thumbnails, so that, for example, you could decide for all thumbnails to be shown at 30-pixel resolution (also for all images to be shown as thumbnails) regardless of the Wiki code. This would help low-bandwidth users as well as those with specific objections. My hope is that at some low resolution - 20 pixels if need be - there is simply no picture that will be viewed as intensely objectionable. I wish your referendum would investigate in this direction rather than pressing for people to "neutrally" place ideological ratings on specific images. Wnt (talk) 23:56, 30 June 2011 (UTC)Reply

This would be an interesting low-bw option, but seems like a separate issue. If an image is clear enough to be better than no image (the ultimate low-bw option, after all), it is clear enough to be controversial. 'Placing ideological ratings on images' would be an unfortunate outcome. SJ talk | translate   03:29, 1 July 2011 (UTC)Reply
If we really need to take into consideration the wishes of the more squeamish readers/editors, this seems to me the only viable option. I can see quite a few advantages:
  • It would avoid all the work to set up the tagging and the software to implement it. Energies that could be better invested elsewhere.
  • It would avoid all the editing wars that I already see ahead, and the consequent loss of time and energy.
  • It would be an advantage for people accessing the wikis with slow connections/old machines.
As the only (tiny) drawback, the articles would not look as sleek and appealing as now. It seems to me a win-win solution. --Dia^ 07:48, 16 August 2011 (UTC)Reply
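For illustration, here is a rough sketch of the "tiny thumbnails" idea as a purely client-side user script: it rewrites MediaWiki thumbnail URLs so the server returns a much smaller rendering. The 20-pixel width and the URL handling are assumptions made for this sketch, not part of any proposal.

```typescript
// Sketch of a user script that forces every MediaWiki thumbnail down to a very low
// resolution. The 20px width and the thumbnail URL pattern are assumptions for illustration.
const MAX_WIDTH_PX = 20;

function shrinkThumbnails(doc: Document): void {
  doc.querySelectorAll<HTMLImageElement>("img").forEach((img) => {
    // MediaWiki thumbnails look like .../thumb/a/ab/Foo.jpg/220px-Foo.jpg;
    // rewriting the "220px" segment asks the server for a smaller rendering.
    const rewritten = img.src.replace(/\/(\d+)px-([^/]+)$/, `/${MAX_WIDTH_PX}px-$2`);
    if (rewritten !== img.src) {
      img.src = rewritten;
      img.removeAttribute("srcset"); // keep the browser from picking a larger variant
      img.style.maxWidth = `${MAX_WIDTH_PX}px`;
      img.style.height = "auto";
    }
  });
}

shrinkThumbnails(document);
```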

(assisted) browser plug-in

I agree with user Koektrommel: "If people want to filter, fine, do it on your own PC." (somewhere below in this discussion), but also see user WereSpielChequers' statement: "As for the idea of people filtering on their own PCs, what proportion of our readers are technically capable of doing that, 10%? 1%? We need a solution that everyone can use, not just the technoscenti."
Why not go an intermediate way: make it easy for the user/browser plug-in to filter, but keep it local on the user's PC? If every picture sends its category/tags with it, a local browser plug-in can filter easily. How complicated this plug-in (or a competing one) is to use is a problem for the plug-in developers (and for the user, who has to choose an easy-to-use one). (A reader unable to install a plug-in should perhaps think about taking a lesson on using the internet...)
Tagging and categorising of pictures remains a task of the community; but perhaps it could be less 'absolute', e.g. relative values like: image-showing-almost-naked-woman: "67% for ~yes, she's close to naked~ / 33% for ~normal light clothes~" (123 users rated this picture)
--129.247.247.239 08:57, 16 August 2011 (UTC)Reply
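As a purely illustrative sketch of what such a plug-in could look like (the block list, the way file titles are recovered from the page, and the use of the public Commons API are all assumptions, not an agreed design):

```typescript
// Rough sketch of the "filter locally in the browser" idea: a plug-in/user script that
// looks up each image's Commons categories via the public MediaWiki API and hides it if
// any category is on the reader's personal block list. All names here are hypothetical.
const blockedCategories = new Set(["Category:Nudity", "Category:Violence"]); // user-chosen

async function categoriesOf(fileTitle: string): Promise<string[]> {
  const url =
    "https://commons.wikimedia.org/w/api.php?action=query&prop=categories" +
    "&cllimit=max&format=json&origin=*&titles=" + encodeURIComponent(fileTitle);
  const data = await (await fetch(url)).json();
  const page: any = Object.values(data?.query?.pages ?? {})[0];
  return (page?.categories ?? []).map((c: { title: string }) => c.title);
}

async function filterImages(): Promise<void> {
  // Thumbnails on wiki pages are wrapped in links of the form /wiki/File:Foo.jpg
  const links = document.querySelectorAll<HTMLAnchorElement>('a[href*="/wiki/File:"]');
  for (const link of links) {
    const title = decodeURIComponent(link.href.split("/wiki/")[1]);
    const cats = await categoriesOf(title);
    if (cats.some((c) => blockedCategories.has(c))) {
      link.querySelectorAll("img").forEach((img) => (img.style.visibility = "hidden"));
    }
  }
}

filterImages();
```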

All images on or off

Any kind of filtering system seems to me a waste of project resources and a morass. Instead, why not simply give readers the ability to turn all images on or off by default? And perhaps the ability to turn any given image back on, based on the reader's own evaluation of the caption and the article's subject. No tagging needed. No discussions required. Much easier to implement. Culturally neutral. Simple. Barte 11:21, 16 August 2011 (UTC)Reply

That is already included in every browser's settings! In Firefox it is Tools > Preferences > Content > remove the check mark from "Load images automatically". Finished! So an easy solution is already there. --Dia^ 13:02, 16 August 2011 (UTC)Reply

User option to display immediately

Is it intended there will be a user option to bypass this feature and display all images immediately always? DGG 13:45, 16 August 2011 (UTC)Reply

Committee membership

I'm curious to know the answers to the following questions, if answers are available: How has the membership of this committee been decided? Who has it been decided by (presumably by some group at the Foundation - I'm guessing either the board, or the executive)? What steps are being taken to ensure that the committee is representative of the broad range and diversity of Wiki[p/m]edia's editors and users? Thanks. Mike Peel 22:36, 30 June 2011 (UTC)Reply

As an add-on comment to my last: I see that there are 2 US, 2 Canadian, 1 UK and 1 Iranian on the committee at present - which is very much skewed to English and the western world. E.g. there's no-one who speaks Japanese, Chinese or Russian, and no-one who is located in the southern hemisphere. If this is a conscious choice, it would be good to know the reason behind it. Thanks. Mike Peel 07:59, 1 July 2011 (UTC)Reply
In fact this looks pretty much an en.wiki thing. Nemo 10:39, 1 July 2011 (UTC)Reply
Why does it matter?
The committee's job is to let you have your say about whether WMF ought to create a software feature that offers individual users the ability to screen out material they personally don't choose to look at.
The committee's job is not to decide whether Foroa (below) has to look at pictures of abused animals, or which pictures constitute abused animals, or anything like that.
Unless you think this group of people will do a bad job of letting you express your opinion on the creation of the software feature, then the composition is pretty much irrelevant. WhatamIdoing 20:28, 1 July 2011 (UTC)Reply
My main worries are the accessibility of committee members from a linguistic point of view (i.e. if non-en/fa/az/es speakers have questions about the process), and the risk of perception of this vote as a US-centric stance (which could lead to voting biases, e.g. "it's already been decided so why should we bother voting", or "let's all vote against the American world-view!"). Mike Peel 21:03, 1 July 2011 (UTC)Reply
The membership of the committee was largely put together by me, but informed by many other people. It's also not totally done yet, so if someone has interest, I'd love to talk to them. I've added a couple of Germans, and you're right that the southern hemisphere is underrepresented. You are, of course, correct in everything you say there - but we're going to try out the infrastructure that's being built for multi-lingualism by the fundraising team and hope that helps to cover some of the gaps. We will be calling heavily on the translators, as always. Philippe (WMF) 21:36, 1 July 2011 (UTC)Reply
Geez, let's put in a token German and a token Iranian. The whole "committee" is a joke. It consists mainly of people of Anglo-Saxon heritage who will decide for the rest of the world what they can see. My oh my. In the last 10 years living outside of my own country I have started to hate native English speakers. They are arrogant and think that simply because they speak English they are more than the rest of the world. And now we get a committee of mostly North Americans deciding for the other 4.5 billion people in the world .... great. Just fucking great. Waerth 02:23, 2 July 2011 (UTC)Reply
I don't think we're doing a "token" anything. There was a concerted attempt to bring in people of varying backgrounds who also brought skills in helping with the election. I find it unsurprising that some of the people who came to our attention were Germans, given the size of the German language Wikipedia. It's disrespectful to imply that they are tokens: they're not. They're intelligent people who bring a great deal of value. The same with Mardentanha, who - in addition to being Iranian - has been a member of the Board of Trustees steering committee and thus brings practical knowledge to the table. Philippe (WMF) 11:46, 3 July 2011 (UTC)Reply
Waerth, I don't think you understand the proposal. The committee has one job. The committee's job is "Find out whether people want a user preference so that they can decide what they see on their computer".
The committee does not "decide for the rest of the world what they can see". They may decide whether we will add one more button to WMF pages, but you would be the only person who decides whether to click that button on your computer. I would be the only person who could decide whether to click that button on my computer. The committee does not decide "for the rest of the whole world what they can see". WhatamIdoing 19:06, 5 July 2011 (UTC)Reply

As a suggestion: try including an invitation to join the committee within the announcement itself - and particularly the translated versions of it. Saying "if someone has interest, I'd love to talk to them" is fantastic - but not particularly useful for diversifying the languages spoken by the community if it's only said in English. It's also rather hidden if it's just said on this page... Mike Peel 20:26, 4 July 2011 (UTC)Reply

Cost

Will the information provided include also an estimate of the costs the WMF will need to face to implement the filter? Nemo 07:35, 2 July 2011 (UTC)Reply

A fair question, since one of the tradeoffs may be between a quick inflexible solution that helps some users but frustrates others, vs. a better solution that takes more energy to develop. I'd like to see such an estimate myself (of time involved, if that's easier than cost). SJ talk | translate   11:42, 3 July 2011 (UTC)Reply
I'll see if we can get a scope for that, yes. Philippe (WMF) 00:36, 4 July 2011 (UTC)Reply
Correct me if I'm wrong, but I thought there was a US law that limits publication of prices for services. --95.115.180.18 22:11, 1 August 2011 (UTC)Reply
No, there's not. It would be a violation of the business' free speech rights. The closest thing we have to that is that the (private) publishers of the phone books usually refuse to print prices, because they're worried about fraud/false advertising. WhatamIdoing 22:43, 14 August 2011 (UTC)Reply

Concerns

Slippery slope danger

This is how all censorship starts... slowly. First it's opt-in, then opt-out, and then it will be mandatory on request of foreign governments / organizations, since the mechanism is already in place. Very dangerous. If people want to filter, fine, do it on your own PC. Don't give organizations / governments the opportunity to easily implement a blacklist for their people. Koektrommel 07:58, 3 July 2011 (UTC)Reply

That's a huge extrapolation based on a very preliminary step - and not one that I think is justified. Given that it would already be technically feasible (and easy) for foreign governments to apply such censorship if they wanted (e.g. see the Great Firewall of China or the Internet Watch Foundation), as well as the finances for technical development available to governments / organisations in favour of censorship (which are several orders of magnitude larger than those available to Wikimedia), I seriously doubt that this would make it significantly easier for them to implement... Mike Peel 20:22, 4 July 2011 (UTC)Reply
I don't think that it is within Wikipedia's goals to provide a tool that makes the job of censors easier. Ipsign 06:57, 16 August 2011 (UTC)Reply

Esto no es una herramienta que aumente la libertad, sino que la recorta. De aprobarse será la propia wikipedia, que no censura contenidos ni temáticas, la que tristemente proporcione el instrumento de censura a los censuradores. Lamentable. Wikisilki 19:01, 5 July 2011 (UTC)Reply

Translation: "This is not a tool to increase freedom, but to limit it. If approved, it will be Wikipedia, which does not censor content or themes, which sadly provides a tool for censoring to the censors. Lamentable." —translated by Sj.

I concur. Implementing this feature would clearly be the first step towards censorship on Wikipedia. The next step along this way will be to block whole pages (for example, by adding a category 'sexuality'), which will allow schools to inject a cookie with 'Display-Settings: sexuality=denied' and block such content for all their students. While that might be the intention, it won't stop there: the next step will be to create a category 'Display-Settings: opposed-by-chinese-government=denied' and mark Tiananmen Square protests of 1989 with it, which will effectively allow the Chinese government to simply inject the appropriate cookie at the firewall and block content which they don't want Chinese people to see. It is also *very important* to understand that limiting the feature only to 'logged-in' users will *not* fix this problem (regardless of implementation, it will be *very easy* to enforce at the firewall level that all users from China are always logged in as the very same user). Ipsign 06:57, 16 August 2011 (UTC)Reply

It sets a precedent that I really don't want to see on Wikipedia. --Dia^ 07:54, 16 August 2011 (UTC)Reply

I don't want that tool; it's the key to introducing censorship. If a certain image cannot be deleted, it will be added to all possible categories to prevent its presentation as far as possible. If someone has problems with uncensored images, he/she should read only their own publications.

Abuses by censors: self-censorship seems to open the door to 3rd-party censorship

From other comments on this page, there seems to be severe confusion about the intention of the feature and about its potential (and IMHO very likely) abuses. One common argument for this feature is that it is self-censorship, which (unlike 3rd-party censorship) looks like a good thing. Unfortunately, there are severe implications of the available technical solutions. It seems that this feature will almost inevitably be based on so-called cookies. It will essentially allow any man-in-the-middle who sits between the end-user and the Wikipedia servers (usually the ISP, which can be controlled by a government / private company with a certain agenda / ...) to pretend that it was the end-user who decided not to show controversial images, and the Wikipedia servers won't be able to detect the difference (the only way I know to prevent such an attack is SSL, and even it can be abused in this particular case - a 'man in the middle' can intercept SSL too, and while the user will know that the server certificate is wrong, he won't have any other option, so he'll need to live with it). It means that by enabling users to filter themselves out, we will also be enabling 'in the middle' censors to make filtering much easier than they're doing it now (it will also probably mean less legal protection against censorship: for example, if somebody filters out Wikipedia images in the US right now, they are likely committing copyright infringement - with a 'fair use' defense being quite uncertain - but if it is the Wikipedia servers doing the filtering, the copyright argument evaporates, making censorship much more legal). This is certainly not a good idea IMHO. Ipsign 07:59, 16 August 2011 (UTC)Reply
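To make the man-in-the-middle concern concrete, here is a deliberately simplified, illustrative sketch of an intercepting proxy that forges a cookie-based preference for everyone behind it. The cookie name "imgfilter" is invented for the illustration; nothing here describes an actual WMF implementation, and as noted above, SSL raises the bar for this kind of interception.

```typescript
// Minimal illustration of how a middlebox could force a hypothetical cookie-based
// "hide these images" preference on every user behind it (plain HTTP, transparent
// interception; simplified on purpose). The "imgfilter" cookie name is invented.
import * as http from "http";

http.createServer((clientReq, clientRes) => {
  const headers = { ...clientReq.headers };
  // Append the forged preference; the origin server cannot tell it apart from a real one.
  headers.cookie = (headers.cookie ? headers.cookie + "; " : "") + "imgfilter=all";

  const upstream = http.request(
    { host: clientReq.headers.host, path: clientReq.url, method: clientReq.method, headers },
    (upstreamRes) => {
      clientRes.writeHead(upstreamRes.statusCode ?? 502, upstreamRes.headers);
      upstreamRes.pipe(clientRes);
    }
  );
  clientReq.pipe(upstream);
}).listen(8080);
```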

In order to distinguish between images, they have to be categorized. And there you will find the censors. --Eingangskontrolle 08:47, 16 August 2011 (UTC)Reply

Yes, even if this is intended for personal use, it will in fact be used by network managers and ISPs to censor Wikipedia for their users. MakeBelieveMonster 11:50, 16 August 2011 (UTC)Reply

This would be a waste of time

Another waste of time by the Foundation. Wikimedia projects are not censored, except projects like :ar. They think they should not have pics of their prophet. Some Christians might like to censor the Piss Christ. Ultra-orthodox Hasidic Jews don't like pictures of women of the opposite sex. On :nl some don't like pictures with blood. Some classification scheme has to be in place, some review process, and people have to invest time doing that. This time is wasted and cannot be spent spreading free knowledge. And who will make these decisions? Zanaq 09:18, 1 July 2011 (UTC)Reply

As I understand the concept, the 'classification scheme' would be categories for media, which already exist. The 'review process' for creating and updating categories also exists - it happens every day when media are uploaded. Category information is generally considered useful metadata and free knowledge in its own right. I could imagine people trying to create "useless" categories that have meaning only to them, but that seems like a rare case [and, again - is something that already happens today with category creation]. SJ talk | translate  
Free content, or free information, is any kind of functional work, artwork, or other creative content that meets the definition of a free cultural work. A free cultural work is one which has no significant legal restriction on people's freedom. Where do i get a refund? aleichem 09:35, 1 July 2011 (UTC)Reply
I agree with this definition, but I'm not sure what point you mean to make. Could you describe it in more detail? For instance, if you are concerned about excluding free content from the projects, the filter proposals would not do any of that. (Notability guidelines, in contrast, exclude the vast majority of all free content from the projects - in what could more accurately be named 'censorship'.) SJ talk | translate   03:53, 3 July 2011 (UTC)Reply
  • Great Wall of China within Wikimedia? That attempt deserves a thumbs-down button. --Matthiasb 09:33, 1 July 2011 (UTC)Reply
    • User side filtration is NOT the same as censorship. Bulwersator 09:38, 1 July 2011 (UTC)Reply
      • Just to be clear: I do not think this is exactly censorship. I do think it is unworkable and contrary to the goals of most (if not all) WikiMedia Projects. Zanaq 09:53, 1 July 2011 (UTC)Reply
        • When issuing the referendum, hopefully it will contain details on how it will be done, what committees will be responsible for what, how a user can get protection (against themselves), use cases (I don't want to see pictures of abused animals, Bin Laden and Scientology), default values, uploader and user training, ... --Foroa 16:53, 1 July 2011 (UTC)Reply
          • This always ends up with the situation where one group tells the other group what it is allowed to see. We accuse foreign governments of doing that. Such choices always end up in culture and politics, two things Wikipedia should stay far away from. Edoderoo 13:15, 2 July 2011 (UTC)Reply
            • This is a user side opt-in filter. I cannot understand how this becomes censorship. --Bencmq 14:47, 2 July 2011 (UTC)Reply
              • I have actually seen very little info so far. I may hope you are right. An opt-in filter is self-censorship, I won't care. Any other option is an area I'm even not ready to discuss about. I can't tell what others can see or not... No one can... Edoderoo 16:25, 2 July 2011 (UTC)Reply

┌──────────────────┘

  • Although I do agree that there will be inevitable problems - if the filter is based on categories, there may be disputes over whether certain images should be put into this or that category. But let's wait and see.--Bencmq 16:51, 2 July 2011 (UTC)Reply
    This would be an opt-in filter. SJ talk | translate   03:53, 3 July 2011 (UTC)Reply
  • This is how all censorship starts... slowly. First it's opt-in, then opt-out, and then it will be mandatory on request of foreign governments / organizations, since the mechanism is already in place. Very dangerous. If people want to filter, fine, do it on your own PC. Koektrommel 07:54, 3 July 2011 (UTC)Reply
    • I'm not aware of any examples where censorship has come in via that gradual route. I suspect the typical pattern is more that if organisations do things that upset people and refuse to compromise by allowing some sort of opt-out or rating system, then eventually there is an overreaction, censorship ensues, and people regret not agreeing a reasonable compromise. As for the idea of people filtering on their own PCs, what proportion of our readers are technically capable of doing that, 10%? 1%? We need a solution that everyone can use, not just the technoscenti. WereSpielChequers 17:12, 3 July 2011 (UTC)Reply
    Then you need a system that does not require an account. As far as the slippery slope goes, suppose we start with an opt-in system and discover that it isn't all that effective as the vast majority of readers do not register and log in. So what then? Do we default to "reasonable" censorship for IP readers, encouraging them to register if they want to opt out? And straight down the rabbit hole we go.
    Before we even consider this referendum, WMF needs to understand the requirements. And the requirements are a system that (a) does not require registration (so has to be based on cookies rather than on-wiki preferences) (b) is easy to use and (c) can easily be advertised without running afoul of the "No disclaimers in articles" prohibition on en and likely other projects. And even then, it will be completely ineffective in many areas. Muhammad pictures for one, as many Muslims won't be satisfied unless we censor them for everyone. The "you have the capability to disable them for yourself" response has gone in one ear and out the other for years.
    For all the battles it will cause, a simple categorization system for images does have value. But I have to say that I'm in with the group who thinks the actual tools for censoring images should be left to third parties. Resolute 14:24, 4 July 2011 (UTC)Reply

This would be a distraction from Foundation principles

The Wikimedia Foundation, Inc. is a nonprofit charitable organization dedicated to encouraging the growth, development and distribution of free, multilingual content, and to providing the full content of these wiki-based projects to the public free of charge. The Wikimedia Foundation operates some of the largest collaboratively edited reference projects in the world, including Wikipedia, a top-ten internet property.

Any type of filter runs contrary to this, even if opt-in. These projects were designed to take the sum of human knowledge and make it freely available. The donated money - if any is used here at all - which the Foundation uses to further that goal, should be put to better use. NonvocalScream 19:57, 4 July 2011 (UTC)Reply

That would depend on whether the content was standing in the way of growth, development or distribution - which it seems that it is at the current time. The content would remain completely freely accessible to everyone even with this option in place. As such, I don't believe that it runs contrary to the Foundation's principles. Mike Peel 20:17, 4 July 2011 (UTC)Reply
What is standing in the way of growth, dev, and dist at the current time? Filters should be provided at the client side, not the server side. The projects or the Foundation have no business wasting resources in such an elegant way. NonvocalScream 20:27, 4 July 2011 (UTC)Reply
Let's see ... how about contributions from schools, universities and organisations where Wikipedia is currently blocked due to its controversial images? People in countries where their national sentimentality is offended by Wikipedia's content sufficiently that they don't edit (e.g. pictures of Muhammad)? As a very controversial statement, what is worse: Wikipedia not being editable in China due to an image of Tiananmen Square, or not having much more content about China that is freely accessible to everyone on the internet?
Given that most large internet sites provide filtering on the server side, not just leaving it up to the client side, it's clear that there is demand for a server-side solution. Simply based on that, I would argue that it is worth the Foundation running a poll/referendum of both its editors and readers to see whether Wikimedia should operate a similar server-side solution. Mike Peel 20:37, 4 July 2011 (UTC)Reply
The Foundation can run the poll/ref. I think that conversation here, now, before the poll is healthy.
I maintain that the Foundation, and its projects really have no business expanding into this type of area. Regardless of what country won't display because of "this" image, it is not in the ambit of the WMF to do this thing. It is totally out of scope. NonvocalScream 20:42, 4 July 2011 (UTC)Reply

As long as readers can choose to see the image with one click, I don't believe Wikipedia loses any freedom with this image filter. --NaBUru38 18:43, 15 August 2011 (UTC)Reply

I agree, but unfortunately it won't work this way. Filtering will almost inevitably be based on cookies, and there is nothing easier than to inject a cookie at the firewall level, pretending that it was the user who decided not to see the image. Ipsign 07:39, 16 August 2011 (UTC)Reply
If a malicious party controls the firewall, the game's already over. They can block all of Wikipedia, or all images, or pages in some categories, or pages containing particular words. Bypassing those firewalls is out of scope. 59.167.227.70 09:30, 16 August 2011 (UTC)Reply

Complicated editing

If this feature is implemented, then when I'm editing an article I will need to think: "hey, how will this article look if a user uses this filter? Or that filter? Or that combination of a dozen filters?" In many cases images are an essential part of the article, and making the article comprehensible without them (or worse - with an arbitrary combination of them) will be an editor's nightmare. And while I am not involved in editing sexual or religious articles, I am not sure that this tool won't evolve, for example, to allow filtering out "all images which are related to Microsoft Windows/Linux/..." - if that ever happens, then I will find my editing severely affected. Ipsign 07:23, 16 August 2011 (UTC)Reply

Definitions

Educational materials

I don't understand what "educational materials" can mean in this page. Materials to educate the voters? Nemo 10:47, 1 July 2011 (UTC)Reply

So that they could make the right choice... ;-) --Mikołka 10:58, 1 July 2011 (UTC)Reply
An image or an entire set of filter options might help suit different groups. — The preceding unsigned comment was added by Marsupiens (talk)
This is a little late, but "educational materials" might be an inadvertent US-centric edit -- in the US every voter gets a packet of information about the upcoming vote, who is running, what the full text of the ballot referendums is, etc. before every election. I always assumed this happened in other places too until my UK friends were in town before an election and expressed amazement. At any rate, that's the kind of educational materials that are meant -- to answer the question "what's going on?" -- phoebe | talk 18:37, 31 July 2011 (UTC)Reply


Phoebe, thanks for the explanation. Yes, it's a US-centric thing; I never heard of it anywhere in Europe, for example. --Dia^ 07:59, 16 August 2011 (UTC)Reply

"servers hosted by a neutral third party"

From the content page: "The referendum [...] will be conducted on servers hosted by a neutral third party." Um. Well. What? Will users' login information be shared with this "neutral" third party? If not, how can the vote be held on external servers? Why would a vote like this be held not only off wiki, but off Wikimedia? How could a third party outside Wikimedia possibly be neutral? --Yair rand 13:38, 3 July 2011 (UTC)Reply

This means simply 'conducted as the Board election was' - by a third party specialized in overseeing fair votes. There may not be a need for such precautions in this case [I would prefer an open-ballot vote on Meta, myself], but it is often seen as an unbiased choice for running the technical process of voting. SJ talk | translate   15:09, 3 July 2011 (UTC)Reply
Sj is correct. The election is held using the Securepoll extension, which passes a user who's already logged in to a WMF wiki (and verified as meeting voting requirements) to the servers of the third party for voting. Philippe (WMF) 00:40, 4 July 2011 (UTC)Reply
Thank you for the explanation. --Yair rand 02:53, 4 July 2011 (UTC)Reply
Is there some sort of technical explanation of the mechanism, other than the source code? If IP addresses and user names are not connected on the third party's servers, then I suppose some cryptographic key is included in the URL or body of the request to the third-party server. A brief explanation of how this is done and what information is exposed to the third party would be important. --LPfi 16:39, 5 July 2011 (UTC)Reply
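For readers wondering what such a hand-off can look like in general (this is NOT a description of SecurePoll's actual mechanism, only a generic illustration), the wiki can sign an opaque voter id and a timestamp with a key shared with the voting server, so eligibility can be verified without sharing login information:

```typescript
// Generic illustration of a signed hand-off token, not SecurePoll's actual design: the
// wiki signs an opaque voter id plus timestamp with a key shared with the voting server,
// so the third party can verify eligibility without ever seeing account credentials.
import { createHmac, timingSafeEqual } from "crypto";

const SHARED_KEY = "replace-with-a-real-secret"; // known only to the wiki and the vote server

function issueToken(voterId: string, issuedAt = Date.now()): string {
  const payload = `${voterId}.${issuedAt}`;
  const sig = createHmac("sha256", SHARED_KEY).update(payload).digest("hex");
  return `${payload}.${sig}`; // appended to the URL that redirects the voter
}

function verifyToken(token: string, maxAgeMs = 10 * 60 * 1000): string | null {
  const [voterId, issuedAt, sig] = token.split(".");
  const expected = createHmac("sha256", SHARED_KEY).update(`${voterId}.${issuedAt}`).digest("hex");
  const ok =
    sig?.length === expected.length &&
    timingSafeEqual(Buffer.from(sig, "hex"), Buffer.from(expected, "hex"));
  if (!ok || Date.now() - Number(issuedAt) > maxAgeMs) return null;
  return voterId; // the voting server learns only this opaque id
}
```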

Interesting that we are not asked yes or no. The decision is already made and we are only asked about minor details. --Eingangskontrolle 08:54, 16 August 2011 (UTC)Reply

Design ideas

suggestion from Dcoetzee

I'm pasting this from a mail I sent to Philippe the other day on this topic.

I can't claim to represent general community consensus, although I have some idea what that consensus is like. In particular, in discussions of image filtering the community has made very clear that they don't want WMF projects to have any direct involvement in the construction or maintenance of blacklists, which are lists of articles/files or categories that are blocked for a particular user.

I have a vision for what I think a personal image filtering solution should look like, but not the time and resources to implement it. Basic outline follows.


Customer

Persons interested in self-censorship and censorship of small children by parents or guardians. In particular, censorship of teenagers or technically adept persons is not a goal for me. This is helpful because it means strong technical measures against circumvention are unnecessary.

Requirements

  1. If an article/page/entry is included in the user's blacklist, and is not in their whitelist, attempting to view it will produce an error message and show no content. If the user is using the software in self-censorship mode, they may proceed to view it regardless. Otherwise, a password is required to proceed to view it.
  2. If a file is included in the user's blacklist, and is not in their whitelist, it will be hidden, showing in its place an icon indicating that it was blocked. If the user is using the software in self-censorship mode, they may click it to reveal it. Otherwise, a password is required to view it.
  3. A user may at any time add a page or file they're looking at to their blacklist, hiding it and preventing it from appearing in the future.
  4. When a password is entered to view a page/file, either a time limit is given, or the page/file is permanently added to the user's whitelist.
  5. Both blacklists and whitelists may be specified using a combination of individual pages/files as well as categories of pages/files, which will include subcategories. Whitelists take priority over blacklists.
  6. A public distribution center is available (ideally on a third party/unaffiliated site) where users can share their blacklists/whitelists, and import those of others. This allows communities to specialize the software to meet their community standards, while still being able to individually adapt to their personal standards and the needs of the specific child. Several blacklists can be imported and merged into one.
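For illustration, a minimal sketch of the decision rule implied by requirements 1, 2 and 5 above (whitelist beats blacklist; entries may be individual titles or categories). All names are invented, and how a title's categories, including ancestors, are resolved is deferred to the architecture below.

```typescript
// Sketch of the whitelist-over-blacklist rule. Type and function names are hypothetical.
type FilterList = Set<string>; // page/file titles and category titles, mixed

function listMatches(list: FilterList, title: string, categories: Set<string>): boolean {
  if (list.has(title)) return true;
  for (const cat of categories) {
    if (list.has(cat)) return true;
  }
  return false;
}

/** Returns true if the page or file should be hidden behind the block screen/icon. */
function shouldHide(
  title: string,
  categories: Set<string>, // the title's categories, including ancestors
  blacklist: FilterList,
  whitelist: FilterList
): boolean {
  if (listMatches(whitelist, title, categories)) return false; // whitelist wins (req. 5)
  return listMatches(blacklist, title, categories);            // otherwise blacklist (reqs. 1-2)
}

// Example: a file in Category:Spiders is hidden unless explicitly whitelisted.
const hide = shouldHide(
  "File:Araneus diadematus.jpg",
  new Set(["Category:Spiders", "Category:Animals"]),
  new Set(["Category:Spiders"]),
  new Set()
);
console.log(hide); // true
```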

Architecture

  • Filtering can be done either on the server side (with the cooperation of Mediawiki software, by associating blacklists with user accounts) or on the client side, using web browser extensions. I favor the latter approach because it avoids Mediawiki ever having to deal with blacklists, which is a point of contention in the community. This also avoids trivial circumvention methods like logging out or clearing cookies.
  • Blacklists/whitelists can be stored on local storage. Initial blacklists/whitelists are not shipped with the web browser extension, but are instead selected from the above-mentioned public distribution center.
  • Categories can be extracted using the Mediawiki API.

Design notes

  • To avoid circumventing the filter by editing to remove categories, recent removal of categories can be ignored. If the edit stands and is not reverted as vandalism it will eventually be accepted.
  • In addition to retrieving the categories of each article/file used in the article, it's necessary for each category to walk up the category tree to get its ancestors in case they're listed in a blacklist/whitelist, which can lead to a lot of Mediawiki API calls per page load. This can be mitigated by caching categories.
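A sketch of that ancestor walk against the public MediaWiki API, with the simple cache suggested above; the endpoint, depth limit and response handling are assumptions for illustration only.

```typescript
// Climb the Commons category tree for a given category, caching parent lookups so
// repeated page loads don't re-query the same categories. Purely illustrative.
const API = "https://commons.wikimedia.org/w/api.php";
const parentCache = new Map<string, string[]>();

async function parentsOf(category: string): Promise<string[]> {
  const cached = parentCache.get(category);
  if (cached) return cached;
  const url = `${API}?action=query&prop=categories&cllimit=max&format=json&origin=*` +
              `&titles=${encodeURIComponent(category)}`;
  const data = await (await fetch(url)).json();
  const page: any = Object.values(data?.query?.pages ?? {})[0];
  const parents = (page?.categories ?? []).map((c: { title: string }) => c.title);
  parentCache.set(category, parents);
  return parents;
}

// Collect a category and its ancestors up to maxDepth; the category graph can contain
// cycles, so visited categories are tracked.
async function ancestorsOf(category: string, maxDepth = 5): Promise<Set<string>> {
  const seen = new Set<string>([category]);
  let frontier = [category];
  for (let depth = 0; depth < maxDepth && frontier.length > 0; depth++) {
    const next: string[] = [];
    for (const cat of frontier) {
      for (const parent of await parentsOf(cat)) {
        if (!seen.has(parent)) {
          seen.add(parent);
          next.push(parent);
        }
      }
    }
    frontier = next;
  }
  return seen;
}
```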

I pointedly avoid the idea of "labelling" specific articles or files with a "type" of content, although a blacklist may be intended for a specific type of content. I also strongly oppose the idea of having a single one-size-fits-all blacklist, or even just a small number of them to choose from - I think it's essential to let the user community build lists to their tastes, and to force users to choose among a wide, fine-grained collection to discourage overly broad filtering.

Any feedback is welcome. Dcoetzee 07:48, 2 July 2011 (UTC)Reply

I heartily agree with and support Dcoetzee's proposal above. It will be much less contentious and more productive if the whole debate about what is censored and what is not stays out of Wikimedia, as well as the way of implementing it.--- Darwin Ahoy! 02:00, 3 July 2011 (UTC)Reply
I don't like the idea of the filter being enabled "by parents or guardians". If a small child is considered grown up enough to surf the web, he's certainly smart enough to control the filter as requested by his parents or guardians. It seems an unnecessary complication (management of passwords?!) with great danger of abuse, e.g. by network managers. Nemo 07:37, 3 July 2011 (UTC)Reply
A password seems too much. That would conflict with the idea that no readers are prohibited from seeing any image. Better for such a system to simply let readers change the default display setting (hidden or not) for categories of images. SJ talk | translate   11:42, 3 July 2011 (UTC)Reply
I see nothing wrong with this if it is based on existing content labelling standards. This would allow existing parental control software to work immediately with such labels. It's not our job to determine whether these labels work or not, or whether they can be easily avoided by people wanting to get past them. But at least those that want those filters for themselves will be satisfied. It's just enough, and if it allows more people to visit or contribute without fearing to see such images, we will win something. (Note that, anyway, content filters based on the textual content of articles are already working, and we did absolutely nothing to prevent this; but if the existing software can't easily parse images, all it will finally do is block images from Commons completely, whatever they are showing in any article: this is already occurring on third-party sites that are reindexing and selectively republishing the content of Wikimedia sites.)
We are NOT proposing any kind of censorship, only content labelling based on existing standards. Censorship only occurs elsewhere, and we have absolutely no control over those third-party sites or software. Users will set up their own filters themselves, or will have to live anyway with these third-party sites and software, or with their national legislation, if it is applicable to them (but we may - and probably should - still help them comply with their law). verdy_p 16:56, 3 July 2011 (UTC)Reply
Wrong. Categories should be broad and small in number. One either doesn't mind seeing photos of sexual activity or chopped-up body parts, or one does. Broad banding is preferred unless one really wants to give users the choice of saying "don't mind porn but none of that gay stuff"; do you really want them to be able to say no to Piss Christ but yes to piss Mohammed, or no to piccies of dead Westerners but yes to piccies of dead Africans? John lilburne 17:11, 3 July 2011 (UTC)Reply
John - Can you articulate why it would bother you if someone else chose to see one of those complicated sets of categories? SJ talk | translate   18:59, 4 July 2011 (UTC)Reply
Principle of least astonishment. Imagine someone who is taken aback upon seeing a sexually explicit image, decides to use this feature, and finds themselves looking at a menu that catalogues every conceivable particular sex act or fetish. It could be a double-take aback. ~ Ningauble 19:39, 4 July 2011 (UTC)Reply
That's a good principle for interface design - as is broad banding. But that doesn't necessarily preclude having an option to create a custom set of categories (or choose from a longer user-defined list) for one's own use -- which is how I read John's comment.

Questions about the design, existing alternatives

Hello there, I will just add a few comments regarding the upcoming referendum. As to my background: I am a Computer Scientist by profession, so I can technically judge what is involved. As a quick outline:

  • A user (either anonymous or named) is able to exclude certain images from search results. In the case of anonymous users, the preferences are stored inside the user's session. Closing and re-opening the browser will reset the settings. Users who log in to edit can store their settings in their profile; their settings will not be reset when they close/open their browser.
  • Architecture-wise: blacklists and whitelists can be used. To be determined: do these act on groups of images, or on single images? - Is it possible to whitelist single images in a group that is blacklisted? - To blacklist single images of a whitelisted group?
  • How does the software identify the images to "filter"? - There are two options: at load time, the software analyses the image, and the result of this analysis is used for filtering. This will incur extra costs in computing time and memory; there are different algorithms, which yield different results. The other option is static tagging. This option has the drawback that some people need to decide the tags to use ("tag wars" have been cited above). Also, the behaviour needs to be specified if an image does not have any tags; the blacklist/whitelist approach can be used.
  • There are programs on the market that implement a client-side proxy, and that probably cover 80-85% of what this development will achieve. I currently see no benefit in implementing this solution on the server. The solution where the filtering is done dynamically (i.e. no static tags), and on a per-image basis would probably be superior to the client-side filtering. This however comes at the cost of additional cpu and memory usage, as well as false positives/false negatives.

To summarize:

  • If the solution of static tagging is chosen, we have the problem that images need to be tagged, and "agreement" over the tags to use needs to be reached in some way. Also, the behaviour in the case of an untagged image needs to be defined. Finally, we need to define the granularity: Is it possible to "whitelist" individual images of a group that is "blacklisted" (or to "blacklist" individual images of a whitelisted group). Finally: how do we determine the "tags" (or group of tags) to use?
  • If we tag dynamically, we incur extra costs in CPU and memory use of the system. We need to reach agreement over the algorithms to propose for identifying images; we need to implement those algorithms, which may be technically difficult; and we may need to think about caching the results of calculations, to reduce CPU load. Also note that the algorithms are statistical in nature: there will be false positives and false negatives.

Both approaches have their benefits and drawbacks, and neither is "quick to implement". So given that client proxies ("filters") out there probably cover 80-85% of what the usual client needs ("don't show images of nude people of the opposite sex"), where is the use case that would justify 3-5 people working 3-6 months to get the extra 15-20%? --Eptalon 09:12, 4 July 2011 (UTC)Reply
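(For discussion purposes only, a minimal sketch in Python of the static-tagging variant described above, with blacklist/whitelist precedence and a defined default for untagged images. The category names, the preference structure and the defaults are assumptions invented for illustration, not part of any actual design.)

  # Sketch: static-tag filtering with blacklist/whitelist precedence.
  # Assumes each image carries a set of filter tags and each user has a set
  # of blacklisted tags plus per-image overrides.
  def should_hide(image_name, image_tags, prefs):
      """Return True if the image should be collapsed for this user."""
      # A per-image whitelist entry wins, so one image can stay visible
      # even if its whole group is blacklisted.
      if image_name in prefs.get("whitelisted_images", set()):
          return False
      # A per-image blacklist entry likewise wins over group settings.
      if image_name in prefs.get("blacklisted_images", set()):
          return True
      # Untagged images: the default behaviour has to be defined somewhere;
      # here we assume they are shown unless the user opted to hide them.
      if not image_tags:
          return prefs.get("hide_untagged", False)
      # Otherwise hide if any of the image's tags is blacklisted.
      return bool(image_tags & prefs.get("blacklisted_tags", set()))

  # Example: a user who hides "nudity" but has whitelisted one image.
  prefs = {"blacklisted_tags": {"nudity", "violence"}, "whitelisted_images": {"Venus_Cranach.jpg"}}
  print(should_hide("Venus_Cranach.jpg", {"nudity"}, prefs))  # False
  print(should_hide("Other_image.jpg", {"nudity"}, prefs))    # True
  print(should_hide("Untagged.jpg", set(), prefs))            # False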

Please provide more data to explain what "80-85%" means to you here. (list some of the clients you have in mind, and the use cases you feel constitute the 100%). If there are client-side tools that are aware of Commons image categories [or can be customized to be] that would be a useful data point. (And pointing to an open-source client-side option for readers who want one, that is known to work smoothly with WM projects, would be in line with the goal here). Two use cases, for discussion purposes:
  1. You're browsing wikipedia, possibly on someone else's machine, and want to toggle off a class of images. [for instance: giving a WP demo at work, or in Saudi Arabia, &c.] It may not be possible to install your own client software, and you'd like to be able to set this in under a minute.
  2. You come across a specific image you don't want to see again (and checking, find it is part of a category of similar images), and want to hide it/them in the future.
SJ talk | translate   15:42, 4 July 2011 (UTC)Reply
Client-side proxy filters are aimed at parents worried that their children might see the wrong type of image; AFAIK most of them work with whitelists/blacklists of sites; they do not do an on-access scan of the image. In addition, they might do "keyword scanning" in the text (to filter hate sites, and similar). The customisation of these products lies in being able to select "categories" of sites to block/allow, perhaps on a per-user basis. Our "static category blacklist/whitelist" approach would in essence do the same thing, except that to achieve it we need to do development work, and at best we match the functionality of a USD 50 product. In addition, load is placed on our servers to do the filtering work (plus possible problems with the categorisation). Using the dynamic approach would mean even more load on our servers, the possibility of false positives/false negatives, the difficulty of finding training data (note: that data cannot be used later on), etc. In short: a lot more (difficult) work. We may exceed the USD 50 product in functionality, but we would have 3-5 people developing for 4-6 months. I really don't know if I want to spend up to USD 250,000 (24 man-months) to "not see an image again" - it seems out of proportion. --Eptalon 19:29, 4 July 2011 (UTC)Reply
Point of clarification... $250,000? NonvocalScream 19:36, 4 July 2011 (UTC)Reply
Let's be very careful about throwing around dollar figures. That hasn't been scoped, so I think it's dangerous to introduce false numbers to the equation at this point. Philippe (WMF) 19:39, 4 July 2011 (UTC)Reply
Duration of project: several people, 4-6 months (for the dynamic approach, not using static tags). --Eptalon 20:15, 4 July 2011 (UTC)Reply
According to whom, by what metrics and judging by the speed of what resources? How about we let the folks who are designing the thing scope it out, once it's, you know, designed? Philippe (WMF) 23:59, 4 July 2011 (UTC)Reply
Eptalon, you seem to assume that everybody's got a web proxy. I don't. If I really, really don't want to see the infamous picture of the guy jumping to his death from the Golden Gate bridge, my options at the moment are:
  1. Don't read Wikipedia (because any vandal could add it to any page at any time),
  2. Especially don't read pages where that image might logically be present (bummer if you need information about suicide), or
  3. Figure out how to manually block all versions of that image in every single account (five) and every single browser (two) on every single computer (four) I use—which will effectively keep that image off my computer screen, but not any others like it.
This proposal would let me control my computer by clicking a "don't really feel like seeing images of dead bodies today, thanks anyway" button. The images would appear "hidden", and I could override the setting any time I felt like it by simply clicking on the "This image hidden at your request because you said you didn't feel like seeing any images of dead bodies today" button. There is nothing here that would let some institution control my computer. WhatamIdoing 19:23, 5 July 2011 (UTC)Reply
I don't have one (or use any filtering software); your proposal shifts the problem, though. You need to agree with other people about the categories. In the 16th century, a painter called Lucas Cranach the Elder painted a woman standing before a tree, wearing a necklace ('Venus'). In the same century Michelangelo made his statue David. In the 19th century, Jules Joseph Lefebvre painted a woman with a mirror ('Truth'). To me, all these works are works of art, and as such, limiting their audience does not make sense. In the 1990s, a museum in London used Cranach's painting as an ad for an exhibition; they showed it on posters in the London Underground - and there was an outcry. --Eptalon 14:27, 6 July 2011 (UTC)Reply
@Eptalon: I think there is a very real advantage to doing a system specific to Wikipedia that takes advantage of our category structure: filtering can be made more precise, which means not just missing fewer things that people want to block, but more importantly, avoiding blocking educational materials that young readers need access to in order to learn. This is our chance to give people a solution that isn't as conservative and overzealous as every other generic solution on the market. Dcoetzee 22:31, 16 July 2011 (UTC)Reply

Logging in to permanently save

Just to point out: the concept presented in File:CC-Proposal-Workflow-Anon-FromNav-Step3.png (logging in to permanently save options) wouldn't necessarily work in the classic scenario of parents applying filters to their children's computers, as:

  1. Their children would then edit using the accounts that have been logged in (pro: would work well in terms of accountability, con: what happens if the account is subsequently blocked for vandalism? Note that the account would most likely pass semi-protection due to the length of time since creation - although it might not pass in terms of number of edits, at least to start with.)
  2. Their children could simply log out, and/or log in with their own accounts, thereby bypassing the filter.

Mike Peel 21:48, 23 July 2011 (UTC)Reply

Protection levels

Just as many security software applications have multiple security levels, I propose the same for this image filter. There should be a few levels per tag, 3 or 4, so decisions are easy. For example: 0 for no sexual content, 1 for light clothes like here, 2 for underwear and 3 for visible genitals. --NaBUru38 18:39, 15 August 2011 (UTC)Reply
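(A rough sketch, for illustration only, of how the graded per-tag filtering suggested above might work: images carry a numeric level per tag, users set a threshold per tag, and an image is hidden when any level exceeds the corresponding threshold. All names, levels and thresholds here are invented assumptions.)

  # Sketch: graded filtering with one threshold per tag (0-3).
  IMAGE_LEVELS = {                         # hypothetical per-image annotations
      "beach_photo.jpg": {"sexual_content": 1},
      "anatomy_diagram.png": {"sexual_content": 3},
  }

  def hide_image(name, thresholds):
      """Hide if any tag level exceeds the user's threshold for that tag."""
      levels = IMAGE_LEVELS.get(name, {})
      return any(level > thresholds.get(tag, 3)   # 3 = "show everything"
                 for tag, level in levels.items())

  # A user who accepts light clothing (level 1) but nothing beyond it:
  user_thresholds = {"sexual_content": 1}
  print(hide_image("beach_photo.jpg", user_thresholds))      # False
  print(hide_image("anatomy_diagram.png", user_thresholds))  # True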

Opt-out of seeing the "hide content" tabs

If we were to implement this, I would like to see a user-configuration option (which obviously applies only to registered users) that allows me to turn OFF the "hide content" tabs so that I don't see them. That way my limited screen space is not cluttered with things that I will never click. Of course the default for that option can be "show tabs", because I'll only have to turn them off once. Mitch Ames 10:07, 16 August 2011 (UTC)Reply

Voting process

Transparency of voting

Please consider using "open voting" in such a way that we can see/validate the votes, such as in steward elections. I do not trust a third party as currently proposed.

Also, please consider placing a sitenotice. I do not believe that VP posts are enough to inform every member. NonvocalScream 20:47, 4 July 2011 (UTC)Reply

Open voting works well for voting from Wikimedia user accounts, but I would imagine it won't work as well for votes from readers (assuming that readers will also be eligible to vote - which I would view as a prerequisite). Posting a site notice (along the lines of the fundraising banners) should also be a prerequisite to this referendum in my opinion. Mike Peel 20:54, 4 July 2011 (UTC)Reply
You're moving too fast. :-) The only thing that's been published so far is the call for referendum, which has been posted to village pumps and mailing lists. That is not the only notice we are going to be doing throughout the process — it's the first of many. We will be using a CentralNotice during the actual vote period, as far as I know. Cbrown1023 talk 01:22, 5 July 2011 (UTC)Reply
Voting will be verified as usual for the Board of Trustees elections (meaning, the committee will verify them using SecurePoll). The third party will not be responsible for that. The third party, incidentally, is Software in the Public Interest, as usual. Philippe (WMF) 01:23, 5 July 2011 (UTC)Reply
Ok, I understand Cbrown's remark and I'm comfortable with that response. Could you help me understand the advantage to using the closed voting system, versus the open one? NonvocalScream 04:17, 5 July 2011 (UTC)Reply
It's the same advantage that convinces us to use secret ballots in the real world: A greater likelihood that people will freely vote for what they individually believe is best for themselves and the group, without fear of public censure or retaliation. WhatamIdoing 19:27, 5 July 2011 (UTC)Reply
And automatic checking is less time consuming than hunting for voters with their second edit etc Bulwersator 06:49, 7 July 2011 (UTC)Reply

Status of referendum text

There's less than a month until this vote is supposed to start (August 12). A few people have asked to have some advance notice on the text of the referendum prior to the vote (plus I imagine it needs to be translated). When will the text be available? --MZMcBride 07:00, 19 July 2011 (UTC)Reply

Hopefully in the next 24 hours, if all goes well. Thanks :) Philippe (WMF) 18:53, 19 July 2011 (UTC)Reply

Vote/Referendum Question

When can we see the proposed wording of the question? Kindly, NonvocalScream 17:43, 23 July 2011 (UTC)Reply

You've probably already seen it, but: Image filter referendum/Vote interface/en. Cbrown1023 talk 18:56, 15 August 2011 (UTC)Reply

Suggest splitting question

I would suggest splitting "It is important that the feature be usable by both logged-in and logged-out readers." into two questions: one asking about logged-in readers, the other asking about logged-out readers. Otherwise, what would anyone that thinks that the feature should only be enabled for logged-in readers vote? Alternatively, if this question is intended to ask "should this feature be enabled for logged-out as well as logged-in users?", then the phrasing should definitely be improved... Mike Peel 21:10, 23 July 2011 (UTC)Reply

suggestions welcome :) the latter is what was meant. Basically, you can do a lot with account preferences, as you know; it takes a bit more hackery to do it for anons... but that's 99% of our readers, who are the target audience. -- phoebe | talk 18:45, 31 July 2011 (UTC)Reply
Actually, it's all of us on occasion, because you can't stay logged in for longer than 30 days. That means that even the most active user is going to have at least one moment every 30 days when s/he is logged out. WhatamIdoing 19:23, 4 August 2011 (UTC)Reply
I arrived here with the same question that Mike Peel asked. Votes based on the current wording may not produce as clear results as if it were split into two questions. Rivertorch 05:01, 10 August 2011 (UTC)Reply
I agree with Mike Peel, that question should be split in two. -NaBUru38 18:40, 15 August 2011 (UTC)Reply

Importance vs. should/shouldn't

"It is important for the Wikimedia projects to offer this feature to readers." - again, there are two separate questions here that have been rolled into one. (a) should the Wikimedia projects offer this feature to readers?, and (b) how important is it to offer this feature? Although answering (b) implies support of (a), asking about importance is a very separate question from whether the feature should or should not be enabled. If the majority of the community believe that Wikimedia shouldn't allow this feature, then the importance doesn't particularly matter. (in contrast to: if the feature is rated as low-importance, then that could mean that the WMF still funds its development and implementation but doesn't rate its development as highly as if it were high importance). Mike Peel 21:15, 23 July 2011 (UTC)Reply

Note that I don't believe that this should be subject to Wikimedia's standard rule of consensus: if e.g. 10% of voters believe that this feature should be enabled, then that's probably a sufficient amount of people (i.e. representing sufficient demand) to make this worthwhile implementing - particularly if those voters come from under-represented parts of the world. Mike Peel 21:29, 23 July 2011 (UTC)Reply
We are not being asked whether we want this feature or not. It will come anyway, as there is no way to stop it through this farce. --Eingangskontrolle 09:00, 16 August 2011 (UTC)Reply

Voting eligibility

I have to admit that I simply can't understand the rules for eligibility to vote for this election. My questions include:

  • Why can't Wikipedia readers vote? (should this really be restricted to the editing community?) - this is my key point in leaving this message. I know that this presents technological difficulties, but these should be the same technological difficulties that are present in implementing the image filter, so they should inherently be solvable (otherwise this exercise is somewhat moot)...
  • Why should mediawiki developers, WMF staff and contractors, and board members be eligible to vote if they don't meet the 'editor' criteria? Either they should have sufficient experience on-wiki to meet these criteria, or they will most likely be working on topics unrelated to this and hence their vote shouldn't necessarily count
  • If WMF staff, contractors and board members are eligible to vote, then shouldn't the same apply to other Wikimedia staff, contractors and board members - i.e. including those of the Wikimedia chapters?

P.S. my input here is meant to be constructive, and I hope it comes across as being in that spirit. Apologies if it comes across otherwise... Mike Peel 21:27, 23 July 2011 (UTC)Reply

We did try to come up with a way to include Wikipedia readers; however, it is nearly impossible to ensure a representative result. For example, certain countries have minuscule IP ranges, and it would be impossible to tell if votes received were from hundreds of different readers, or a handful of readers voting multiple times; the same could be true for readers from large institutions who operate through a single or very small IP range. As to the remainder of the criteria, I believe this is intended to be the standard group of criteria, with the only variation between votes/referendums being the minimum number of edits. The intention is to be as inclusive as possible while still doing our best to ensure each person casts only one ballot. Risker 23:23, 24 July 2011 (UTC)Reply
How about using cookies to identify individual computers? There is still the potential for people to abuse that, by deleting the cookie / resetting the browser, but it largely avoids the IP address issue. You could also (temporarily) record the additional information of IP address, internet browser, operating system version, etc. in combination with looking for identical votes (e.g. same rating for all questions each time) which would help identify multiple votes from the same location. Since there should be a very large number of people voting, a small number of people voting 2-3 times won't matter that much (it'll be well within the noise / uncertainty) - so you'd only need to pick out cases where someone might have voted over 10 times or so.
Re "standard" - I don't believe that there is a standard yet, since this is only the second referendum that I'm aware of (the first being the license migration), which means this is part of setting the standard. Either way, the voting criteria should be thought through each time to make sure they're appropriate. I'm obviously all for being as inclusive as possible, but I'm not sure that the additional criteria are inclusive in a balanced way. Mike Peel 08:42, 25 July 2011 (UTC)Reply
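(Purely to make the suggestion above concrete, a sketch of grouping ballots by a coarse fingerprint of IP address and browser string, and flagging only implausibly large groups. The fields, hash and threshold are assumptions for illustration; as the replies below point out, all of these signals are easy to change, so this could only ever be a heuristic.)

  import hashlib
  from collections import Counter

  def fingerprint(ip, user_agent):
      """Coarse per-machine fingerprint; trivially evaded, so only a heuristic."""
      return hashlib.sha256(f"{ip}|{user_agent}".encode()).hexdigest()

  def flag_suspicious(ballots, threshold=10):
      """Return fingerprints that cast more ballots than is plausible for one machine."""
      counts = Counter(fingerprint(b["ip"], b["user_agent"]) for b in ballots)
      return [fp for fp, n in counts.items() if n > threshold]

  # Example: twelve identical-looking ballots from one machine get flagged.
  ballots = [{"ip": "198.51.100.7", "user_agent": "Mozilla/5.0"}] * 12
  print(len(flag_suspicious(ballots)))  # 1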
I don't think this sounds like a good idea at all. Cookies can be deleted very quickly, a new IP address can easily be requested, and there is no law that says that a browser has to send the same identifying information all the time. While each of these may seem like a layer of extra protection, the problem is that the one hacker defeating the vote will know how to get around them all. Of course, it is possible that an editor could start - or hack into - lots of extra accounts to do the same thing, but at least that actually sounds like work.
Also, I really think that someone who only reads, never edits, wouldn't appreciate the issues involved. If all you do is read articles you have no idea of the kinds of rancorous debates that can get started over one word in an article. This "filter" idea might sound perfectly practicable to someone in that position, who doesn't realize how much drama would erupt over any borderline categorization. Wnt 19:56, 30 July 2011 (UTC)Reply
I'd like to know the opinions of the logged-out readers, but it sounds technologically difficult. Because of these difficulties, I probably wouldn't treat the two kinds of votes as being equivalent. That is, it might be nice to know that about ___ percent of unregistered users thought this or that, but you'd want to take that with a huge grain of salt, whereas the same statement about registered users could, I think, be relied on as fairly precise. WhatamIdoing 19:27, 4 August 2011 (UTC)Reply

Quantification of representation of the world-wide populace

It would be very useful and interesting to quantify what fraction of different countries hold that this is an important (or unimportant) feature, particularly if it turns out that the countries with low representation in this poll and in Wikimedia's readership believe that this is a key issue. It is crucial that this is quantified given Wikimedia's pre-existing biases, which could mean that the poll will simply measure the selection bias present in those questioned.

If e.g. it turned out that American (or British/European) editors thought that this feature was unimportant, but that African/Asian cultures thought it was very important, then it would obviously be very important to implement it for Wikimedia's future growth (even if that implementation happens on a geo-located basis). If the vote came back simply saying that this was not important to Wikimedia's current editing populace, then it could be argued that this just shows a western bias rather than presenting meaningful results. Mike Peel 21:36, 23 July 2011 (UTC)Reply

I totally agree. I hope we can figure out a way to measure this while keeping anonymity etc. Side note: this is a complex (but hopefully solvable) type of problem -- how to take a global, representative referendum on any question that affects the projects? this is something we need to figure out how to do better, now that we are collectively so big. I'll be happy if this referendum is a test for good process. -- phoebe | talk 18:32, 31 July 2011 (UTC)Reply
Similarly, although perhaps somewhat less importantly, if the feature is strongly supported by women, then I hope that fact would be considered relevant to Wikipedia's future growth. WhatamIdoing 19:29, 4 August 2011 (UTC)Reply

Transferring the board

Maybe it could be wise to transfer the board to a more neutral country like Switzerland? aleichem 23:45, 24 July 2011 (UTC)Reply

ha :) which board? The WMF board doesn't actually really have a home (members are from the US, Europe & India at the moment); but we do govern the WMF which remains US-based. At any rate, the board's deliberations are more influenced by wikipedian-ness than they are by nationality :) -- phoebe | talk 18:29, 31 July 2011 (UTC)Reply
which remains? on what authority? aleichem 01:31, 7 August 2011 (UTC)Reply
The Wikimedia Foundation is a legal entity located in the USA; moving it to Switzerland would mean moving it to a country with laws much closer to European Union privacy law. I for one am not convinced that this could be done without deleting a lot of information we hold on living people. The Foundation is ultimately governed by a board, and several members of the board are elected by the community, either directly or via the chapters. So I suppose that if the community wanted to move the legal registration from the USA to Switzerland or indeed any other country, the first thing to do would be to present a case for this and the second would be to elect board members who favoured the idea. At the moment I think that is a long way off, and judging from the recent elections there is little if any support for fundamental change to the WMF. WereSpielChequers 08:07, 9 August 2011 (UTC)Reply

Inadvertent overwrite with telugu translation

Admins: Please restore the English version. Sorry for the inadvertent overwrite. -Arjunaraoc 12:01, 25 July 2011 (UTC)Reply

You don't need to be an admin to revert changes. ;-) I've reverted the page back to the English version. Mike Peel 13:07, 25 July 2011 (UTC)Reply

Neutrality

 
An example of what a completely non-neutral filtering tool would look like.

The FAQ for this referendum says that one of the principles behind the creation of the filtering tool is that "The feature is to be culturally neutral, and all-inclusive". However, the mock-up designs displayed show completely non-neutral filtering options. The IFR page says that "These are subject to change based on the outcome of the referendum and the realities of feature development, but it's likely that the final product would look very similar to these mock-ups". I think it would be very helpful if this were made clearer, explaining that these are just images showing what the interface style might look like, and that the actual feature will in fact be neutral, giving no possible filtering setting any higher availability than any other conceivable filter, and will not include any options like those given in the images displayed - assuming that this is the case, of course. If this is not the case, the FAQ must be corrected before the start of the referendum, in order to allow people to understand whether this feature would abolish NPOV before they vote. --Yair rand 22:29, 26 July 2011 (UTC)Reply

Hi Yair. Thanks for your comment, but I think you're likely misreading the intention behind these (admittedly vague) statements. The board specifically noted neutrality in design as a core principle -- we were thinking of things like category names. "Nudity" is pretty objective and neutral, for instance, as a descriptive category; "bad content" is not. This is important. That doesn't mean, however, that we can't pick certain categories of images to make hideable, as those that are commonly controversial across many cultures. I don't quite know what you mean about interface design; but I disagree quite strongly that the feature would "abolish NPOV" -- remember in this proposal nothing goes away permanently, and editing standards remain the same. -- phoebe | talk 18:18, 31 July 2011 (UTC)Reply
How is that relevant? The reason (well, one of them) behind rejecting repeated proposals for temporary/permanent hiding of certain "objectionable" content was that "objectionable" means something different for everyone, and hiding certain content would give preference to certain groups in violation of NPOV. That, as you helpfully pointed out, is exactly what this image filter will do - rather than, as SJ suggested above might be the case, giving options to hide such images in neutrally selected categories that give no preference to select groups. --Yair rand 20:32, 31 July 2011 (UTC)Reply
  • There will be no permanent hiding of anything. Every single reader will be able to see every single image he wants to see.
  • This is not a one-size-fits-all filter system. You decide what categories of material you find objectionable (if anything). It doesn't matter what everyone else wants to hide. Consequently, it doesn't matter if "objectionable" means something different for every reader.
  • Putting a 'click here to see the image' button on your screen doesn't impair NPOV at all. Failing to load the picture of the man jumping off the Golden Gate Bridge when the page loads the first time does not "give preference to certain groups" or "violate NPOV". WhatamIdoing 19:37, 4 August 2011 (UTC)Reply
  • It is a one-size-fits-all filter system, though, because the filtering categories with which the end user is presented, as well as the content of those categories, will be decided by the community. That will inevitably mean edit wars and non-neutrality. I too would much prefer the system that SJ suggested, but that is not what we will be voting on here.--Danaman5 00:57, 7 August 2011 (UTC)Reply
The fact that multiple options will be presented means that it's not a one-size-fits-all system. You could choose to see everything except one type of images, and I could choose to see everything except one different type of images, and the next person could choose to see everything except four types of images. The ability to configure it means that it's not one-size-fits-all. It might well be a 5040-sizes-fit-most system (that's the number of combinations available in the sample above), but it is not a one-size-fits-all system. WhatamIdoing 22:49, 14 August 2011 (UTC)Reply
You are wrong. The user can decide that he does not want to see "extreme politicians". But who will fit this description? Some will say Hitler, Stalin and Goebbels. Others will include Mao, Lenin and Mubarak. And still others will include Palin, Obama or George W. Bush. I predict a censors' war. --Eingangskontrolle 09:10, 16 August 2011 (UTC)Reply

What does this mean?

The last question: "It is important that the feature be culturally neutral: as much as possible, it should aim to reflect a global or multi-cultural view of what imagery is potentially controversial."

Does this mean that the categories would be drawn according to some kind of global average? Does it mean more categories would be created otherwise? I don't know what effect a vote either way would have on development. Wnt 19:49, 30 July 2011 (UTC)Reply

We were thinking, with this and a few other points, of sort of a straw poll on what principles are most important. I'm not yet sure what it would mean specifically for design, but the answer to this would help everyone know what's important to the wikimedia community at large (I can assume that many people agree with this principle, for instance, but not everyone does -- I expect a truly global standard would offend both those who think there is nothing controversial in our image collections, and those who think there is a great deal of controversial material). -- phoebe | talk 18:27, 31 July 2011 (UTC)Reply
I take a culturally neutral feature as being one that can be tuned to suit people of many diverse cultures, some might choose to filter out images that I'm happy to look at, others might choose to see photos that I'd rather not see. That means a much more sophisticated filter than a simple on off button, and consequently so many user chosen lines that there is no need to debate where one draws the line. If instead we were debating a simple NSFW option of excluding material deemed Not Safe For Work then yes we would need to work out a global average of prudishness. But we aren't so we don't need to. WereSpielChequers 07:54, 9 August 2011 (UTC)Reply

Meaning of the Referendum

I read the questions as listed now as "We've decided to implement this feature, so how should we do it?", and not as "Should we implement this feature?" Is that a correct reading?--Danaman5 06:06, 4 August 2011 (UTC)Reply

I would really like an official response to this from someone involved in planning this referendum. Is it already a certainty that this feature will be implemented, or not? If not, the questions really need to be revised.--Danaman5 01:00, 7 August 2011 (UTC)Reply

  • Hi Danaman5, yes, the Board passed a resolution asking for this kind of feature to be developed; so that's a given. The questions are asking for community input in how it is developed and implemented. -- phoebe | talk 05:00, 15 August 2011 (UTC)Reply
Rely on "the" category system? Many wiki projects have their own image uploads and their own categories, so we are dealing with many category systems. Currently the Commons category system is really quite an internal thing: the images displayed in an article on any other project are not affected in any way by what we do to the category system. Images are recategorized all the time and category structures are modified all the time; if filtering relies on the Commons category system, we can now cause huge disruption very easily in our everyday editing. We are currently going backwards as regards getting new uploads properly categorized, and I doubt the extra energy that will go into maintaining categories that comply with the filtering requirements will help that problem. I suppose an optimist would hope that the new emphasis put on Commons categories would mean that more people would come to Commons to help. But I expect that the filtering system will attract zealots trying to censor material by categorizing images that they (and no one else) want hidden, and given the current backlog in categorization, they will not be rapidly reverted. --Tony Wills 09:22, 9 August 2011 (UTC)Reply
Ok, I see from the Personal image filter page that the idea is to have a separate flat category branch for filtering, that would allay my fears (it might cause other problems, but wouldn't impinge on the current categorization of files). --Tony Wills 10:59, 9 August 2011 (UTC)Reply

Combination with content filters

When designing this feature, in case it gets accepted, perhaps you could also find a way to allow content-filtering software to integrate it into their products? Or perhaps a small program that automatically enforces certain filters, protected with a password? This would mean that parents who don't know how to set up the filtering themselves, or who want to prevent their children from disabling it, could also benefit from it. Adelbrecht 19:44, 5 August 2011 (UTC)Reply

This is a thorny but important issue. Currently we don't have supervised accounts where parents can set preferences for their children, hence this proposal where an 8 year old would be only a click away from overriding a filter. I would anticipate that any content filtering software should be programmable to use these filters - the bigger question is which content filters filter out the whole of Wikimedia and which only filter out certain images or categories. WereSpielChequers 04:49, 9 August 2011 (UTC)Reply

IP editors

The current mock ups show the edit filter option appearing even if you are logged out. I'm uncomfortable about this as it implies that one person using an IP could censor the viewing of others. I would be much less uncomfortable if this was cookie based, so I could make decisions as to what appeared on my pc. I don't like the idea that the last person to be assigned my IP address gets to censor my viewing. So my preference is that filters are only available if you create an account and login, or via a session cookie on that PC, not a permanent cookie. WereSpielChequers 04:49, 9 August 2011 (UTC)Reply

I believe it would be handled based on cookies, not based on an IP address. You're right that having the settings associated with an IP address just wouldn't make sense. Cbrown1023 talk 21:16, 9 August 2011 (UTC)Reply
Thanks for the explanation. I can see an advantage that schools will be able to set a filter, but of course anyone will be able to change the filter for the whole school IP. Libraries and internet cafes will be in a similar situation. I foresee this leading to lots of queries. WereSpielChequers 14:58, 10 August 2011 (UTC)Reply
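(To make the cookie-based approach discussed above concrete: a sketch of encoding the filter preferences into a cookie value that travels with the browser rather than being tied to an IP address. The cookie name, the serialisation format and the category names are all invented for illustration.)

  # Sketch: filter preferences stored per browser via a cookie (hypothetical
  # name "imgFilterPrefs"), not per IP address.
  def encode_prefs(hidden_categories):
      """Serialise the set of hidden categories into a cookie value."""
      return "|".join(sorted(hidden_categories))

  def decode_prefs(cookie_value):
      """Parse the cookie value back into a set of hidden categories."""
      return set(cookie_value.split("|")) if cookie_value else set()

  # On each request the server (or a client-side script) reads the cookie and
  # decides which images to collapse; two people behind the same school IP but
  # on different machines therefore keep separate settings.
  value = encode_prefs({"nudity", "violence"})
  print(decode_prefs(value))  # {'nudity', 'violence'} (order may vary)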

How will "the community will decide the category structure and determine [image categorization policy]"?

What images will be hideable?
Hideable images will be grouped into categories. The community will decide the category structure and determine which images should be placed within those categories.

As others have said above, (I redundantly reiterate their sentiments merely to bump the issue), this will just lead to categorization edit-wars and contentious, unproductive debates over which categories are worthy of being censorable (or possibly significant server strain to accommodate ridiculously flexible censorship options). I could perhaps envision semi-private categories created by subcommunities, but this would have obvious ownership+neutrality issues. On the whole, I think the entire thing is much better left up to third-party content filter providers; there is no need for Wikimedia to so entangle itself. I, for one, shall therefore vote this down. There are the best of intentions at work here and some good ideas, but I fear the actual plan of execution is fatally marred and frustratingly vague. Good day, --Cybercobra (talk) 07:27, 9 August 2011 (UTC)Reply

I tend to agree. However worthy the idea, I don't think our category systems are robust enough to make filtering reliably workable. Some of the ideas about the category system (which I take to really mean the Commons category system) assume that mis-categorization, or malicious (re)categorization would be noticed and fixed in short order - I don't think this can be expected to happen reliably, there aren't enough eyes at work here, and as far as I know you can't "watchlist" the contents of a category. I think any filtering or censorship needs to be done outside the current projects. If there aren't already such projects I think a children-safe wikipedia project would be one approach. --Tony Wills 09:39, 9 August 2011 (UTC)Reply
I see from the Personal image filter page that the idea is to have a separate flat category branch for filtering, that would allay my fears (it might cause other problems, but wouldn't impinge on the current categorization of files). --Tony Wills 10:59, 9 August 2011 (UTC)Reply
Even if the categorisation is currently incomplete that isn't an argument not to do this. If someone wants to filter out penises and the filter only filters out 95% of them that is a job more than 95% done, not least because a complaint about one of the 5% is easily resolved with hotcat rather than as at present an argument as to whether something is encyclopaedic or educational. WereSpielChequers 18:05, 9 August 2011 (UTC)Reply
An interesting side effect of this sort of filtering is that editors may be less inclined to self-censor, and will add more provocative images to articles on the basis that those who do not like them can filter them out. --Tony Wills 22:55, 9 August 2011 (UTC)Reply
Or the opposite could happen with people choosing to illustrate articles with images that people are willing to see. But my suspicion is that the vast majority of current editors will censor few if any images that are actually used in articles. This is something that a small minority of editors are very keen on and which will have little or no effect on editors who choose not to use it. WereSpielChequers 02:18, 10 August 2011 (UTC)Reply

Please be sure to release ballots

The data is invaluable. To "announce the results", release the full dataset, minus usernames. Subtle trends in the data could affect the implementation. E.g. if target regions such as Brazil or India overwhelmingly want this feature, that alone might be a good reason to build it. --Metametameta 10:03, 10 August 2011 (UTC)Reply

Well, I'd certainly like to see the (properly anonymised, of course) dataset, if only because I'm a sick person that enjoys analysing stacks of data. Disagree with your example though, the opinion of someone in India or Brazil should not be considered any more important than the opinion of someone in Zimbabwe, Australia, or Bolivia (to give some examples). Craig Franklin 08:03, 16 August 2011 (UTC).Reply

Referendum start moved to Monday, August 15

The start of the referendum has been moved to Monday, August 15, and all of the other dates will be bumped by around 3 days. Instead of rushing to put the technical aspects together quickly, we decided it would be better to just push the start date back a little and do things right. Updated schedule:

  • 2011-08-15: referendum begins.
  • 2011-08-17: spam mail sent.
  • 2011-08-30: referendum ends; vote-checking and tallying begins.
  • 2011-09-01: results announced.

This gives us more time to finish up the translations and look at the proposals. :-) Cbrown1023 talk 19:01, 12 August 2011 (UTC)Reply

Hi. I've got some questions in IRC regarding why the referendum has not started today as scheduled and I can't find any information on-wiki regarding this. Any ideas? Best regards, -- Dferg 15:34, 15 August 2011 (UTC)Reply
Generally we are just waiting on the rollout of notices and making sure the voting software is coordinated... I think we are suffering from having people in lots of different time zones, so eg. while it has been the 15th of August for a while in some places, it is only just still 9:00am in San Francisco :) We are planning to get it out *sometime* today. I don't know if there's been any other technical delays. -- phoebe | talk 16:32, 15 August 2011 (UTC)Reply
Indeed. It's going to start sometime today. :-) Cbrown1023 talk 16:38, 15 August 2011 (UTC)Reply

Sigh ... I guess I'll be dismissing more global notices over the next two weeks :/. Please no big banners. fetchcomms 19:16, 15 August 2011 (UTC)Reply

Self censorship is fine, thus this is a great idea

Hopefully this will decrease attempts by people to censor content seen by others, such as we see here [1] Doc James (talk · contribs · email) 21:32, 14 August 2011 (UTC)Reply

Precisely. I can't understand why so many people are opposing a plan that will increase users' freedom to avoid content they don't want. Nyttend 04:51, 16 August 2011 (UTC)Reply
Because it can and will be abused by censors (there is nothing easier than injecting a cookie at the firewall level saying "I don't want to see this kind of content" - which will have nothing to do with the user's actual preferences). Ipsign 07:13, 16 August 2011 (UTC)Reply
Well my concern at this proposal is quite simple: how will the filters be implemented? How will images be determined to fall into a given category? Consider Jimmy Wales' misguided attempt to delete "pornographic images" from Commons, which resulted in him selecting several acknowledged works of art -- & amazingly, none of the images of penises. The reason filters haven't been implemented yet, despite on-&-off discussion of the idea for years, is that no one has proposed a workable way to separate out objectionable images -- how does one propose criteria that define something, say, as pornographic without at the same time including non-pornographic images? (Dissemination of information about birth control in the early 20th century was hampered because literature about it was treated like pornography.) Explain how this is going to work & convince me that it is workable -- then I'll support it. Otherwise, this proposal deserves to be filed here & forgotten. -- Llywrch 06:05, 16 August 2011 (UTC)Reply

Scope

If I understand this proposal correctly, the settings are to apply to each user's own account(s) and are customisable. I do not see this to be a problem, as it would be entirely a matter of personal choice. However, assuming that non-logged-in users are able to see the full gamut of images from the smutty to the explicit, I fail to see the point: creating such a system, which may or may not include the rather elaborate suggested protection system requiring passwords, seems to be technology for its own sake. Something that can be circumvented by simply logging out has to be pointless. --Ohconfucius 04:56, 16 August 2011 (UTC)Reply

Agreed! --81.173.133.84 08:12, 16 August 2011 (UTC)Reply

Self Censorship?

(Translated from Spanish:) For decades people have fought to free themselves from dictatorial censorship, and to be able to offer a complete, secular education to everyone. How is it that here we can speak of "controversial images"? In such cases the article should be revised - provided, of course, that this does not mean manipulating history; I do not consider it acceptable, for example, to censor historical images of war, sex education, photos of native communities, or images of rebellions and torture. In that sense, each person is responsible for what they want to see, and can set up parental controls over images on their own computer or mobile device. If the problem is children... perhaps parents are afraid because they do not feel able to give an explanation suited to their children's age, in which case they should seek a professional to help them talk about it. It is children who most need to understand the world around them and its origins, so as to build a better future, more united in peace. Do not leave them in ignorance; that is the goal of this project: knowledge within everyone's reach, free of sick censorship.

Waste of time

Here is my comments that I made in my vote:

There is very little to be gained if this feature were introduced. The way Wikipedia is written, it would be unusual for offensive images to appear where they are not expected. To quote the example used in this poll, a reader will not get a picture of naked cyclists when viewing the bicycle article. Editors generally use images that are applicable to an article, since they are conscious of the wide range of sensitivities of the readers of Wikipedia. Images that are offensive to some may be present on an article about some aspect of sex, for instance, but in these cases the reader may be unwilling to view the article.

Implementing this proposed feature will place yet another burden on the overworked editors. Rather than placing this burden on editors, the vast majority of whom are volunteers, a better option is to have a clear warning on every page stating that Wikipedia is not censored in any way. This can easily be added to the footnote containing other disclaimers that currently appears on all Wikipedia pages.

If parents or guardians wish to control the browsing habits of children in their care, they can use other means of control such as supervision or content-filtering software.

The internet contains huge amounts of easily accessible material that is considered by some to be offensive. Those who are offended by certain images must take responsibility for how they browse the internet in order to avoid offence. Alan Liefting 07:11, 16 August 2011 (UTC)Reply

Pictures of Muhammad

How will the Wikimedia Foundation prevent people from voting for the option to remove pictures of en:Muhammad?

  • The recent version of the important article ar:محمد already seems to contain no picture of Muhammad, so it is de facto censored.
  • Some people may argue, they feel hurt by depictions of Muhammad and that for them, these are images of violence.
  • The voting system is very vulnerable to abuse, e.g. by voting with multiple accounts, or by external organizations orchestrating votes.
  • You want to ask users to give their view on how important it is "that the feature be culturally neutral (as much as possible, it should aim to reflect a global or multi-cultural view of what imagery is potentially controversial)". What will happen if the average outcome for this question turns out to be less than 10? What will happen if the outcome turns out to be below 5?

--Rosenkohl 11:03, 16 August 2011 (UTC)Reply

Won't it be the users themselves who choose which type of images they want to hide, and which to view? Depictions of Muhammad may be tagged as such (instead of 'violence, etc.), so that readers who may be offended by such images may filter them out. --Joshua Issac 12:04, 16 August 2011 (UTC)Reply
How about any image of humans? We are all God's image and therefore not to be portrayed. --Eingangskontrolle 12:22, 16 August 2011 (UTC)Reply

What does "culturally neutral" mean?

I'm afraid I don't understand what is meant in the questionnaire by the words "culturally neutral" and "global or multi-cultural view".

Some readers will probably think that something being "culturally neutral" means that it is applied to all persons in the same way, regardless of their particular culture; so that a "culturally neutral" filter would never filter any image on the grounds that it may offend the culture of a reader - e.g. a "culturally neutral" filter would never filter pictures of Mohammad (this at least was what I thought when reading the questionnaire for the first time today).

However, I now suspect that "culturally neutral" here has exactly the opposite meaning: that the filter takes account of the culture of the reader, regardless of which culture that is; so that a very "culturally neutral" filter would allow people to choose not to see pictures of Mohammad.

The same applies to the meaning of "global" and "multi-cultural". I would think a "global" encyclopaedia provides the same knowledge, content and pictures for all readers, regardless of their particular local place, country, or culture. However, in the questionnaire "global or multi-cultural view" may mean the exact opposite: that a "global" and "multi-cultural" encyclopaedia would provide each reader with a different kind of knowledge, content and pictures, adjusted to their local place, country, or culture.

So I think the wording of the questionnaire is highly misleading, if comprehensible at all. --Rosenkohl 13:08, 16 August 2011 (UTC)Reply

Tired of censorship

Many are becoming increasingly concerned about the range of unpleasant or explicit material available on the internet. However, many are also becoming rather concerned about the limitations on free speech and the growing acceptance of censorship. How does one reconcile these two? I acknowledge that Wikipedia is a community, but it is also an encyclopaedia.

When I was in school, I remember watching holocaust documentaries, etc. But it was important to have seen these things, however unpleasant. It was important to our education that we could understand. Had these images been censored, we would have felt cheated of a proper history education. If censorship of that kind is not practiced in a school, I dread to imagine the day it is accepted for an encyclopaedia.

We must consider the context of these images. They exist in an encyclopaedia (albeit online) and are almost certainly relevant to the subject. Some images may be unpleasant, but we shouldn't simply filter them out because of it. If a child were to look in an encyclopaedia, they may find the same materials. Equally, there are academic books and journals available in every library containing the most horrific of images, but they are included because of their educational value. Filtering out relevant (or potentially highly relevant) images seems to be more harmful than helpful.

Whilst this would only affect those who choose to censor themselves, we are still permitting censorship and promoting ignorance. If the image is relevant to the subject, then the end user is placing restrictions on their own learning and filtering out content because of personal intolerance. Wikipedia aims to educate people, but it can't do that by filtering out their access to useful information (in the form of images), regardless of whether people make that choice. To paraphrase Franklin, those who would give up essential liberty don't deserve it.

Alan Liefting makes a good point. Why not simply have a content warning? If one watches a documentary, there is often a warning that the piece contains sex, violence, etc. The viewer is advised to use their discretion in watching the documentary, but the content is certainly not censored. Why should an encyclopaedia be any different?

Wikipedia was built on ideas of freedom of speech. Perhaps it's about time we defended it and took a harder line on censorship. If parents or particular institutions wish to block content, they are free to do so through various software, DNS services, etc. If they want to censor content, that's their decision, but we shouldn't be doing it for them. Zachs33 11:15, 16 August 2011 (UTC)Reply

Translation problems

Though the Ukrainian translation of the questions is ready, this page is not in Ukrainian. How can this be fixed? --A1 11:30, 16 August 2011 (UTC)Reply


Voting instructions

The voting instructions say "Read the questions and decide on your position." But the main questions I see are "Why is this important?", "What will be asked?", and "What will the image hider look like?", and I don't think I need to form a position on these. It's very confusing. Am I voting on how much I support or oppose each of the bullet points in the "What will be asked?" section, or how much I support or oppose the proposal as a whole? It's rather unclear. Either way, these are "questions" I can "read", but proposals (or principles) I can support or oppose. Could this be reworded for clarity? Quadell 11:33, 16 August 2011 (UTC)Reply

Another blow against freedom

A reason given for this change in your introduction is "the principle of least astonishment". In four years of using Wikipedia several times daily I have never accidentally come across an image that astonished me. That's a pity. Astonishment is a valuable ingredient of learning. I fear that the people who are pressing for this change are not those who are afraid they may see a picture of a nude cyclist (an example that you give), but those who wish to prevent others from this. It won't work. My mother, an astute psychologist, forbade me to read any of the books in my brother's room as they "were too old for me", with the result that I had read them all by the age of ten. Labelling an image as hidden will only entice people to look and find out why. When the people pressing for the change discover that it doesn't work, they will press for stronger controls. And then for even stronger controls. Freedom is surrendered in a series of tiny steps, each of which appears innocuous on its own. No doubt you will assure readers that you won't follow that path. You won't be able to keep that promise. Once on the slippery slope it's very difficult to get off. Apuldram 12:01, 16 August 2011 (UTC)Reply

Images are editorial choices

In Wikipedia, images are editorial choices made by the contributors; I see no reason to offer a self-censoring tool for only this part of those editorial choices. Moreover, an image illustrates a topic but should (more realistically, may) also be commented on and described in the article itself; hiding the illustration will make that part of the article useless or really weird.

IMO it's not our job as a community to implement a parental-control solution; it's up to the parents. Does anyone want those kinds of censorship tools in museums or castles, so as not to display statues of naked men, women and children? To sum up, a waste of time and money IMO. --PierreSelim 12:34, 16 August 2011 (UTC)Reply

What does the referendum mean?

 
An image many will avoid seeing

The official start of censorship in Wikipedia. --Eingangskontrolle 12:31, 16 August 2011 (UTC)Reply

Don't be naive. Conservatives won't be offended by evolution enough to opt out of being forced to view illustrations or evidence of it. This is about penises:
 
Something children should have the freedom to choose not to view.
--Michaeldsuarez 13:16, 16 August 2011 (UTC)Reply
Michaeldsuarez: No doubt you are tempting users to remove the image to prove your point. I don't particularly wish to see the image and am certainly tempted to remove it (although I shall refrain from doing so), but that's because I don't wish to view the kind of content to which it belongs. If I were looking at articles on masturbation techniques, to which the image belongs, I would have no problem with the image. A child will only come across the above image if viewing a relevant article, which will be of an explicit nature, both in images and in text. My only problem with the image is that it does not belong on this talk page as it lacks relevance. I understand your point, but it's rather weak. Zachs33 13:44, 16 August 2011 (UTC)Reply
This is a free image, not a fair use image. If I wanted to, I could post 50 of them on my userpage. It's more relevant than the finch image inserted earlier. There are ways for children to wander onto enwiki's masturbation article. There are links to that article from the "marriage" article, the "Jock" disambiguation page, the "List of British words not widely used in the United States" article, etc. The only image I've ever removed from an article was a drawing of a little girl by a non-notable pedophilic DeviantART artist. I'm not against images such as the one I've embedded above. I won't be censoring myself. I'm an Encyclopedia Dramatica sysop. What I'm against is imposing my will and beliefs on others. Users ought to have a choice. --Michaeldsuarez 14:00, 16 August 2011 (UTC)Reply