{{rfctag|policy}}
THIS IS A DRAFT RFC THAT IS NOT YET LIVE, PLEASE DO NOT COMMENT YET, BUT EDITING OF THE ARGUMENTS AND FACTS BY INTERESTED PARTIES IS ENCOURAGED
Questions for community consideration
There are two primary, and related, questions that have been raised recently at discussions regarding the spam blacklist:
- Should we blacklist sites that pay authors for contributions, based on traffic, such as ehow.com, examiner.com, triond.com and associatedcontent.com?
- Should the reliability of a source be a factor in de-blacklisting or in whitelisting?
Background facts
The blacklist is a page in the MediaWiki namespace. Unlike the Meta blacklist, this blacklist affects pages on the English Wikipedia only.
- Blacklisting prevents editors from adding a hyperlink to the blacklisted site.
- Blacklisting is abuse/evidence-based and should not be used pre-emptively.
- Removal requests are handled with outcomes based on community-established policies and guidelines, such as verifiability and reliable sourcing. Examples: 1, 2
- In common practice, the reliability of a site is considered as a factor, in addition to evidence of abuse, in blacklisting decisions. Examples: 1, 2
- The blacklist extension has been in use on the English Wikipedia since July 2007. ArbCom recently ruled:
As blacklisting is a method of last resort, methods including blocking, page protection, or the use of bots such as XLinkBot are to be used in preference to blacklisting. Blacklisting is not to be used to enforce content decisions.
— ArbCom, Passed 10 to 0, 16:39, 18 May 2009 (UTC)
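The mechanics described in the background facts can be sketched as follows. This is a simplified illustration in Python, not the extension's actual code (the SpamBlacklist extension is written in PHP): each non-comment line of the blacklist page is a regular-expression fragment, the fragments are combined into one pattern, and an edit that adds a matching external link is rejected. The entries shown are hypothetical examples, not entries from the live list.

```python
import re

# Illustrative blacklist page content; real entries live at
# MediaWiki:Spam-blacklist. Lines starting with "#" are comments,
# and every other non-blank line is a regex fragment.
BLACKLIST_PAGE = """
# MediaWiki:Spam-blacklist (hypothetical example entries)
\\bexample-spam-site\\.com\\b
\\bspammy\\.webs\\.com\\b
"""

def build_blacklist_regex(page_text):
    """Combine all non-comment lines into a single alternation pattern."""
    fragments = [
        line.strip()
        for line in page_text.splitlines()
        if line.strip() and not line.strip().startswith("#")
    ]
    return re.compile("|".join(fragments), re.IGNORECASE)

def edit_is_blocked(blacklist, urls_added):
    """Return True if any external URL added by the edit matches the blacklist."""
    return any(blacklist.search(url) for url in urls_added)

blacklist = build_blacklist_regex(BLACKLIST_PAGE)
print(edit_is_blocked(blacklist, ["http://example-spam-site.com/page"]))  # True
print(edit_is_blocked(blacklist, ["http://reliable-site.org/article"]))   # False
```

Note that because the entries are regexes rather than plain domains, a single listing such as `\bwebs\.com\b` can cover every subdomain, which is exactly the "low-threshold" scope question this RFC raises.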
Typical examples which show the persistence of the problem
Here are some typical examples of the efforts spammers make to escape blacklisting (I will look for examples related to the sites under scrutiny, but these already show how the incentive of being linked to can drive editors):
- E.g. see User:Japanhero, who has created a large number of sock accounts using easily changed IPs to spam a large number of subdomains of webs.com. This example shows that specific blacklisting and blocking are insufficient: numerous (full) page protections would be needed to prevent the editor from adding the links, since accounts are quickly autoconfirmed. Blacklisting the whole of webs.com would solve the problem, but the site contains too much good information.
- MediaWiki_talk:Spam-blacklist#brownplanet.com. Besides spamming this domain, the spammer also linked to a large number of redirect sites (to which they seem to have ready access). Again, a clear example of how blacklisting alone is not sufficient.
Should we blacklist sites that pay authors for contributions, based on traffic, such as ehow.com, examiner.com, triond.com and associatedcontent.com?
Arguments for
The arguments for this are that such sites represent a unique and significant conflict of interest, which will motivate authors to spam them to an extent that low-threshold, or even pre-emptive, blacklisting could be justified. "Low-threshold" here means blacklisting the entire site once abuse of one article or subdomain is detected, because other subdomains or documents are easily created to circumvent specific blacklisting. Blocking accounts or protecting pages is insufficient, as the drive to earn money will cause editors to return under new accounts or on new pages. XLinkBot listings are unsuitable because the financial motivation to spam will override the small speed bump of being reverted once or of having to establish an account.
Arguments against
The arguments against this are that these sites host a wide range of content from a wide range of sources, with varying reliability, and the decision whether or not to use such links and references should be an editorial one, not an administrative one. XLinkBot listings are appropriate as a reminder to inexperienced users that they should think twice before using such links, while still leaving the ultimate decision on suitability to the editor and article talk page consensus.
(problems with the wording:
- When several discussions on WP:RS / WP:RS/N about a site, involving several editors, have already reached the general conclusion that the site is unreliable, then the decision not to use these links has already been made by editors; administrators merely enforce that decision. I objected earlier to calling this an administrative decision because, though it may appear that way, it is not.
)
Should the reliability of a source be a factor in de-blacklisting or in whitelisting?
Arguments for
The reliability of a source is not the primary motivation for blacklisting a site; the fact that it is being spammed is. That said, even a few links being spammed to an unreliable source can justify blanket blacklisting. The reliability of a source is considered for whitelisting decisions and decisions about the scope of a blacklist listing. If an unreliable source is blacklisted, the community doesn't lose as much as if a reliable source is blacklisted.
(Problems with wording:
- Nope, the reliability has often already been considered by editors, and the source was found not to be reliable.
)
Arguments against
The nature of a wiki is that editors are encouraged and allowed to be bold. Decisions about the suitability of content are editorial, not administrative. Judgment about the reliability of a source should not happen on whitelist request pages; it should happen on article talk pages and through the normal process of bold, revert, discuss. Blacklisting is a last resort; it should not be used as a pre-emptive measure.
(Problems with the wording of this:
- I don't think judgment is being exercised during de-blacklisting or whitelisting requests; that judgment has already been passed (see the numerous discussions on WP:RS about examiner.com, where editors have already decided it is not suitable). Moreover, editors requesting whitelisting are generally asking for confirmation that a specific document is reliable. There is ample time and place to discuss whether the specific document is reliable, and when such discussion is incorporated into the whitelisting request, whitelisting goes easily and quickly. Also, blacklisting is generally not a pre-emptive measure; it is used to stop abuse/spam, and that is what it has been used for: stopping the spam from editors who add ehow.com, associatedcontent.com or examiner.com links.
).