Edgars2007 (talk | contribs)
(→Remove Persondata: proposal)

:::::I can extend my tool. Please let me know when you reach consensus on this topic. Regards, -- [[User:T.seppelt|T.seppelt]] ([[User talk:T.seppelt|talk]]) 22:29, 28 October 2015 (UTC)
::::::{{Ping|T.seppelt}} Thank you. Consensus has been reached, in not one, but ''two'' RfCs. <span class="vcard"><span class="fn">[[User:Pigsonthewing|Andy Mabbett]]</span> (<span class="nickname">Pigsonthewing</span>); [[User talk:Pigsonthewing|Talk to Andy]]; [[Special:Contributions/Pigsonthewing|Andy's edits]]</span> 23:36, 28 October 2015 (UTC)
:::::::{{Ping|Pigsonthewing|Izno|Magioladitis}} Now I read the whole discussion. What I can offer is the following: KasparBot goes through all pages which transclude {{tl|Persondata}} and compares the information with the statements, labels, descriptions and aliases of the connected Wikidata item. Missing information is added to Wikidata. Afterwards, all the data that is also stored in Wikidata is removed from the article. Problems will be tracked in a special database which can be accessed using a tool I am going to develop. If no data remains in the article, the whole template will be removed. This procedure is exactly the same as the one I am using for {{tl|Authority control}}. What do you think? -- [[User:T.seppelt|T.seppelt]] ([[User talk:T.seppelt|talk]]) 20:13, 29 October 2015 (UTC)
:{{ping|Magnus Manske}} because you have experience with setting up neat tools for Wikidata--I'm not sure this would be exactly up your alley, but figured I'd ping you. --[[User:Izno|Izno]] ([[User talk:Izno|talk]]) 11:46, 28 October 2015 (UTC)
::All the data that can sensibly be transferred to Wikidata by a bot has already been transferred; that was discussed at length in the first RfC. <span class="vcard"><span class="fn">[[User:Pigsonthewing|Andy Mabbett]]</span> (<span class="nickname">Pigsonthewing</span>); [[User talk:Pigsonthewing|Talk to Andy]]; [[Special:Contributions/Pigsonthewing|Andy's edits]]</span> 13:48, 28 October 2015 (UTC)

Revision as of 20:13, 29 October 2015
This is a page for requesting tasks to be done by bots per the bot policy. This is an appropriate place to put ideas for uncontroversial bot tasks, to get early feedback on ideas for bot tasks (controversial or not), and to seek bot operators for bot tasks. Consensus-building discussions requiring large community input (such as request for comments) should normally be held at WP:VPPROP or other relevant pages (such as a WikiProject's talk page).
You can check the "Commonly Requested Bots" box above to see if a suitable bot already exists for the task you have in mind. If you have a question about a particular bot, contact the bot operator directly via their talk page or the bot's talk page. If a bot is acting improperly, follow the guidance outlined in WP:BOTISSUE. For broader issues and general discussion about bots, see the bot noticeboard.
Before making a request, please see the list of frequently denied bots, either because they are too complicated to program or because they lack consensus from the Wikipedia community. If you are requesting that a template (such as a WikiProject banner) be added to all pages in a particular category, please be careful to check the category tree for any unwanted subcategories. It is best to give a complete list of categories that should be worked through individually, rather than one category to be analyzed recursively (see example difference).
- Alternatives to bot requests
- WP:AWBREQ, for simple tasks that involve a handful of articles and/or only need to be done once (e.g. adding a category to a few articles).
- WP:URLREQ, for tasks involving changing or updating URLs to prevent link rot (specialized bots deal with this).
- WP:SQLREQ, for tasks which might be solved with an SQL query (e.g. compiling a list of articles according to certain criteria).
- WP:TEMPREQ, to request a new template written in wiki code or Lua.
- WP:SCRIPTREQ, to request a new user script. Many useful scripts already exist, see Wikipedia:User scripts/List.
- WP:CITEBOTREQ, to request a new feature for WP:Citation bot, a user-initiated bot that fixes citations.
Note to bot operators: The {{BOTREQ}} template can be used to give common responses, and make it easier to keep track of the task's current status. If you complete a request, note that you did with {{BOTREQ|done}}, and archive the request after a few days (WP:1CA is useful here).
Manual settings: When exceptions occur, please check the setting first.
Bot-related archives
You may use {{Archive basics}} to update |counter= to 67, as Wikipedia:Bot requests/Archive 66 is larger than the recommended 150 KB.
Redirects to lists, from the things they are lists of
Please could someone do this:
- For every article titled "List of foo"
- if the article called "Foo" exists; do nothing
- otherwise, create "Foo" as a redirect to "List of foo"
For example, I just created Birds of Tunisia as a redirect to List of birds of Tunisia.
This might usefully be added to a list of monthly cleanup tasks, for new "List of..." articles. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:06, 6 May 2015 (UTC)
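The request above boils down to a small title transformation plus an existence check; a minimal sketch of the transformation in Python (the page-existence check and redirect creation would go through the MediaWiki API, e.g. via pywikibot, which is assumed here and not shown):

```python
def base_title(list_title):
    """Derive the redirect title for a "List of foo" article.

    "List of birds of Tunisia" -> "Birds of Tunisia"; returns None
    when the title does not follow the "List of ..." pattern.
    """
    prefix = "List of "
    if not list_title.startswith(prefix):
        return None
    rest = list_title[len(prefix):]
    # Capitalize the first letter, since "foo" becomes a page title.
    return rest[:1].upper() + rest[1:]

# A bot would then, for each list article t where base_title(t) does
# not yet exist, create base_title(t) as "#REDIRECT [[t]]".
```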
- Doing... - Though I have messaged WikiProject Lists to check consensus first. Jamesmcmahon0 (talk) 12:54, 7 May 2015 (UTC)
- Thank you. Please see also #Century-item redirects, below. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 19:36, 9 May 2015 (UTC)
- BRFA filed - Wikipedia:Bots/Requests for approval/MoohanBOT 8 It is just for this task as I had already generated the list of pages needed and there seems to be no opposition to it. I will have a look at #Century-item redirects in a few days but feel free to jump ahead GoingBatty as that one may be outside of my regex expertise... Jamesmcmahon0 (talk) 11:05, 10 May 2015 (UTC)
It appears that User:Jamesmcmahon0 has dropped this. Can anyone else help, please? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 22:29, 5 July 2015 (UTC)
- I see User:Jamesmcmahon0 has been editing again... Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 21:30, 3 August 2015 (UTC)
- The BRFA expired, this task is now open for grabs again.—cyberpowerChat:Online 20:20, 7 September 2015 (UTC)
- This is a two-minute job. All the best: Rich Farmbrough, 03:12, 16 October 2015 (UTC).
The BRFA has been denied for reasons unrelated to the task itself. Another editor is welcome to take this on. @Jamesmcmahon0: Are you willing to reopen the old BRFA? — Earwig talk 23:15, 22 October 2015 (UTC)
- @Jamesmcmahon0: And I have refiled the BRFA. PhilrocMy contribs 11:30, 23 October 2015 (UTC)
Accidental template protection
Occasionally, I've noticed that an article has been mistakenly template-protected. Perhaps a bot could monitor the protection log, and if a page in a namespace other than Template, Module, User, or Wikipedia is template-protected, deliver a "did you mean to do this" message to the protecting admin, e.g.
- Hello ''administrator name''. On ''date'' you template-protected [[''page'']] ([https://en.wikipedia.org/w/index.php?title=Special%3ALog&type=protect&page=''page (url-encoded)'' log]). As template protection is only meant to be used for templates, or other highly transcluded pages, did you perhaps mean to select a different level? Thanks, ''bot signature''
or similar. - Evad37 [talk] 03:54, 17 June 2015 (UTC)
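The namespace check itself is tiny. A sketch in Python, with the caveat that the protection-level name ("templateeditor") and the namespace labels are assumptions about how the log data is exposed by the API:

```python
# Namespaces where template protection is plausible; a template
# protection anywhere else triggers a courtesy note to the admin.
EXPECTED_NAMESPACES = {"Template", "Module", "User", "Wikipedia"}

def needs_notice(namespace, protection_level):
    """True when a template-protection action looks accidental."""
    return (protection_level == "templateeditor"
            and namespace not in EXPECTED_NAMESPACES)
```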
- Unsure if necessary - couldn't you just add something to said template? (Just lurking WP:BOTREQ to see what sort of things people want bots to do). E. Lee (talk) 04:59, 17 June 2015 (UTC)
- @Elee: Which template are you talking about? And how does adding something to a template fix a wrongly applied protection level? - Evad37 [talk] 05:50, 17 June 2015 (UTC)
- For an example of the problem and proposed solution (letting admins know that they may have made a mistake so they can fix it), see User talk:Ponyo#Maithali protection level, or User talk:Black Kite#Farshad Fotouhi protection level - Evad37 [talk] 05:54, 17 June 2015 (UTC)
- There are padlock templates added to protected pages. These "sense" if they are incompatible with the protection actually used, I believe, and put the page in a category to be fixed.
- Arguably there is something that could be done along these lines.
- A list of template protected articles can be found here (currently empty). A bot could check this, and act upon it. All the best: Rich Farmbrough, 21:43, 27 July 2015 (UTC).
- Such a bot should also check the queue for move protection. ~ RobTalk 14:11, 9 September 2015 (UTC)
Articles with {{Infobox Journal}} seek bot to ensure redirects are in place
As discussed at Wikipedia_talk:WikiProject_Academic_Journals#Bot_task?, there are several fairly standard redirects needed for each article in this project using that infobox. The box has parameters for the journal title and its ISO abbreviation. Citations routinely vary the capitalization, abbreviations, and punctuation of these abbreviations, creating a need for redirects from each common variation to the actual article title (usually the same as the journal title, in sentence case). Is there a bot that might be suited to the task? LeadSongDog come howl! 01:23, 18 June 2015 (UTC)
- I've obtained the ISO 4 vocabulary to convert, e.g., "European Physical Journal" to "Eur. Phys. J."; it's a spreadsheet-format version of the PDF available at issn.org. Could you bot-wizards please tell us if such a conversion would be simply too complicated? Thanks! Fgnievinski (talk) 02:30, 30 July 2015 (UTC)
- Maybe an easier and useful thing to do instead would be to start from Infobox journal's title field (e.g., "European Physical Journal") and its manually-entered abbreviation field ("Eur. Phys. J."), and create the desired redirects: e.g., "European physical journal", "Eur. Phys. J.", "Eur Phys J", "eur phys j", "E. P. J.", "E.P.J.", "E P J", "EPJ". Fgnievinski (talk) 02:45, 30 July 2015 (UTC)
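Generating the variant list from a manually entered abbreviation is straightforward string handling; a sketch covering the forms listed above (the exact set of variants worth creating is a judgment call for the bot operator):

```python
def abbrev_variants(abbrev):
    """Generate common citation variants of an ISO 4 abbreviation.

    "Eur. Phys. J." -> dotted, undotted, lowercase, and initialism
    forms, roughly as proposed above.
    """
    words = abbrev.replace(".", "").split()
    return {
        abbrev,                                 # "Eur. Phys. J."
        " ".join(words),                        # "Eur Phys J"
        " ".join(words).lower(),                # "eur phys j"
        " ".join(w[0] + "." for w in words),    # "E. P. J."
        "".join(w[0] + "." for w in words),     # "E.P.J."
        " ".join(w[0] for w in words),          # "E P J"
        "".join(w[0] for w in words),           # "EPJ"
    }
```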
Replacement of Template:Infobox Country World Championships in Athletics
Hello. Could I hire a bot to substitute all transclusions of {{Infobox Country World Championships in Athletics}}, per the outcome of this TfD? Alakzi (talk) 13:12, 20 June 2015 (UTC)
- Same with {{Infobox China station}} and {{Infobox Japan station}}, but using the sandbox version. Alakzi (talk) 17:34, 20 June 2015 (UTC)
- {{Infobox Country World Championships in Athletics}} done - thanks Plastikspork. Alakzi (talk) 16:00, 25 June 2015 (UTC)
- @Alakzi: is this still pending or done? Mdann52 (talk) 18:18, 28 August 2015 (UTC)
- China and Japan station are pending. Alakzi (talk) 18:20, 28 August 2015 (UTC)
- Hi. I'm working on eliminating the backlog of requests, one request at a time. Unless, someone else takes this one, I will hopefully get to it soon.—cyberpowerChat:Online 00:37, 29 August 2015 (UTC)
- @Cyberpower678: Mind if I steal this one, if you haven't started on it yet? I'm working on clearing WP:TFD/H and there's a handful of templates that can be handled in one BRFA, including the two remaining here. ~ RobTalk 16:06, 3 September 2015 (UTC)
Updating US Census Estimates
Is there a bot available that could add the current United States Census Bureau population estimates? (Unfortunately, I wouldn't trust OCR for a lot of the older Census files, because I often have to look carefully/zoom myself to tell 3 from 8 or 6 from 0.) It should be a fairly straightforward task. The Census updates can be found at census.gov/popest. I am in the process of adding data manually (mostly using an AWK script on my computer to format data from a spreadsheet for copy/paste into Wikipedia), and for that I'm okay, since it gives me a chance to do spot edits on those pages as well and to make sure that adding the USCensusPop template doesn't completely screw up the formatting of the page, but it's not something I could do every year.
Specifically, it could check to see if a page for a place has a Template:USCensusPop, and then if so, just update. Very simple. I'd write it myself, but it would be nicer if somebody either has code I can reuse or if they could do it all themselves. Thanks. DemocraticLuntz (talk) 23:30, 21 June 2015 (UTC)
Removal of {{Start date}} from {{Singles}} template
It has become common practice in album articles to use {{Start date}} in the {{Singles}} add-on to {{Infobox album}}. Per Template:Start date/doc: "The purpose of the {{start date}} template is to return the date (or date-time) that an event or entity started or was created. It also includes a duplicate, machine-readable date (or date-time) in the ISO date format (which is hidden by CSS), for use inside other templates (or table rows) which emit microformats. It should only be used once in each such template and should not be used outside such templates." i.e. {{Start date}} should only be used in album articles for the album release date, not single release dates. It would be nice to have a bot to clean this up, as this error is currently in who knows how many articles. Chase (talk | contributions) 16:44, 5 July 2015 (UTC)
- While we're at it, the bot that would do this should also remove {{Start date}} from AltDate in {{Episode list}}. nyuszika7h (talk) 19:32, 5 September 2015 (UTC)
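The cleanup step is a template-to-plain-date substitution; a sketch, handling only the simple three-parameter form of {{Start date}} (|df=y, partial dates, and restricting the edit to the {{Singles}} block would need extra handling):

```python
import re

MONTHS = ("January", "February", "March", "April", "May", "June",
          "July", "August", "September", "October", "November", "December")

# {{Start date|1999|7|5}} and {{start date|1999|7|5}} forms only.
START_DATE = re.compile(r"\{\{[Ss]tart date\|(\d{4})\|(\d{1,2})\|(\d{1,2})\}\}")

def plain_date(wikitext):
    """Replace {{Start date|Y|M|D}} with a plain "Month D, Y" date."""
    def repl(m):
        y, mo, d = m.groups()
        return "%s %d, %s" % (MONTHS[int(mo) - 1], int(d), y)
    return START_DATE.sub(repl, wikitext)
```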
DOI bot
Given a reference in one of the forms
"<ref>doi:10.[four digits]/*</ref>" "<ref>http://www.doi.org/10.[four digits]/*</ref>" or "<ref>www.doi.org/10.[four digits]/*</ref>",
the bot should insert the full reference into the article page and into Wikidata. It might be extended to add data to existing references that are, say, missing the date of publication.
See:
HLHJ (talk) 12:28, 8 July 2015 (UTC)
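The three reference forms above can be matched with one regular expression; a sketch of the DOI extraction step (filling in the full citation would then go through a metadata lookup, e.g. against the Crossref API, not shown here):

```python
import re

# Matches the three bare-DOI reference forms given above and
# captures the DOI itself.
BARE_DOI_REF = re.compile(
    r"<ref>\s*(?:doi:|(?:https?://)?www\.doi\.org/)"
    r"(10\.\d{4}/\S+?)\s*</ref>",
    re.IGNORECASE)

def bare_dois(wikitext):
    """Return the DOIs found in bare references."""
    return BARE_DOI_REF.findall(wikitext)
```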
- See User talk:Citation bot/Archive1#Replacement citation bot? and the immediately preceding section. --Izno (talk) 13:09, 8 July 2015 (UTC)
- That discussion does not appear to be leading to getting a bot to start working on the Cite Doi templates. Abductive (reasoning) 19:13, 8 July 2015 (UTC)
- The bot in question preempts the need for doing so (were it turned on). Inserting {{cite journal|doi=value}} and having the bot fill in the other data is what the bot does (or did, with {{cite doi}}). --Izno (talk) 19:48, 8 July 2015 (UTC)
- As for Wikidata, I'm not sure of your intentions, so you will need to clarify. Regardless, that bot would need to be approved at Wikidata, not here. --Izno (talk) 19:49, 8 July 2015 (UTC)
- That sounds good, and would do half my request. I hope it's back soon.
- Apologies for the lack of clarity. Wikidata has a data format for journal sources, but there is currently no way to create items from citation templates. See this discussion. There are tools for doing it from a DOI; see the tools section here. It seemed to me that co-ordination between bots working on both might be helpful in avoiding duplicates, etc., but I take your point that separate bots might be easier. HLHJ (talk) 14:40, 16 July 2015 (UTC)
For the record, we have a user tool that can be used to derive {{Cite journal}} from DOIs. Having a bot that can autoexpand DOIs to full citations would be useful. Maybe one could reuse the {{Cite doi}} template for it; the bot would convert it to a {{Cite journal}}. Jo-Jo Eumerus (talk, contributions) 19:01, 29 August 2015 (UTC)
Redirects to academic journals may lack WPJournals template (class=redirect) in their talk pages
Would it be possible to check, for each page with Template:WikiProject Academic Journals in its talk page, if its redirects also have Template:WikiProject Academic Journals (class=redirect) in their respective talk pages? Thanks! Fgnievinski (talk) 20:52, 18 July 2015 (UTC)
- Idea is not well explained.—cyberpowerChat:Online 20:14, 27 August 2015 (UTC)
- @C678: sorry, here's an example: Proceedings of the National Academy of Sciences of the United States of America has several redirects [1], some of which are correctly tagged with {{WPJournals|class=redirect}} in their respective talk pages (e.g., Talk:PNAS), while others are either blank or redirect to the target's talk page (e.g., Talk:Proc Nat Acad Sci). fgnievinski (talk) 22:47, 28 August 2015 (UTC)
Using Infobox journal's language field to populate Category:Academic journals by language sub-categories
Could a bot please inspect values entered in field "language" of Infobox journal? Then possibly populate individual sub-categories of Category:Academic journals by language (as per WP:JWG). Thanks! Fgnievinski (talk) 02:14, 30 July 2015 (UTC)
- @Fgnievinski: do you mean check they are valid, or add individual articles about journals to the category? Mdann52 (talk) 16:19, 28 August 2015 (UTC)
- @Mdann52: the latter, please; thanks for looking into this. fgnievinski (talk) 19:03, 28 August 2015 (UTC)
- @Fgnievinski: in that case, this is beyond my capability, I will leave this for someone else to take a look at. Mdann52 (talk) 19:07, 28 August 2015 (UTC)
- @Mdann52: a listing of inconsistencies between category membership and infobox language field would be a great start; then one could manually fix as appropriate. fgnievinski (talk) 19:15, 28 August 2015 (UTC)
- Actually, even a reverse listing of transclusions by language field value would be helpful (e.g., English, French, etc.); it doesn't even need to break up sub-values (e.g., "English", "French", and even a combined "English and French" would be fine). fgnievinski (talk) 00:12, 10 September 2015 (UTC)
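Splitting the |language= field into per-language category names is mostly string handling; a sketch, with the caveat that the exact category naming convention is an assumption to be checked against WP:JWG:

```python
import re

def language_categories(language_field):
    """Split an Infobox journal |language= value into per-language
    category names, assuming the "Category:<Language>-language
    journals" naming convention."""
    # Handle "English and French", "English, French", "English; French".
    langs = [l.strip() for l in re.split(r",|;| and ", language_field)
             if l.strip()]
    return ["Category:%s-language journals" % l for l in langs]
```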
Container category diffusion
Category:Container categories, by the current definition of the notice box, allows only subcategories, no other pages. If possible, I think it would help the maintenance process if a bot could check container categories for pages, and if found, check if they are already categorized in a subcategory of the container category being checked. If they are, remove them from the container category, referencing the subcategory and WP:SUBCAT in the edit summary. --Slivicon (talk) 14:57, 3 August 2015 (UTC)
I also think such a bot is required -- Pankaj Jain Capankajsmilyo 13:27, 1 September 2015 (UTC)
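The check described above is a set comparison once a page's categories and the container's subcategories are known (fetching both via the API is assumed and not shown):

```python
def misplaced_in_container(page_categories, container, container_subcats):
    """True when a page sits directly in a container category while
    already being categorized in one of its subcategories, i.e. it
    can be removed from the container per WP:SUBCAT."""
    return (container in page_categories
            and any(sub in page_categories for sub in container_subcats))
```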
Cleanup of "naked" Google Books?
Right now there are approximately 2500 pages with "naked" Google Books entries (defined as containing the string >http://books.google.com/ ). I asked on Wikipedia talk:AutoWikiBrowser whether there was any way to combine AWB and the Wikipedia citation tool for Google Books at http://reftag.appspot.com/ to help clean these up, and got a response to ask here. Would this be appropriate for a bot? It *may* also be appropriate to include <ref>[http://books.google.com/... text]</ref> cleanup as well, but that would be a later request if the first makes sense. Naraht (talk) 17:05, 24 August 2015 (UTC)
- It would have to distinguish between reference and non-reference links, at any rate. IMO, changing the links in this way would be a net improvement of the wiki. Jo-Jo Eumerus (talk, contributions) 17:20, 24 August 2015 (UTC)
- OP: Absolutely. And given the small (but not non-existent) crossover between those users who would use named refs and those who would put a "naked" Google Book in a ref, I would be *quite* happy to limit this at the start to something like the regexp <ref>http://books.google.com/[^ <]*</ref> Naraht (talk) 18:07, 24 August 2015 (UTC)
If you're going down this line (and it sounds worthwhile) I strongly recommend you set the bot to run slowly, so you give people a chance to notice and then feed back on any errors before they're reproduced on multiple pages that may not be actively Watchlisted. --Dweller (talk) 08:49, 25 August 2015 (UTC)
- @Naraht: Creating an appropriate cite tag requires human intelligence and eyeballs to correct mistakes. This is a big no-no for bots. I see a case for a bot (or report) that lists all pages that have at least one naked books reference and allows people to work the report/backlog by sorting the cite tag out. Hasteur (talk) 12:34, 25 August 2015 (UTC)
- Support - I support a bot to be made that fixes these issues.--BabbaQ (talk) 23:41, 26 September 2015 (UTC)
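The detection half (which even the report-only approach needs) is a one-line pattern; a sketch of a tightened version of the regexp proposed above, matching references whose entire body is a bare Google Books URL:

```python
import re

# A reference whose whole body is a bare Google Books URL,
# per the pattern proposed in this thread.
NAKED_GBOOKS = re.compile(
    r"<ref>\s*https?://books\.google\.com/[^ <]*\s*</ref>")

def naked_gbooks_refs(wikitext):
    """Return the bare Google Books references found in the text."""
    return NAKED_GBOOKS.findall(wikitext)
```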
Update the lists at WikiProject Fix common mistakes
We really need a bot that updates the lists at Wikipedia:WikiProject Fix common mistakes#Log. Right now they are being done manually, which is a very tedious process. Some of the entries have not been updated since November of 2014 and there are a bunch of errors that we haven't added because we can't keep up with the ones we list now.
Note: This was brought up before at Wikipedia:Bot requests/Archive 63#Bot to updated lists at WikiProject Fix common mistakes, where it was marked as resolved and archived, despite the fact that we are still doing this by hand. --Guy Macon (talk) 18:40, 29 August 2015 (UTC)
Monitor and circulate old unanswered questions
Talkpages have disadvantages, not least that many are unwatched so posting a query on them doesn't always get a response from someone who knows about the subject. But we could greatly reduce this with a bot.
If someone posts on an unwatched talkpage, we risk having their query linger unnoticed. Would it be possible to have a bot deliver lists of open talkpage queries to the relevant WikiProjects, with a special list for "talkpage queries on pages not tagged for any WikiProject"? I'm sure we could get volunteers to go through the default list and either answer queries or tag those pages for relevant WikiProjects, so as hopefully to bring each query to the attention of someone who could answer it.
It would need to ignore threads marked {{done}}, and ideally the reports to each WikiProject should be colour-coded and date-sequenced, so you could differentiate between discussions that more than one editor had participated in and talkpage sections where only one editor had posted. For talkpages tagged to multiple WikiProjects, it would probably help if we also had a template that marked a section as of interest (or not) to particular WikiProjects. Someone from WikiProject Mountaineering could then go through the relevant talkpage threads, answer those they could, and tag those that were about glaciation, vulcanology or botany, so that the bot would know that, while the article on that mountain was tagged to several WikiProjects including mountaineering, that particular thread with a question about the most recent eruption was for WikiProject Vulcanology. ϢereSpielChequers 11:12, 7 September 2015 (UTC)
- It's been my understanding that this is what the project tags accomplish on talk pages. You can click the links on those to find a more general audience for queries. ~ RobTalk 19:41, 7 September 2015 (UTC)
- Yes, but this would work the other way round, so anyone visiting a wikiProject page could easily see a list of open questions that are likely to be of interest to their WikiProject. The reason being that newbies post questions on talkpages and sometimes they linger till stale. ϢereSpielChequers 16:20, 8 September 2015 (UTC)
Replace stubs category with stub template
As of the September 2015 dump, there are over 2700 articles with a stub category (e.g. matching \[\[Category:[\w\s]+stubs\]\]). Could someone create a bot that would replace the stub category with the appropriate stub template? (If the stub template already exists on the article, just delete the stub category.)
For example, Antikristos contains Category:Folk dance stubs. The Category:Folk dance stubs page contains {{Stub Category|article=[[folk dance]]|newstub=folk-dance-stub|category=Folk dance}}. Therefore, the bot would:
- Delete the stub category
- Look at the value in |newstub=
- If the article does not contain {{folk-dance-stub}}, add {{folk-dance-stub}} at the bottom of the article
Thanks! GoingBatty (talk) 21:09, 7 September 2015 (UTC)
- Anyone interested in taking this on? GoingBatty (talk) 17:36, 3 October 2015 (UTC)
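The per-article edit described in the steps above can be sketched as a pure text transformation (resolving |newstub= from the category page and saving via the API are assumed and not shown):

```python
import re

def replace_stub_category(text, category, stub_template):
    """Remove [[Category:<category>]] and ensure {{<stub_template>}}
    is present, appending it at the bottom when missing.

    The category -> template mapping (e.g. "Folk dance stubs" ->
    "folk-dance-stub") comes from |newstub= on the category page.
    """
    cat_pattern = re.compile(r"\[\[Category:%s\]\]\n?" % re.escape(category))
    text = cat_pattern.sub("", text)
    if "{{%s}}" % stub_template not in text:
        text = text.rstrip("\n") + "\n\n{{%s}}\n" % stub_template
    return text
```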
Mexican digital television stations
On September 24, 2015, some Mexican TV stations will become all-digital. This implies a change in callsigns, but most Mexican TV station links are redirects, not articles.
As such, some redirects need to be moved to new locations and references to them changed out. For instance, XHBAB-TV must become XHBAB-TDT. A bot to make these moves would be very helpful, and the code will be vital when more than 600 stations do this on December 31. There are enough references that all of them can be changed and the old -TV suffixes can be removed.
Note that some stations have their own articles, and those will be manually moved and updated.
The stations that are redirects and to be moved are:
- XHAFC
- XHBAB
- XHBTB
- XHGWT
- XHMOY
- XHOPMT
- XHGNB
- XHSIB
- XHSRB
- XHVEL
I will likely need to make one or two more requests, and then on December 31 we will need to have a massive blitz of some 600 of these, so having reusable code is a must for my sanity. Raymie (t • c) 21:37, 8 September 2015 (UTC)
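The title mapping for the moves is a suffix swap, which makes the code trivially reusable for the December batch; a sketch (the page moves and link updates themselves would go through the API, e.g. via pywikibot):

```python
def new_callsign(title):
    """Map an analogue callsign title to its digital equivalent:
    "XHBAB-TV" -> "XHBAB-TDT". Titles without the -TV suffix are
    returned unchanged."""
    return title[:-3] + "-TDT" if title.endswith("-TV") else title
```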
- @Raymie: So just to make sure, you want all redirects with the prefixes listed above with the "TV" suffix to be changed to the same prefix with the "TDT" suffix? -24Talk 20:15, 17 September 2015 (UTC)
- @Negative24: That would be correct. The actual redirects need to be moved and the links to them need to be modified too. And I'll want to be able to do it again in December with 600+ of them. Raymie (t • c) 02:38, 18 September 2015 (UTC)
- @Raymie: Alright, I will see what I can do but this would be the first "real" task for User:Bot24 so it might be a tiny bit rough from the beginning. -24Talk 02:43, 18 September 2015 (UTC)
- @Negative24: That would be correct. The actual redirects need to be moved and the links to them need to be modified too. And I'll want to be able to do it again in December with 600+ of them. Raymie (t • c) 02:38, 18 September 2015 (UTC)
Texas Historical Commission atlas has changed information links
I'm not knowledgeable enough to know if this is possible to correct with a bot, but a considerable number of articles are affected by this. These atlas links have been used for NRHP citations, as well as other historical marker citations.
The home for the Texas Historical Commission atlas URL remains the same: http://atlas.thc.state.tx.us/
However, once you access information, those links have changed. Whatever is linked to THC as a source in articles is now a dead link. I just made a recent change to an article; you can see by the diff how it's been changed. — Maile (talk) 22:25, 8 September 2015 (UTC)
- Copied from Wikipedia:Village pump (technical)/Archive 140#Texas Historical Commission atlas has changed information links:
- Special:LinkSearch finds 718 links to http://atlas.thc.state.tx.us. The count includes all namespaces and cases with multiple links on the same page. There are around 370 different articles. http://atlas.thc.state.tx.us currently says: "Welcome to the new Atlas! The original Atlas, now located at http://atlas1.thc.state.tx.us, will eventually be phased out in the coming weeks. Please begin transitioning your use to the new Atlas." The links I examined work if atlas is replaced by atlas1 but it sounds like this is temporary. It would be good to find and update to new atlas url's while the old content can be seen at atlas1 (not all url changes are of the same form). PrimeHunter (talk) 22:40, 8 September 2015 (UTC)
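The interim fix PrimeHunter describes is a host substitution; a sketch, with the caveat from the thread that atlas1 is temporary, so links should eventually be re-resolved against the new Atlas rather than just rehosted:

```python
def update_atlas_link(url):
    """Point an old-Atlas link at the atlas1 host, which still
    serves the original content (stopgap only)."""
    return url.replace("http://atlas.thc.state.tx.us",
                       "http://atlas1.thc.state.tx.us", 1)
```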
findarticles.com
Mark all links to findarticles.com as dead. The links are being redirected to another website; however, they are not marked as 404 or soft 404. That includes links to https://web.archive.org/web/$1/findarticles.com etc., which have been deleted retroactively from the archives. Examples:
- http://findarticles.com/p/articles/mi_m1058/is_1998_Nov_18/ai_53365282
- http://web.archive.org/web/20050922160253/http://www.findarticles.com/p/articles/mi_m1058/is_1998_Nov_18/ai_53365282
For more detailed reasoning, see this on my blog and FindArticles. (t) Josve05a (c) 08:25, 30 September 2015 (UTC)
- insource:findarticles.com: 16,609 mainspace matches. (t) Josve05a (c) 08:31, 30 September 2015 (UTC)
- Josve05a, so that I understand the request correctly (and to help guide the answer), what you're looking for is: for every occurrence where the pattern findarticles.com appears inside a ref block (i.e. regex 'ref>.*?findarticles\.com.*?</ref'), append a {{deadlink}} template (with appropriate year/month for categorization) just inside the close of the reference tag. Is this correct? Hasteur (talk) 18:36, 30 September 2015 (UTC)
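Hasteur's reading of the request can be sketched as follows ({{dead link|date=...}} is used here; whether to write {{deadlink}} or {{dead link}} is a template-redirect detail for the operator to settle):

```python
import re

# A reference block; non-greedy so each <ref>...</ref> is handled
# separately. Self-closing <ref name=x /> tags are excluded.
REF_BLOCK = re.compile(r"<ref[^>/]*>(.*?)</ref>", re.DOTALL)

def tag_dead_findarticles(wikitext, date="October 2015"):
    """Append {{dead link}} just inside every reference that cites
    findarticles.com, skipping refs already tagged."""
    def repl(m):
        body = m.group(1)
        if "findarticles.com" in body and "{{dead link" not in body.lower():
            return m.group(0).replace(
                "</ref>", "{{dead link|date=%s}}</ref>" % date)
        return m.group(0)
    return REF_BLOCK.sub(repl, wikitext)
```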
Transclusion of daily Copyright problems subpages onto the main WP:CP page
(Repeat of my posting of 27 February) Would some kind bot take this on? The subpage name is of the form Wikipedia:Copyright problems/2015 February 26; it needs to be added to Wikipedia:Copyright problems after seven days; i.e., the page for 19 February is added at midnight on 26 February. It's being done manually at the moment, would be good if it could be automated. Thanks, Justlettersandnumbers (talk) 20:06, 11 October 2015 (UTC)
- Also it'd be really good if a bot could be asked to create the daily listing subpages of the copyright problems page, such as Wikipedia:Copyright problems/2015 October 11. I'm doing them manually at the moment, but I'd rather do other things. This should be a no-brainer for a bot. Thanks, Justlettersandnumbers (talk) 20:06, 11 October 2015 (UTC)
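The transclusion schedule above is pure date arithmetic; a sketch of the naming logic (the actual page edit at midnight would be the bot's job):

```python
import datetime

def subpage_to_transclude(today):
    """Name of the daily subpage that becomes due for transclusion
    on WP:CP at midnight: the page from seven days earlier, e.g. the
    19 February page is added on 26 February."""
    due = today - datetime.timedelta(days=7)
    return "Wikipedia:Copyright problems/%d %s %d" % (
        due.year, due.strftime("%B"), due.day)
```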
BC births and deaths categorizations
RfC: BC births and deaths categorization scheme has just been closed on:
(option 5:) Return to earlier guideline-conforming scheme adding "rollup" categories by decade/century
Could we have bot-assistance on realising that? Pinging a few people that may be able to give some assistance:
- @Fayenatic london: may have some experience as to what can be handled (semi-)bot-wise at the end of categorisation discussions
- @Rick Block: seems to have some experience with the "roll-up" systems
- @Good Olfactory: commented in a prior discussion here
If I need to be more specific on possible tasks involved, please ask me. --Francis Schonken (talk) 17:18, 14 October 2015 (UTC)
- The "roll-up" on decade categories, as currently seen at Category:0s deaths, is simply done using <categorytree mode=pages>0s deaths</categorytree> on that page. The parameter in the middle of that string has to match the name of the page that it is on. There is a way to show an ordinary category tree using the PAGENAME parameter: {{#categorytree:{{PAGENAME}}}}. However, I do not know of a way to combine that with
mode=pages
. For more info see MW:Extension:CategoryTree. So AFAIK this "rollup" code will have to be added manually. - The old categories will have to be undeleted by admins; I don't know a way to automate that. After undeletion, we would then list them at WP:CFDWR so that Cydebot would remove the CFD templates from them.
- I believe the member pages (biography articles) will also have to be reverted manually. The best that I can offer would be to provide links to the diffs made by Cydebot when emptying the old categories. – Fayenatic London 11:01, 15 October 2015 (UTC)
- @Armbrust:: I manually undeleted Category:1 BC deaths to Category:9 BC deaths. Would you be able to automate reversals of your bot's edits starting from [2]? See [3] for the instruction at CFDW for deaths from 1 to 599 BC. – Fayenatic London 21:56, 17 October 2015 (UTC)
- @Armbrust: I've manually reverted from the bottom of that page of contribs up to Curia (wife of Quintus Lucretius). Is it any trouble to you if we use rollback or undo on your bot's edits? – Fayenatic London 12:45, 19 October 2015 (UTC)
- I don't mind, although some articles were edited after the bot. Armbrust The Homunculus 19:51, 19 October 2015 (UTC)
As the work cannot be processed by bot, I have listed the CFDs listing the births/deaths categories to be reinstated at WT:WikiProject Years#BC births and deaths categories. – Fayenatic London 13:50, 20 October 2015 (UTC)
- I subsequently moved the list and progress marker to Wikipedia talk:WikiProject Biography#BC births and deaths categories. – Fayenatic London 21:36, 26 October 2015 (UTC)
- Re. "As the work cannot be processed by bot" – says who? I think part of the tasks can be processed by bot. I'd prefer to keep the discussion here (various bot operators may pick up on tasks for which they see a possibility to automate it), with a possible exception to logging tasks performed at WT:WikiProject Years#BC births and deaths categories. --Francis Schonken (talk) 15:31, 20 October 2015 (UTC)
- @Fayenatic london: again, please discuss these issues here. --Francis Schonken (talk) 13:39, 26 October 2015 (UTC)
- Your confidence in bot-kind is touching. I agree that this task would be best handled by a bot, but I have never come across an existing bot written to do what is required here. Well, I suppose there is little harm in waiting longer; perhaps somebody may write a new bot for us. The main disadvantage of waiting is that subsequent edits to the biographies will mean that an increasing proportion of the bot edits cannot be reverted using Undo. – Fayenatic London 21:22, 26 October 2015 (UTC)
- Actually it could be done with AWB alone (replace year category with birthsyear cat and remove birthsdecade category), but compiling a list of affected articles is troublesome. Armbrust The Homunculus 08:29, 28 October 2015 (UTC)
- @Armbrust: I had thought about using Cat-a-lot to do that, but ruled that out, because a year category on a bio could be for births or for deaths. A human editor could tell which, by referring to the decade categories, but that would probably be too difficult to program into a bot. So yes, it could be done using AWB, but requiring manual intervention on each one before clicking Save. – Fayenatic London 13:45, 28 October 2015 (UTC)
- If you use the bot's contributions list to compile the articles, then this shouldn't be a problem. Armbrust The Homunculus 19:12, 28 October 2015 (UTC)
Century-item redirects
My request for someone to do this:
- For every page or category beginning with a cardinal number (e.g. 17th-, 21st-) century; or articles prefixed "List of..." matching that pattern:
- Create a redirect from the equivalent title, with no dash
- Create a redirect from the equivalent title, using words
- Create a redirect from the equivalent title, using words, with no dash
was marked as "not done - no wider discussion" and archived. What wider discussion is needed?
For example, for the existing Category:20th-century war artists, I just created:
- Category:20th century war artists
- Category:Twentieth-century war artists
- Category:Twentieth century war artists
Other examples matching the above pattern would include:
This might usefully be added to a list of monthly cleanup tasks, for new articles and categories matching the above pattern. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:27, 15 October 2015 (UTC)
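The transformation itself is mechanical; a sketch of it (Python, with a deliberately tiny ordinal table for illustration):

```python
import re

# Only a few entries for illustration; a real run would cover every
# century that actually occurs in titles.
ORDINAL_WORDS = {"17th": "Seventeenth", "20th": "Twentieth", "21st": "Twenty-first"}

def redirect_titles(title):
    """For e.g. "Category:20th-century war artists", return the three
    redirect titles: no dash, words, and words with no dash."""
    m = re.search(r"(\d+(?:st|nd|rd|th))-century", title)
    if not m or m.group(1) not in ORDINAL_WORDS:
        return []
    ordinal = m.group(1)
    word = ORDINAL_WORDS[ordinal]
    old = ordinal + "-century"
    return [title.replace(old, ordinal + " century"),
            title.replace(old, word + "-century"),
            title.replace(old, word + " century")]
```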
Deadlink Fixing
Maybe a bot that fixes a collection of dead links? The Ohio Historical Society once maintained a few thousand pages with a well-maintained naming convention, http://ohsweb.ohiohistory.org/ohpo/nr/details.aspx?refnum=XXXXXXXX (the Xs represent an eight-digit number), but they took down these pages a good while ago. Now that OHS has renamed itself to Ohio History Connection, it's put up a new website, and these pages are once again good, but with different URLs, http://nr.ohpo.org/Details.aspx?refnum=XXXXXXXX. Could a bot go around and perform replacements? The work should be easy, and manual fixes will take a lot of work for a human but should be easy for a bot, given the careful adherence to the naming convention. A few of these links have been correctly marked with {{dead link}}; it would also help if the bot were to remove that tag when it's present. Nyttend (talk) 21:45, 19 October 2015 (UTC)
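Since the refnum carries over unchanged, the rewrite is a single substitution; a hedged sketch (Python — real {{dead link}} placements vary, so the tag removal here only handles a template sitting immediately after the rewritten URL):

```python
import re

OLD_URL = re.compile(
    r"http://ohsweb\.ohiohistory\.org/ohpo/nr/details\.aspx\?refnum=(\d{8})")

def fix_ohs_link(wikitext):
    # Move the eight-digit refnum onto the new host...
    out = OLD_URL.sub(r"http://nr.ohpo.org/Details.aspx?refnum=\1", wikitext)
    # ...and drop a {{dead link}} tag right after the rewritten URL.
    out = re.sub(
        r"(nr\.ohpo\.org/Details\.aspx\?refnum=\d{8})\s*\{\{dead link[^}]*\}\}",
        r"\1", out, flags=re.IGNORECASE)
    return out
```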
- Then my bot should come by soon and replace the tagged ones with a wayback link.—cyberpowerTrick or Treat:Limited Access 02:54, 20 October 2015 (UTC)
- I'm confused: why would that help? Almost none of these are in archive.org (I've checked), and why would it be good in the first place for the bot to use an archive URL instead of the URL of a currently active page from the same source with the same content? Nyttend (talk) 04:42, 20 October 2015 (UTC)
- Sorry that was meant to be more of a general comment. Cyberbot II now attempts to attach wayback links to tagged links.—cyberpowerTrick or Treat:Online 15:56, 20 October 2015 (UTC)
Http->Https for Newspapers.com links
Hi all. If you didn't know, we have a substantial donation of accounts to WP:Newspapers.com as part of the Wikipedia Library partnership program. As part of the recent expansion of access to 300 accounts, our contact noted that they can no longer track referral traffic from Wikipedia, because of the change earlier in the year for all Wikipedia readers to be on Https (Https only communicates referrals to https, not http). We would like help changing http to https links for Newspapers.com, for several reasons:
- From the start, they have been one of our most used partners by volunteers, and they are very much willing to expand our editor access to include more editors as demand grows; we want to keep currying this good will.
- In part, this demand from editors is in response to their Open Access "Clipping" function (read more), which allows our editors to pull their sources out from behind the paywall on Newspapers.com. This particular case study has been part of our business case for other partners creating more open access options (for example WP:Newspaperarchive.com created the exact same feature as part of the development of our partnership, and we are using it to propose other reader-favorable access negotiations with other partners). Having good metrics for this case study from both the Wikipedia side and from the Newspapers.com analytics side, which includes referrer information, helps us make the argument to other publishers/databases.
- Https links are more secure for our readers who do click through to their project (even though Newspapers.com plans to redirect any traffic from Wikipedia to an https url, that redirect loses the referral information, which affects 1 and 2, and temporarily routes readers through an insecure server).
Could someone run a bot that substitutes http with https when it precedes newspapers.com? Our contact assures me that none of the links should break. A tool/bot that can substitute http with https links like this might be useful for a number of different TWL and WP:GLAM partnerships in the future: part of the business case for most partnerships is increased traffic, and many of our historical allies will be converting to https in the near future, if they haven't already (for example, JSTOR plans to). Thanks much from the Wikipedia Library team, Astinson (WMF) (talk) 15:15, 20 October 2015 (UTC)
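The substitution itself is trivial; something like this (Python sketch, matching the two URL shapes — with and without www — mentioned in the thread):

```python
import re

def upgrade_newspapers_links(wikitext):
    # Upgrade only newspapers.com links; every other http URL is untouched.
    return re.sub(r"http://(www\.)?newspapers\.com/",
                  r"https://\1newspapers.com/", wikitext)
```

Note that on Python 3.5+ an unmatched optional group in the replacement is substituted as an empty string, so bare newspapers.com links work too.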
- (Thanks Izno for pinging.) This is somewhat related to my request above. I will include
http://www.newspapers.com/
→https://www.newspapers.com/
in my AWB settings, but I think this could be done more efficiently by a bot. With Google Books links I also remove the link clutter on the fly, but this doesn't seem necessary for newspapers.com, or does it? --bender235 (talk) 16:16, 20 October 2015 (UTC)
- @Izno: Thanks! @Bender235: I had considered doing it semi-automatically with AWB, but it is such a simple conversion that a bot should do it. All the URLs follow one of two structures, so there shouldn't be any clutter, since we have been giving very clear recommendations with the Newspapers.com donation that they shouldn't be inconsistent. It would be great to have a bot (or a bot activated by a tool) that can help with these kinds of conversions, because I am sure there will be a myriad of requests in the next year or so from GLAMs, etc. Astinson (WMF) (talk) 18:48, 20 October 2015 (UTC)
- @Bender235: Saw the first 4000 links changed, thanks for doing it with AWB! Is there any chance we can update the rest of them in the next week or two, if no-one picks it up with a bot? We would love to be able to keep capturing accurate metrics data to our Newspapers.com partner, sooner rather than later. Astinson (WMF) (talk) 16:06, 22 October 2015 (UTC)
- Related is m:Research:Wikimedia referrer policy. Legoktm (talk) 18:44, 23 October 2015 (UTC)
Give out Deletion to Quality Awards
Is there a way a bot could give out WP:Deletion to Quality Awards ?
Here's what it would have to do:
- Check Category:Deletion to Quality Award candidates
- Find out who the FA, FL, or GAN nominator was.
- Place the corresponding Banner Award from Wikipedia:Deletion_to_Quality_Award#Banner_awards on their user talk page = linking to the article and the AFD page as the two parameters in those Banner Awards.
You can say, on behalf of Cirt and WP:Deletion to Quality Awards.
And also, any way a bot could update the "Hall of Fame" table at Wikipedia:Deletion_to_Quality_Award#Deletion_to_Quality_Award_Hall_of_Fame ?
Thoughts ?
Any help would be most appreciated,
— Cirt (talk) 05:04, 21 October 2015 (UTC)
- Note: Please note that a one-time-run would be totally acceptable. :) — Cirt (talk) 09:20, 21 October 2015 (UTC)
This report was last updated only in March 2014 but most (if not all) of the IP talk pages listed here are still blank. A bot should be used to apply the template {{OW}} to all the pages which were blanked to remove stale warnings. An easy criterion for identifying such pages is this: the last editor should have been User:BD2412 and the edit summary should have a link to WP:AWB. In all the pages that I checked at random, the page was blanked to remove the stale warnings and this was done by BD2412 using AWB. 103.6.159.89 (talk) 17:50, 23 October 2015 (UTC)
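The criterion above amounts to a simple predicate over page metadata; a sketch (Python — the argument names are illustrative, not any particular API):

```python
def qualifies_for_ow(page_text, last_editor, last_edit_summary):
    """True if the IP talk page looks like one of the AWB blankings
    described above and should receive {{OW}}."""
    return (page_text.strip() == ""
            and last_editor == "BD2412"
            and "WP:AWB" in last_edit_summary)
```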
- That was the practice at the time. How many of these are there now? bd2412 T 18:13, 23 October 2015 (UTC)
- Well, I did not accuse you of doing anything wrong, of course. I don't know how many are there. Nevertheless, I think this task should be done by a bot rather than through AWB because bot edits, if marked as minor, would not trigger the unnecessary "You have new messages" note at the IPs' end. 103.6.159.89 (talk) 19:07, 23 October 2015 (UTC)
- BD2412 Is there a reasonable consensus to do this? I could look at coding up a bot to do this. Hasteur (talk) 19:24, 23 October 2015 (UTC)
- Yes, but I'll have to find the discussions. I'm actually headed out right now, but will get back to the question tonight. Cheers! bd2412 T 20:10, 23 October 2015 (UTC)
- There have been lots of small discussions, e.g., Wikipedia_talk:Criteria_for_speedy_deletion/Archive_9#IP_talk_pages, Wikipedia:Bot_requests/Archive_50#Bot_to_remove_patently_stale_warnings_from_IP_talk_pages, Wikipedia:Village_pump_(proposals)/Archive_110#Bot_blank_and_template_really.2C_really.2C_really_old_IP_talk_pages. bd2412 T 00:55, 24 October 2015 (UTC)
- Template:OW is pretty terrible, in my opinion. Why not just delete the pages? User talk:67.173.42.13 is an easy example: that IP hasn't edited in over a decade. There's no good reason to indefinitely keep a templated warning from 10 years ago. --MZMcBride (talk) 18:22, 24 October 2015 (UTC)
- @BD2412 and Hasteur: BTW, there are 68 782 blank talk pages for IP users. You can see the list here. Sorry for non-wiki list, but Mediawiki didn't allow me to save the page :D --Edgars2007 (talk/contribs) 10:59, 29 October 2015 (UTC)
Section headers
A bot should change all h1s (surrounded by just one equal sign on each side) to h2s, and all h2s, h3s, h4s, and h5s under an h1 to h3s, h4s, h5s, and h6s respectively. H6s cannot be changed into h7s because h7s do not work; instead, they would be treated as h6s where the section header name begins and ends with an equal sign. GeoffreyT2000 (talk) 16:02, 24 October 2015 (UTC)
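A naive sketch of the requested mapping (Python; it demotes every well-formed heading by one level whenever the page contains an h1, capping at h6 — a blind approach that misfires if the h1 is itself the error):

```python
import re

HEADING = re.compile(r"^(=+)\s*(.*?)\s*(=+)\s*$")

def demote_headings(wikitext):
    lines = wikitext.split("\n")
    def level(line):
        m = HEADING.match(line)
        return len(m.group(1)) if m and m.group(1) == m.group(3) else None
    if not any(level(l) == 1 for l in lines):
        return wikitext  # no h1: nothing to do
    out = []
    for l in lines:
        lvl = level(l)
        if lvl is None:
            out.append(l)
        else:
            new = min(lvl + 1, 6)  # h7 does not exist, so cap at h6
            out.append("=" * new + " " + HEADING.match(l).group(2) + " " + "=" * new)
    return "\n".join(out)
```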
- I don't think this can be done with a bot. What if you had a header structure like this:
- h2
- h2
- h1
- h2
- h2
- h2
- in which someone has erroneously inserted an h1 into an otherwise well-formatted set of h2s? You would want to change that h1 to an h2 without changing the h2 below it.
- If h1 is not allowed in articles, a bot might be able to tag or make a list of articles that are afflicted with h1s so that editors could work from a list of those malformed articles. – Jonesey95 (talk) 17:48, 24 October 2015 (UTC)
- Could a bot at least look for that and post a list of articles with that issue? Frankly, it may be something for a cleanup bot or as part of an AWB check or something. -- Ricky81682 (talk) 19:55, 26 October 2015 (UTC)
- Are you seeing articles with this problem right now? Checkwiki task 19 already looks for lines that start with a single "=" character, and its most recent report shows no articles with this error. – Jonesey95 (talk) 21:50, 26 October 2015 (UTC)
Draft articles without an AFC banner
There used to be a category (and a bot that forced articles into the category) that kept track of Draft-class articles without an AFC submission banner of any type. I've also seen some lost into the ether because the submit substitution was screwed up somehow. Could a bot create a list of all draft-space articles without a call to template:AFC submission? Depending on the volume created, this may be worth doing regularly (monthly?) as a backlog at Wikipedia:WikiProject Articles for creation or something. -- Ricky81682 (talk) 19:53, 26 October 2015 (UTC)
- What is the rationale for such a category? I thought Draft space was, by definition, a place where people could work on drafts of articles before submitting them to the AFC process or moving them to article space. Is there a requirement that Draft articles have certain tags? I must be missing something. – Jonesey95 (talk) 20:00, 26 October 2015 (UTC)
- You're correct. I worded this wrong for what I'm looking for. -- Ricky81682 (talk) 21:15, 26 October 2015 (UTC)
- Not done Ricky81682 Draft namespace was created for the purpose of a unified draft location that anybody could work on from Abandoned Drafts and Articles for Creation. In the formative discussions, there was a suggestion of putting the non-AFC pages in draft space in some sort of categorization scheme so that we could track drafts that were sitting out there (in supposed WP:WEBHOST violation) never being improved that was struck down. I believe this is not the first or second time I've explained this difference. Before this goes any further can you please look into proposing a RFC at WT:DRAFTS or at WP:VPR to secure that there is a consensus to do this? I doubt there is a consensus to do this, but if a bot is to do this, there needs to be an ironclad consensus for it. I see that you've asked before (1) and didn't get an answer you wanted. Hasteur (talk) 20:22, 26 October 2015 (UTC)
- Yes, and I agree that a mass move would be improper. I was actually looking for old drafts and knew that the lack of header would simplify it. How about a request for all draftspace articles that have not been edited in, say, two years? I don't know how the API works, but I guess asking for the lack of header would be an extra computing call. It would not be G13 eligible because of the lack of header. What has happened is that I found an old user, listed the junk for deletion, found a possibly useful old stale draft (say User:World Cinema Writer/National Treasure 3) and 'adopted' it, moved that to Draft:National Treasure 3 and added a new banner so that it's checked once in a while. What if I found an old draft article and wanted to work on it the same way? That, or take them to MFD in bulk, I guess. I think seeing two-year-old stale drafts would be perfect for the Abandoned drafts project to work on. -- Ricky81682 (talk) 21:15, 26 October 2015 (UTC)
- I'm sure that there is a good idea in there somewhere. If you take it to WT:DRAFTS, you'll get some help refining and defining the need. Once that need is defined, you can bring a request back here for implementation. I imagine that it wouldn't be hard, for example, for a bot to tag Draft articles that had not been edited in a while (except by bots), and then automatically remove that tag after a human editor makes a change to the page. That should not be discussed on this page, however. – Jonesey95 (talk) 21:44, 26 October 2015 (UTC)
- I'm just asking for a list right now, not necessarily a category. I think this is the place to ask for something like that. A list can be checked by humans then. I'll see there too. If there's interest, it may be a bot task to review periodically. Thanks. - Ricky81682 (talk) 04:03, 27 October 2015 (UTC)
(←) See this Quarry. Assuming my SQL is right, there are around 1026 draft pages that have not been edited in the past six months. Most of these look like test pages, vandalism, or WP:WEBHOST violations. I even just deleted an attack page. Furthermore, nearly all that I've checked have fewer than 5 edits made to them. I suppose the lack of articles makes sense, as many content creators would have instead found their way into the draftspace via article creation links, which insert an AfC template. Either way it looks like there's a lot of stuff to review here. I can make a tool that makes it easier to interact with this data — MusikAnimal talk 05:57, 27 October 2015 (UTC)
- Just updated the Quarry to also show the page length in bytes. If you put that in descending order you're more likely to find articles. There are a fair amount, it turns out. — MusikAnimal talk 06:09, 27 October 2015 (UTC)
Ricky81682 (RE to 21:15, 26 Oct 2015 UTC) I would not bulk-MFD them, as the argument you're using ("that they're stale and haven't been touched") was rejected multiple times for non-AFC draftspace pages. I would strenuously suggest you go round up a consensus at WT:Drafts prior to nominating for MFD. Getting the consensus also has the side benefit of stirring the community up to support your MFD nominations. Once you can satisfy the CSD requirements (Objective, Uncontestable, Frequent, Non-redundant) there'll be a wonderful case for using CSD to vaporize the poor drafts. Hasteur (talk) 14:26, 27 October 2015 (UTC)
- I have no intention to. Please give me some credit here. I've brought this up at WP:DRAFTS and Abandoned Drafts, as I'd rather it be done as an Abandoned Drafts backlog to work on; something to give that project some push, I think. Although 1000 pages is nothing compared to the 49k backlog of old userspace drafts. -- Ricky81682 (talk) 21:38, 27 October 2015 (UTC)
Remove Persondata
Persondata was deprecated by this RfC, which closed on 26 May this year and included consensus to remove Persondata from Wikipedia.
An earlier request for a bot to undertake this task was closed on 7 September, with the comment:
a discussion about a bot operation of this magnitude needs to be held in a broader forum, with more participants and a more focused discussion
This has now taken place, and the second RfC has just been closed with the comment:
There is consensus to have a bot remove all the persondata from all the articles.
Please can we now have a bot to do this? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:24, 27 October 2015 (UTC)
- Andy Mabbett does this mean that Wikipedia:Bots/Requests for approval/Yobot 23 can be resumed? -- Magioladitis (talk) 05:18, 28 October 2015 (UTC)
- I assume you meant Wikipedia:Bots/Requests for approval/Yobot 24? — Earwig talk 05:25, 28 October 2015 (UTC)
- Yes. Fixed. -- Magioladitis (talk) 11:55, 28 October 2015 (UTC)
- @Magioladitis: Yes. Please do. We now have two RfCs that have found consensus to remove Persondata. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:51, 28 October 2015 (UTC)
- What Andy fails to mention is the rest of the close, which I think is rather important:
"As a side note there is common sense coming from the minority and even some supports. That the removal be done in steps and that moving what has not been moved to Wikidata to some other place so that it can be done at a later date is a good plan that may save time of less informed editors. But there can not be said to be consensus for this, though I see no opposes."
Yobot 24's denial reason is probably just as applicable to this recent RFC, which established no time period for removal beyond the quoted material. I'm happy to help iron out myself how a bot or set of bot tasks should take care of this, but Andy's pointy effort to remove this template immediately is not doing him any favors, and I would echo Guy's previous comment to him about this template were it not that I think that Dirtlawyer1 had already done so enough times…. --Izno (talk) 11:36, 28 October 2015 (UTC)
- Oh FFS. How much longer are people going to Wikilawyer this? The text you cite concludes "there can not be said to be consensus for this...". It is important only in that it gives a green light to proceed immediately with the removal of Persondata. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:48, 28 October 2015 (UTC)
- You didn't answer the question you thought you did in the RFC. Please review--the question you asked and the majority if not entirety of the discussion centered on "should it be bot-removed" and not "should it be bot-removed now". Wikilawyer? No, certainly not. --Izno (talk) 14:03, 28 October 2015 (UTC)
- @GoingBatty: because I think you probably did the most in the most-recent discussion to move this forward sensibly. --Izno (talk) 11:38, 28 October 2015 (UTC)
- @Izno: On one end of the spectrum, there are people who think that all {{Persondata}} templates should be deleted immediately. On the other end, there are those who think there is more work to be done to copy the data to Wikidata. As an attempt to find a compromise, I submitted a bot request to remove Persondata where ALL of the values are found elsewhere in the article, such as an infobox, the lead, and/or categories. However, the bot request was only approved to remove Persondata where it only contained
|NAME=
. I would be happy to resubmit my original bot request if there's a reasonable chance of it being approved. GoingBatty (talk) 16:53, 28 October 2015 (UTC)
- @GoingBatty: as far as I understand, the Wikidata guys do not need the Persondata info, so any concerns about transferring data to Wikidata are moot. Am I wrong? -- Magioladitis (talk) 16:59, 28 October 2015 (UTC)
- @Magioladitis: In previous conversations, there were those who said the Wikidata editors don't want the remaining Persondata, and others who thought there was still opportunity for manual (and possibly automated) copying. GoingBatty (talk) 00:57, 29 October 2015 (UTC)
- @GoingBatty: as far as I understand the Wikidata guys do not need the Persondata info so any concerns about transferring data to Wikidata are mute. Am I wrong? -- Magioladitis (talk) 16:59, 28 October 2015 (UTC)
- @T.seppelt: because I think [5] might be a useful and extensible method for preserving/moving the data that isn't checkable by a bot, which would help us remove the template. --Izno (talk) 11:46, 28 October 2015 (UTC)
- There is no need to "preserve" any data (it's already in article histories) and the recent RfC found no consensus to do so. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:48, 28 October 2015 (UTC)
- Your comments are false. See below. I'm happy to take this to ANI given your behavior if you don't back off on the plainly-sensible suggestions. --Izno (talk) 14:03, 28 October 2015 (UTC)
- Ooh goody, more dramah. Off you go... Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 17:26, 28 October 2015 (UTC)
- I can extend my tool. Please let me know when you reach consensus on this topic. Regards, -- T.seppelt (talk) 22:29, 28 October 2015 (UTC)
- @T.seppelt: Thank you. Consensus has been reached, in not one, but two RfCs. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 23:36, 28 October 2015 (UTC)
- @Pigsonthewing, Izno, and Magioladitis: I have now read the whole discussion. What I can offer is the following: KasparBot goes through all pages which transclude {{Persondata}} and compares the information with the statements, labels, descriptions and aliases of the connected Wikidata item. Missing information is added to Wikidata. Afterwards, all the data that is also stored in Wikidata is removed from the article. Problems will be tracked in a special database which can be accessed using a tool I am going to develop. If no data remains in the article the whole template will be removed. This procedure is exactly the same as the one I am using for {{Authority control}}. What do you think? -- T.seppelt (talk) 20:13, 29 October 2015 (UTC)
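The procedure described above can be sketched as follows. This is a hypothetical illustration, not KasparBot's actual code; the field names, data shapes, and the function name `plan_migration` are assumptions for the sake of the example (a real bot would read the template from wikitext and the item via the Wikidata API):

```python
# Hypothetical sketch of the per-article comparison step described above:
# compare {{Persondata}} fields with what the connected Wikidata item
# already holds, decide what to copy upstream, what to track as a problem,
# and whether the template can be removed.

def plan_migration(persondata, item_values):
    """persondata:  dict of template field -> value from the article.
    item_values: dict of the same fields -> set of values already stored
                 on Wikidata (statements, labels, descriptions, aliases).

    Returns (to_wikidata, unresolved, template_removable)."""
    to_wikidata, unresolved = {}, {}
    for field, value in persondata.items():
        if not value:
            continue                    # empty field: nothing to preserve
        stored = item_values.get(field, set())
        if value in stored:
            continue                    # already on Wikidata; safe to drop
        if stored:
            unresolved[field] = value   # conflicting value: track, don't touch
        else:
            to_wikidata[field] = value  # missing upstream: a bot can add it
    # The template only goes once no field needs manual review.
    return to_wikidata, unresolved, not unresolved
```

With this logic the template is removed only when every non-empty field either matches Wikidata or has been copied there, and anything conflicting lands in the problem-tracking database instead.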
- @Magnus Manske: because you have experience with setting up neat tools for Wikidata--I'm not sure this would be exactly up your alley, but figured I'd ping you. --Izno (talk) 11:46, 28 October 2015 (UTC)
- All the data that can sensibly be transferred to Wikidata by a bot has already been transferred; that was discussed at length in the first RfC. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:48, 28 October 2015 (UTC)
@Izno: should AWB then be altered to remove Persondata as part of general fixes? -- Magioladitis (talk) 14:30, 28 October 2015 (UTC)
I'm not really sure why we have this category in the first place, but in an ongoing discussion at WP:AN it has been mentioned that there are large numbers of users listed here who are not in fact indefinitely blocked for violations of the username policy. We're talking about tens of thousands of pages in total, so it's pretty much never going to be fixed by humans. I know I use a script that automatically strikes out the usernames of blocked users, I assume the same type of coding could be used to scan this cat and remove anyone who shouldn't be in it? Beeblebrox (talk) 01:33, 29 October 2015 (UTC)
- User:Betacommand has a tool that lists all accounts that are not blocked locally in that category. Legoktm (talk) 01:41, 29 October 2015 (UTC)
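The scan described above amounts to a simple filter. A minimal sketch, with assumed data shapes (the function name `misplaced_members` and the `blocks` structure are hypothetical; a real tool would pull category members and block information from the MediaWiki API):

```python
# Hypothetical sketch of the category clean-up described above: list the
# members who are NOT indefinitely blocked for a username-policy violation,
# so they can be removed from the category. A real run would fetch this
# data from the MediaWiki API rather than take it as arguments.

def misplaced_members(category_members, blocks):
    """blocks maps a username to {'indef': bool, 'reason': str} for accounts
    blocked locally; accounts with no local block are simply absent."""
    misplaced = []
    for user in category_members:
        block = blocks.get(user)
        if block is None:                            # not blocked locally
            misplaced.append(user)
        elif not block['indef']:                     # blocked, but not indef
            misplaced.append(user)
        elif 'username' not in block['reason'].lower():
            misplaced.append(user)                   # indef, other reason
    return misplaced
```

Betacommand's tool reportedly covers the first case (accounts not blocked locally); the same pass could also flag the other two.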