
Wikipedia:Bots/Requests for approval


BAG member instructions

If you want to run a bot on the English Wikipedia, you must first get it approved. To do so, follow the instructions below to add a request. If you are not familiar with programming it may be a good idea to ask someone else to run a bot for you, rather than running your own.

 Instructions for bot operators

Current requests for approval

GreenC bot 5

Operator: GreenC (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 02:52, Tuesday, April 24, 2018 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): Awk

Source code available: GitHub

Function overview: Remove |accessdate= in CS1|2 templates that don't have a |url= but do have a value assigned to any of the various 'permanent-record' identifiers. Excluding templates {{cite web}}, {{cite podcast}}, and {{cite mailing list}}.

Links to relevant discussions (where appropriate): Help_talk:Citation_Style_1#Clearing Category Pages using citations with accessdate and no URL

Edit period(s): one-time run during first pass as standalone bot; then semi-continually as part of a module of WaybackMedic

Estimated number of pages affected: approx. 30,000

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): Yes

Function details: Of the Category:CS1 errors, the tracking category with the most entries is Category:Pages using citations with accessdate and no URL (43,719). There is no silver bullet solution to clearing the cat, so this will break it down by targeting a known type of problem within that category. There have been many discussions about it over the years.

The method is simple: if a cite template contains an |access-date= and a permanent-record identifier (e.g. |jstor= or |biorxiv=) which creates an external link (i.e. not an |isbn=), the assumption is that the |url= never existed or was removed (such as by an AWB script) when the citation was changed over to the permanent-record identifier. As such, the |access-date= should be removed, because it is only used when a |url= exists.

Because the bot will skip {{cite web}}, it is estimated to check about 30,000 articles; some unknown subset of those will contain this particular scenario/fix.
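
A minimal sketch of the rule described above, for illustration only: the actual bot is written in Awk as part of WaybackMedic, and the identifier list here is a partial, assumed example.

    import mwparserfromhell

    PERMANENT_IDS = ["jstor", "biorxiv", "doi", "pmc", "hdl"]     # identifiers that render external links
    EXCLUDED = {"cite web", "cite podcast", "cite mailing list"}  # templates the bot skips

    def strip_orphan_accessdate(wikitext):
        """Remove |access-date= from CS1|2 cites that have no |url= but do have a permanent-record id."""
        code = mwparserfromhell.parse(wikitext)
        for tpl in code.filter_templates():
            name = str(tpl.name).strip().lower()
            if not name.startswith("cite") or name in EXCLUDED:
                continue
            has_url = tpl.has("url") and str(tpl.get("url").value).strip()
            has_id = any(tpl.has(p) and str(tpl.get(p).value).strip() for p in PERMANENT_IDS)
            if not has_url and has_id:
                for param in ("access-date", "accessdate"):
                    if tpl.has(param):
                        tpl.remove(param)
        return str(code)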

Discussion

Yobot 59

Operator: Magioladitis (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 07:57, Tuesday, March 27, 2018 (UTC)

Automatic, Supervised, or Manual: Manual

Programming language(s): AWB / WPCleaner

Source code available:

Function overview: Fix ISBN mistakes (CHECKWIKI 69-73)

Links to relevant discussions (where appropriate):

Edit period(s): Daily

Estimated number of pages affected:

Namespace(s): Main

Exclusion compliant (Yes/No):

Function details:

Discussion

Fix common ISBN errors. CHECKWIKI 69-73 -- Magioladitis (talk) 07:57, 27 March 2018 (UTC)

Could you please provide some sample edits to show how the bot would fix each of these Checkwiki errors? Since magic links are going away someday, the bot should wrap ISBNs in the {{ISBN}} template. – Jonesey95 (talk) 12:54, 27 March 2018 (UTC)

Jonesey95 I will be editing manually from the bot's account. I will be searching for the correct ISBN in Google Books and other places and correcting the number. I will use AWB/WPCleaner only to load the list of pages faster and to be able to perform WP:GENFIXES. -- Magioladitis (talk) 13:43, 27 March 2018 (UTC)
#73 sample edit, #70 sample edit, #72 sample edit, #69 sample edit -- Magioladitis (talk) 13:51, 27 March 2018 (UTC)
Thank you. Sample edits for the other CW errors would be helpful. Magic links are going away. Please wrap ISBNs in the {{ISBN}} template. – Jonesey95 (talk) 14:35, 27 March 2018 (UTC)
Jonesey95 I will. I was already wrapping them. I am one of the supporters of the deprecation of the ISBN magic links. For error 69, I have a list of common ISBN misspellings at User:Magioladitis/ISBN, which I created in 2015; it is now partly incorporated into CHECKWIKI's detection logic. -- Magioladitis (talk) 16:08, 27 March 2018 (UTC)
Jonesey95 I added sample edits for all of them. -- Magioladitis (talk) 16:11, 27 March 2018 (UTC)
This feels like pulling teeth. I would like to see sample edits made by your proposed code, made by you, that result in an ISBN fix and wrapping the ISBN in the ISBN template. – Jonesey95 (talk) 16:31, 27 March 2018 (UTC)
Jonesey95 There is no additional code. Purely manual. I won't be adding ISBNs to the template; this is already done by my bot automatically. I am requesting to fix invalid ISBN numbers with correct ones. This is what errors 69-73 are about. For instance, in error 69 one can find errors like O (big o) instead of 0 (zero), etc. In this request, when I find an invalid ISBN I will replace it with the correct one, using the Google Books site, my private library or a local public library to find the correct ISBN and put it in place. -- Magioladitis (talk) 17:30, 27 March 2018 (UTC)
So wait, given the option between manually fixing the ISBN and putting it in the template, you'd prefer to only fix the ISBN and let the bot make a second edit to put it in a template? Primefac (talk) 17:35, 27 March 2018 (UTC)
Primefac The ISBN template issue has already been handled for every correct ISBN number by numerous bots. I will be fixing ISBN numbers inside the template, or if the invalid number is not in a template I will fix the number first and manually add the template if necessary. Completely manual work. -- Magioladitis (talk) 17:37, 27 March 2018 (UTC)

#69 sample fixes (one with template; one without) -- Magioladitis (talk) 17:39, 27 March 2018 (UTC)

#73 sample fix. -- Magioladitis (talk) 17:42, 27 March 2018 (UTC)

#72 sample fix. -- Magioladitis (talk) 17:43, 27 March 2018 (UTC)

#71 sample fix. -- Magioladitis (talk) 17:45, 27 March 2018 (UTC)

#70 sample fix. -- Magioladitis (talk) 17:45, 27 March 2018 (UTC)
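
For background on what these CHECKWIKI reports flag, the ISBN check-digit rules can be sketched as below. This is illustrative only, not code the operator proposes to run; the fixes above are manual.

    def valid_isbn10(isbn):
        digits = isbn.replace("-", "").replace(" ", "").upper()
        if len(digits) != 10:
            return False
        total = 0
        for i, ch in enumerate(digits):
            if ch == "X" and i == 9:
                value = 10                     # a trailing X stands for 10
            elif ch.isdigit():
                value = int(ch)
            else:
                return False                   # e.g. the letter O typed instead of a zero
            total += (10 - i) * value
        return total % 11 == 0

    def valid_isbn13(isbn):
        digits = isbn.replace("-", "").replace(" ", "")
        if len(digits) != 13 or not digits.isdigit():
            return False
        total = sum(int(d) * (1 if i % 2 == 0 else 3) for i, d in enumerate(digits))
        return total % 10 == 0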

Usernamekiran BOT 2

Operator: Usernamekiran (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 04:00, Wednesday, February 7, 2018 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): WP:AWB

Source code available: Yes.

Function overview: Insert a banner on the talk pages of pages that come under the scope of WikiProject Organized crime.

Links to relevant discussions (where appropriate):

Edit period(s): 3-4 times a week.

Estimated number of pages affected: around 2,000 to 3,000.

Namespace(s): article talk, category talk, file talk, template talk.

Exclusion compliant (Yes/No): No.

Function details: For the last few months, I have been using my other account (Usernamekiran (AWB)) to insert {{WikiProject Organized crime}} on talk pages that fall under the scope of the WikiProject. So far I have inserted the banner on thousands of pages; there have been no mistakes and nobody has objected yet. I can sort the targets properly (Wikipedia:WikiProject Organized crime/Bot tagging categories). The bot will not make any changes other than adding this banner, plus basic things like adding a banner shell if the banners exceed 3 and handling old PRODs.

Discussion

Is your only estimate that this will be 6,000 to 12,000 edits per week? — xaosflux Talk 02:10, 8 February 2018 (UTC)

@Xaosflux: No. At most, there are around 4 thousand pages remaining that need to be tagged with the organised crime banner. I think, using the bot, I will tag around 3 thousand pages. The ones that need human judgement will be done semi-automatically from the non-bot a/c. I meant that I will run the bot 3-4 times a week. If I get 1,000 pages done in one day/session, then the entire task will be done in around 4 days; if not, I will be using the bot 3-4 times a week. I think this task will take 2 weeks to finish. I apologise for the confusion. —usernamekiran(talk) 08:11, 8 February 2018 (UTC)

──────────────────────────────────────────────────────────────────────────────────────────────────── If necessary, I have been using this module since months. I mean, I created the module a long time ago, but I barely used it till the discussion with Primefac special:diff/824201270.

  public string ProcessArticle(string ArticleText, string ArticleTitle, int wikiNamespace, out string Summary, out bool Skip)
        {
            Regex header = new Regex(@"\{\{WikiProject Organized crime|{{WikiProject Organized Crime|{{WikiProject Fictional characters|{{Comicsproj|{{WikiProject Film|{{Film|{{WikiProject Video games|{{WikiProject Television|{{WPTV|{{WP Fictional|{{WikiProject Novels|{{WikiProject Anime|{{TelevisionWikiProject|{{WPFILM|{{WikiProject Songs|{{Songs|{{album|{{WikiProject Hip hop|{{WP film|{{WPBooks", RegexOptions.IgnoreCase);
            Summary = "Added banner for [[WP:WikiProject Organized Crime]]";
            Skip = (header.Match(ArticleText).Success || !Namespace.IsTalk(ArticleTitle));
            if (!Skip)
                ArticleText = "{{WikiProject Organized Crime}} \r" + ArticleText;
            return ArticleText;
        }

Also, I re-checked. I can't be sure about the exact number of pages to be tagged with the banner, but it appears to be more than 7,000. In the previous calculations, I had not included the terrorist organisations. —usernamekiran(talk) 13:27, 12 February 2018 (UTC)

{{BAG assistance needed}}

Bots in a trial period

Gabrielchihonglee-Bot 4

Operator: Gabrielchihonglee (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 14:28, Tuesday, January 16, 2018 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): Python (pywikibot)

Source code available: will be given after test run

Function overview: Change {{Cite web}} templates in pages to move ThePeerage's website from the "publisher" parameter to "website".

Links to relevant discussions (where appropriate): Wikipedia:Bot_requests#thepeerage.com

Edit period(s): One time run

Estimated number of pages affected:

Namespace(s): Mainspace

Exclusion compliant (Yes/No): Yes

Function details: Flow of the bot (a rough sketch follows the list):

  1. Get all pages with the Cite web template where the 'publisher' parameter is ThePeerage's website
  2. Delete the 'publisher' parameter and add the link to 'website'
  3. Save
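
A rough sketch of that flow, assuming pywikibot and mwparserfromhell; the operator's actual code is to be published after the test run, and the search query below is an assumption for illustration.

    import pywikibot
    import mwparserfromhell
    from pywikibot import pagegenerators

    site = pywikibot.Site("en", "wikipedia")
    gen = pagegenerators.SearchPageGenerator(
        'hastemplate:"Cite web" insource:"thepeerage.com"', site=site, namespaces=[0])

    for page in gen:
        code = mwparserfromhell.parse(page.text)
        changed = False
        for tpl in code.filter_templates():
            if str(tpl.name).strip().lower() != "cite web" or not tpl.has("publisher"):
                continue
            publisher = str(tpl.get("publisher").value)
            if "thepeerage.com" in publisher.lower():
                tpl.remove("publisher")
                tpl.add("website", publisher.strip())
                changed = True
        if changed:
            page.text = str(code)
            page.save(summary="Move ThePeerage from |publisher= to |website= in {{cite web}}")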


Discussion

  • Can you run a database scan or otherwise check to get some measure of estimated number of pages affected? ~ Rob13Talk 17:39, 24 January 2018 (UTC)
    @BU Rob13: I just scanned 40,000 pages that have the cite web template; it shows that 20 of them will be affected. According to the template count, there is a total of 2,700,000 pages, which by ratio should give a total of around 1,350 pages being affected. However, this may be very inaccurate, as I've only scanned 1.5% of all pages with that template. I assume the bot will affect 500 - 5,000 pages. -- Gabrielchihonglee (talk) 08:23, 30 January 2018 (UTC)
    Approximately 2k. --Izno (talk) 19:21, 21 February 2018 (UTC)
    This search indicates a few more, without filtering for cite web. --Izno (talk) 19:24, 21 February 2018 (UTC)
    6597 as of right now. SQLQuery me! 17:39, 21 March 2018 (UTC)
    That's just a total count of the links though, no? We're just looking for where |publisher=The Peerage, which is what my search indicates. --Izno (talk) 14:22, 31 March 2018 (UTC)
    Fair point! SQLQuery me! 01:57, 3 April 2018 (UTC)
Approved for trial (30 edits). — xaosflux Talk 14:04, 13 April 2018 (UTC)

Bots that have completed the trial period

DeprecatedFixerBot 3

Operator: TheSandDoctor (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 22:35, Thursday, March 22, 2018 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): Python

Source code available: https://github.com/TheSandDoctor/Music-infoboxes-deprecated-param-fixer

Function overview: The bot goes through Category:Music infoboxes with deprecated parameters looking for {{Infobox Album}}, {{Extra chronology}}, {{Extra album cover}}, or {{Extra track listing}} (and, of course, any of their redirects/synonyms). If they are found, the bot prepends "subst:" to the title to trigger the substitution trick (as noted/recommended in all templates linked above) to resolve deprecated parameters.

Links to relevant discussions (where appropriate): N/A. Template:Infobox album, Template:Extra chronology, Template:Extra album cover, Template:Extra track listing

Edit period(s): A series of shorter runs until resolved

Estimated number of pages affected: 149,557 (approx)

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): Yes

Namespace(s): Wherever present (mostly mainspace)

Function details: The bot first generates an internal list of the page names within Category:Music infoboxes with Module:String errors. It then goes through Category:Music infoboxes with deprecated parameters. If the title of the next page to edit is within the errors category, the bot skips it and will not edit that page. If the page title is not in the list, the bot looks for {{Infobox Album}}, {{Extra chronology}}, {{Extra album cover}}, or {{Extra track listing}} (and, of course, any of their redirects/synonyms). If they are found, the bot checks whether they contain |released= (and that it contains a date, i.e. not "Unreleased"). In the event that it does, the bot proceeds directly to substituting the template(s) and checks its edit (see below).

In the event that the released parameter is not found, the bot checks for a date within the |this album=, |prev album=, and |next album= parameters. If a valid date format is found, it appends the appropriate year parameter and moves the found date into it (the exception, of course, being wikilinks, in which case the date is copied rather than moved). Once the incompatibilities with the substitution trick have been worked out, the bot prepends "subst:" to the title to trigger the substitution trick[a] (as noted/recommended in all templates linked above) to resolve deprecated parameters.

As a last resort the bot also checks its edit after making it. The bot will revert itself if the following two criteria are met:

  1. the page contains "[[Category:Music infoboxes with Module:String errors|"[b] after the bot's edit and
  2. the bot was the last user to edit the page

Notes

  1. ^ For example, "Infobox album" becomes "subst:Infobox album", which the "trick" then turns back to "Infobox album", only with all the parameters corrected. The same goes for all of the other templates mentioned above.
  2. ^ That is the pattern that it is looking for. After the '}}', it varies what is before the closing ']]', so it is left open-ended. The bot does not use regular expressions for this, so it only matches that part of the string (which is the part that matters).
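
The self-revert check described above could be sketched roughly as follows. This is a hedged illustration, not the bot's published code; the bot account name and the use of pywikibot are assumptions for the example.

    import pywikibot

    BOT_USERNAME = "DeprecatedFixerBot"
    ERROR_CAT = "[[Category:Music infoboxes with Module:String errors|"

    def revert_if_broken(page):
        """After saving, undo the bot's own edit if the substitution introduced a string error."""
        text = page.get(force=True)                         # re-fetch the page after the edit
        revisions = list(page.revisions(total=2))
        if ERROR_CAT not in text:
            return                                          # criterion 1 not met: page is clean
        if revisions[0].user != BOT_USERNAME:
            return                                          # criterion 2 not met: someone edited after the bot
        page.text = page.getOldVersion(revisions[1].revid)  # restore the previous revision
        page.save(summary="Reverting own edit: substitution introduced a Module:String error")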

Discussion

{{BAGAssistanceNeeded}} --TheSandDoctor Talk 23:16, 29 March 2018 (UTC)
Approved for trial (50 edits). Please run a trial and post back a summary of your results and a link to the diffs. — xaosflux Talk 13:28, 12 April 2018 (UTC)
@Xaosflux: Trial complete. Diffs (most recent 50, bot will not run again until this BRFA is complete) --TheSandDoctor Talk 16:44, 12 April 2018 (UTC)
Pages in Category:Music infoboxes with Module:String errors need manual attention. Edits like Special:Diff/836087082, Special:Diff/836087087, Special:Diff/836087114, Special:Diff/836087144, Special:Diff/836087156, Special:Diff/836087190, Special:Diff/836087214, Special:Diff/836087927, Special:Diff/836088186, Special:Diff/836088188, Special:Diff/836088194, Special:Diff/836088888, and Special:Diff/836088935, cause errors to appear in articles. Special:Diff/836088194 and Special:Diff/836087156 also missed {{infobox album}}. — JJMC89(T·C) 02:29, 13 April 2018 (UTC)
I would like to request a new trial (not limited by time) so that I can test improvements when I have made them (busy with finals now for next two weeks). My guess about the error is that it is due to the other albums not being WikiLinks, but that is just a guess at the moment that I will have to investigate further when I have the chance. Thank you for bringing those up JJMC89, I could add a check to ensure that the page being edited is not the same as one in that category. As for missing the infobox, I missed those two. Not sure what would have caused the issue, but I will look into it ASAP. --TheSandDoctor Talk 03:02, 13 April 2018 (UTC)
Approved for extended trial (50 edits or 60 days). @TheSandDoctor: I don't like to leave these in trial "indefinitely"; if you need more than 2 months to do this we can put this whole request on ice until you are ready. — xaosflux Talk 13:29, 13 April 2018 (UTC)
Thank you Xaosflux. Instead of editing the page, I had the bot in "dry-run mode" where it spits out its changes (what it would send to server to save) into a text file. I took (the relevant infobox) spit out of Annette (album) and this time, it found the infobox album (sandbox diff). I checked Art Pepper with Duke Jordan in Copenhagen 1981's infobox as well in the sandbox and it now works fine by the looks of things. The bot is also behaving properly on Anita O'Day & the Three Sounds now as well. I will live edit the three shortly. (For this task's script) the bot now checks what is present in Category:Music infoboxes with Module:String errors before editing pages. (cc @JJMC89:) --TheSandDoctor Talk 04:01, 14 April 2018 (UTC)
I know why it worked perfectly on those pages. During revert JJMC fixed the error. *mental facepalm*. Will dry run some more pages and get back to you. --TheSandDoctor Talk 04:07, 14 April 2018 (UTC)

Doing the subst trick on its own will not work with these infoboxes. Because fields like Last album, This album and Next album have been free text since day one, there are all kinds of weird and wonderful user formats, and errors are present that trip the subst command up, resulting in string errors. I've done a lot of work with AWB on the music infoboxes and the range of formats is ridiculous, e.g. I kept creating a new rule to handle each new formatting style I found for the Last/This/Next album fields. I eventually had to stop because there were so many of them that it just became quicker to manually edit them. I strongly urge you to either do a separate bot run to pre-parse the Last/This/Next album fields or make the bot re-check a page after an edit and revert itself if a string error has been introduced. The unacceptable situation is a bot run that is only doing a subst edit, because that will result in Category:Music infoboxes with Module:String errors being swamped with a wave of extra articles left for manual clean-up, and that isn't an acceptable solution. - X201 (talk) 07:47, 18 April 2018 (UTC)

@X201: I do not have time to work on this for the next week or so (finals wrapping up). I agree that swamping Module:String errors is not an acceptable solution; that is why I am working on a solution that doesn't use the subst trick for extra chronology. I will keep this page updated when possible, but I wouldn't expect any updates until some time next week, when finals are over and I don't have any more studying to do and can work on this more consistently. --TheSandDoctor Talk 20:38, 18 April 2018 (UTC)
@X201: I have written and tested a revert function, so the bot now has the ability to revert its latest edit, and I have also implemented the rough framework/outline that should allow the bot to review its edit after making it. The plan is to link the two and have it revert itself if:
  • A: spots [[:Category:Music infoboxes with Module:String errors{{!}}C]] in the page, and
  • B: was the last editor of the page
That could be a useful backup should the other preventative actions fail. Again, I will have more time to work on this next week and shall keep everyone updated. --TheSandDoctor Talk 03:08, 20 April 2018 (UTC)
Looks like a good plan. Good luck with the finals. - X201 (talk) 07:22, 20 April 2018 (UTC)

──────────────────────────────────────────────────────────────────────────────────────────────────── Trial complete. @Xaosflux: Cleanup/revert functionality has now been tested and incorporated into the bot. Description has been updated to reflect the bot procedural differences (still does the same task, just updated to reflect that it does it slightly more cautiously now) --TheSandDoctor Talk 05:24, 24 April 2018 (UTC)

RonBot 3

Operator: Ronhjones (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 19:30, Monday, February 19, 2018 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): Python

Source code available: User:RonBot/3/Source1

Function overview: 1. Add {{non-free reduce}} to images over the NFC guideline size 2. Add {{Non free image to be reduced}} (not yet written) to the uploader's talk page.

Links to relevant discussions (where appropriate): Wikipedia:Village_pump_(proposals)/Archive_145#Template:Non-free_reduce_bot

Edit period(s): Daily

Estimated number of pages affected: around 70 pages per day

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): Yes

Function details: The bot will search Category:All non-free media for images with fileres: >325 (105,625 pixels). It will check that the last upload is the big image (wiki search shows the biggest visual image on the page, not necessarily the last image), and after checking to make sure the file is suitable and not already tagged for reduction or for no reduction, it will add {{non-free reduce}}. The uploader will also get {{Non free image to be reduced}} added to their talk page to explain that the image has been tagged and what options are available. NB: That template does not yet exist - some text ideas are at User:Ronhjones/Sandbox4.
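
A rough sketch of that flow, assuming pywikibot (the working source is at User:RonBot/3/Source1); the search string mirrors the function details, while the notification wording and the skip checks are simplified illustrations.

    import pywikibot
    from pywikibot import pagegenerators

    site = pywikibot.Site("en", "wikipedia")
    gen = pagegenerators.SearchPageGenerator(
        'incategory:"All non-free media" fileres:>325', site=site, namespaces=[6])

    for page in gen:
        filepage = pywikibot.FilePage(page)
        text = filepage.get()
        lower = text.lower()
        # Skip files already tagged for reduction, or explicitly tagged not to be reduced.
        if "{{non-free reduce" in lower or "{{non-free no reduce" in lower:
            continue
        filepage.text = "{{non-free reduce}}\n" + text
        filepage.save(summary="Tagging oversized non-free file with {{non-free reduce}}")
        # Notify the uploader (the real bot also checks that the latest upload is the big one).
        uploader = filepage.latest_file_info.user
        talk = pywikibot.Page(site, "User talk:" + uploader)
        talk.text = (talk.text or "") + "\n\n{{subst:Non free image to be reduced|" + filepage.title() + "}} ~~~~"
        talk.save(summary="Notifying uploader that their non-free image has been tagged for reduction")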

Discussion

  • NB: This bot will only affect new uploads. I've done all the old files manually. Ronhjones  (Talk) 19:32, 19 February 2018 (UTC)


From a general bot standpoint, I agree with the spirit of a concern raised in the linked discussion:

If the idea is to alert the user, have the bot post a message on their talk page. {{non-free reduce}} goes far beyond just alerting the user, since it will also alert a bot to automatically resize the image within 24 hours. --Ahecht (TALK PAGE) 16:43, 7 February 2018 (UTC)

...with the main reason being that typically we'd like to avoid quick back-to-back bot operations (i.e., a bot action that prompts another bot to take action). That said, Wikipedia:Bots/Requests for approval/Fbot 9 did a more expanded version of this (now inactive), and spurred some disagreement. This would be targeted solely toward new uploads rather than everything, if I'm reading the new discussion correctly? It might also be an idea to go with a higher, clear-and-obvious/likely-no-errors threshold for the pixel area/sizes for one form of tagging (i.e., the one that prompts bot follow-up) and tag with something less intrusive for only-suspect ones (since there was some disagreement on what that threshold should be).

--slakrtalk / 01:15, 22 February 2018 (UTC)

If the reduction were not reversible, I would agree with the "bot to bot" concern; however, we do allow a full 7 days for a simple revert, which the uploader (having been notified) can easily do if necessary. Also, quite often it's not really possible to evaluate the reduction until it's actually done. New uploads only: currently all files in excess of 105,000 pixels that are not up for reduction are already tagged with {{non-free no reduce}} (921 files at present = just 0.15% of all non-free images). I think we must be careful not to set a different "bar" - if we tag all files over X pixels, then we need something else done for the range 105,000 to X pixels (maybe use a modified {{non-free manual reduce}}), otherwise there will be users who will pitch their image just under the new bar. I've seen this with the current system, where there has been a disproportionate amount of new uploads in the 100,000 to 100,500 range - obviously knowing the bot won't reduce them (about 60,000 images in this range - 10% of the whole non-free category). Ronhjones  (Talk) 19:02, 22 February 2018 (UTC)
{{BAGAssistanceNeeded}} Do we think this can go anywhere? Just for info, in the last 2 months I have had to manually tag 4105 oversized non-free images - an average of 70 files a day (at least I don't have to tell the uploader, so my simple javascript to add the template by clicking an extra tab is very useful). Checking my talk page, I have only had 4 queries about file reductions, within that 4105 file lot. Ronhjones  (Talk) 01:28, 12 April 2018 (UTC)
  • As someone who used to run a similar bot, I support this initiative. Ronhjones has already done 4000+ of these manually without issue; should be safe to move forward with a trial. -FASTILY 04:40, 12 April 2018 (UTC)
You will need to build out the user messaging before this can be trialed. If you need assistance with the wording, check in the areas that normally deal with this subject. — xaosflux Talk 22:39, 12 April 2018 (UTC)
@Xaosflux: I was initially thinking of more than one, but in the end I went for a simple system - {{Non free image to be reduced}}. Ronhjones  (Talk) 21:18, 13 April 2018 (UTC)
{{BAGAssistanceNeeded}} Any chance of moving forward with a trial on this one? Still manually tagging 70-odd images a day - as an aside, it's possible the use of this bot with its user notices "might" persuade editors to consider the size of the image before they upload it (probably a bit of a long shot...). Ronhjones  (Talk) 15:40, 22 April 2018 (UTC)
Approved for trial (50 edits). — xaosflux Talk 20:08, 22 April 2018 (UTC)
Trial complete. Run as trials of 1, 1, 5, 13, and 30 files. List of edited pages at User:RonBot/3/Trial. First trial had a minor glitch of not adding the signature to the talk page (so added manually, and edited code for next trial). The rest of the trials went smoothly. Source code page updated to last version used. Ronhjones  (Talk) 18:42, 23 April 2018 (UTC)

Muninnbot

Operator: Tigraan (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 19:19, Sunday, March 25, 2018 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): Python 3

Source code available: [1]

Function overview: Notifies posters when a Teahouse thread gets archived.

Links to relevant discussions (where appropriate): See Wikipedia:Bots/Requests_for_approval/Tigraan-testbot and the links from there

Edit period(s): Daily

Estimated number of pages affected: ~20/day

Namespace(s): User talk pages

Exclusion compliant (Yes/No): Yes (PWB)

Function details: See the previous BRFA (Wikipedia:Bots/Requests_for_approval/Tigraan-testbot). This is a bit of a procedural nomination, since the "new" bot has exactly the same functionality. We intended to add that functionality to User:HostBot (maintained by Jtmorgan) but we have not done so, and I also have plans/dreams to extend the functionality to pages other than the Teahouse (and it would not make much sense to perform those duties from a Teahouse-dedicated bot).

The previous test run was fine but I have done a significant refactoring of the code (tested, of course, but you never know what can happen). Furthermore, it would be my first bot on Toolforge, and I am clearly not above a screwup when submitting the jobs on the grid and running the script from a different environment. So I would request a trial run similar to last time, even if the functionality has not changed.

Discussion

  • Is User:Tigraan-testbot still going to making edits and need a bot flag? — xaosflux Talk 20:52, 25 March 2018 (UTC)
    No, please remove it. I intended to use the "testbot" account for trial runs etc. but a quick glance at the Toolforge setup makes me think it is more trouble than it's worth. TigraanClick here to contact me 10:20, 26 March 2018 (UTC)
     Done, @Tigraan: if you want a bot flag on testwiki, just drop me a note at testwiki:User_talk:Xaosflux and I'll flag it over there for you. — xaosflux Talk 14:05, 26 March 2018 (UTC)
  • Please put a task description on User:Muninnbot and you probably should redirect its talk page to you. — xaosflux Talk 20:52, 25 March 2018 (UTC)
    Kind-of done for the former (the short task description is in, but I will absolutely need to write a proper documentation, what I wrote in the github repo is out of date...), done for the latter. TigraanClick here to contact me 10:20, 26 March 2018 (UTC)
Approved for trial (50 edits or 30 days). please post back results here after trialing. — xaosflux Talk 14:07, 26 March 2018 (UTC)
{{OperatorAssistanceNeeded}} how was the trial? Please post a summary and diffs. — xaosflux Talk 14:06, 13 April 2018 (UTC)
Looks like the trial never occurred, do you intend to move forward on this still? — xaosflux Talk 14:07, 13 April 2018 (UTC)
Yes, sorry. I was out of home for a few weeks, but should be able to trial this weekend. TigraanClick here to contact me 14:11, 13 April 2018 (UTC)
OK, I will not have the trial results this weekend, due to a pesky little issue with PWB, but I should be able to do that this week (the big technical hurdle was to install PWB on Toolforge, but I got this). TigraanClick here to contact me 20:03, 15 April 2018 (UTC)
OK, added more days to the trial approval. — xaosflux Talk 12:56, 17 April 2018 (UTC)
  • Trial complete. - 34 edits to user talk pages outside its own, if we exclude an embarrassing mistake where I created a random page (which hopefully was only seen by the deleting administrator). Not quite 50, but I wanted to avoid going overboard as last time. I have reviewed ~20 of them by hand so far and they all look good (Special:Diff/837740252 looked sketchy at first since the thread is signed by someone else, but I believe that is SineBot's fault). Typical diff: the correct notification is posted and a few cats are reorganised on the page (that's PWB's add_text script).
I cannot explain why the April 22 edits are all marked minor, while the April 21 edits are not, since I did not touch the code in between the two runs. If that is a problem I can look at PWB's options. (I have kicked it off the cron for now.) TigraanClick here to contact me 20:56, 22 April 2018 (UTC)
Suggestion: Change L993-L997
    post_text = '=={sn}==\n{tta}'.format(sn=sn, tta=text)

    # Caution: will not ask for confirmation!
    add_text.add_text(page, post_text, summary=es,
                      always=True, up=False, create=True)
to page.save(text=text, summary=sn, section='new', minor=False, botflag=False). (example edit) This will avoid making other changes to the user's talk page; however, you will lose the ability for the edit summary to be different from the section name. — JJMC89(T·C) 23:28, 22 April 2018 (UTC)
I incorporated JJMC89's suggestion above in the code (having no control over the edit summary is not an issue; I set botflag to True, though). After that plus a bit of unimportant code tweaking, a dry run showed that today's batch is 14 notifications, so I could run it without going over the trial limit (34+14<50). Results here. Manual inspection revealed no discrepancies. The worst I could see is that a human editor would probably have refrained from notifying for that archival (not really a question, and a veteran editor), but there was no simple way to avoid that (well, except avoiding notifications for users with more than X edits or the like, but consensus was somewhat against such a scheme when we designed the bot a year or so ago). TigraanClick here to contact me 19:23, 23 April 2018 (UTC)
Another good suggestion at User_talk:Tigraan#No_time_tag_on_Muninnbot_notifications by David Biddulph, implemented by editing the template rather than the bot code (hence I do not believe any testing is necessary). TigraanClick here to contact me 10:19, 24 April 2018 (UTC)

RonBot 4

Operator: Ronhjones (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 23:32, Friday, March 9, 2018 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): Python

Source code available: User:RonBot/4/Source1

Function overview: Currently DatBot6 reduces all tagged JPG and PNG files in Category:Wikipedia non-free file size reduction requests. RonBot2 does the same for GIF files. SVG files are, by default, sorted into Category:Wikipedia non-free file size reduction requests for manual processing (along with TIFs and PDFs). This bot will reduce just the SVG files in that category.

Links to relevant discussions (where appropriate):

Edit period(s): Daily Run, expect 5 hours max

Estimated number of pages affected: About 2500 now, then a few files each week

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): Yes

Function details: Uses a modified version of the code that DatBot6 and RonBot2 are using, treating the downloaded file as text and applying the methodology of User:Ronhjones/SVGreduce. There are a few files which do not easily work (an alternative name for width ("wide"), and when there is "width=XX%"); these are skipped. I would like to do the same as RonBot2 - modify {{non-free reduce}} to move SVGs back to the parent category, and set the skipped files to the "manual reduce" category.
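
A hedged illustration of the nominal-size change summed up at User:Ronhjones/SVGreduce, not the bot's actual code: it assumes the root <svg> tag carries a viewBox and that the first width=/height= attributes in the file belong to that tag (files using "wide=" or percentage sizes are skipped, as noted above). The 100,000-pixel target mirrors the figure used in the discussion.

    import re
    import math

    TARGET_PIXELS = 100_000

    def reduce_nominal_size(svg_text, width, height):
        """Scale the nominal width/height so width*height <= TARGET_PIXELS; rendering is
        unchanged because the viewBox still defines the drawing's coordinate system."""
        if width * height <= TARGET_PIXELS:
            return svg_text                                   # already within the guideline
        scale = math.sqrt(TARGET_PIXELS / (width * height))
        new_w, new_h = width * scale, height * scale          # keep the aspect ratio, no rounding
        svg_text = re.sub(r'width="[^"]*"', 'width="%g"' % new_w, svg_text, count=1)
        svg_text = re.sub(r'height="[^"]*"', 'height="%g"' % new_h, svg_text, count=1)
        return svg_text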

Discussion

  • @Ronhjones: This is uncontroversial (well, no more controversial than image reductions required by policy always are), but I was under the impression automating this was somehow technically difficult. Have you used this methodology before successfully? ~ Rob13Talk 04:40, 22 March 2018 (UTC)
I have to thank User:JoKalliauer for enlightening me on how the initial svg tag is formatted and how to easily change the nominal size. That's all summed up in User:Ronhjones/SVGreduce. I have done about 400-odd files manually using this methodology, editing on-wiki with User:Rillke's Commons script for editing SVG files (see my contributions from 26th Feb and earlier). I have also run the bot (with saves only to my hard drive) and only found a few files that caused a problem; they are now excluded from processing...
  1. I found someone had used "wide=" instead of "width=".
  2. There were a few with width="100%" height="100%"
The code is set to skip those. I think the best plan would be to put the SVG files back into Category:Wikipedia non-free file size reduction requests, and any file the bot does not like gets moved to Category:Wikipedia non-free file size reduction requests for manual processing, in the same way RonBot2 does for GIF files - that's not in the current source code yet, but can be easily added. Ronhjones  (Talk) 19:18, 22 March 2018 (UTC)
@Ronhjones: If you have problematic files where you don't know how to change the preview size without changing the content of the SVG, message me with a link to an example and I am happy to try to take care of it. Currently most pics in the manual category are in the form width="[[digit].]+(px|mm|pt|pc|cm|in)*" height="[[digit].]+(px|mm|pt|pc|cm|in)*", which should be changed to width="www" or height="hhh" if viewBox= is in the original SVG. JoKalliauer (talk) 11:05, 24 March 2018 (UTC)
Thanks for that. As it stands, if we can get permission to run the bot, from the dummy runs done - I estimate that <0.5% of the files will be skipped for a proper manual evaluation. Ronhjones  (Talk) 18:00, 24 March 2018 (UTC)
Added "pc" and "cm" dimensions to the source code. Ronhjones  (Talk) 14:45, 25 March 2018 (UTC)
{{BAGAssistanceNeeded}} If this is uncontroversial, then how about a trial? Ronhjones  (Talk) 22:36, 4 April 2018 (UTC)

Approved for trial (50 edits). ~ Rob13Talk 01:16, 5 April 2018 (UTC)

Trial complete. - Results documented at User:RonBot/4/Trial Ronhjones  (Talk) 22:06, 5 April 2018 (UTC)
A) Because files have to be reuploaded and saved completely, there should be basic repairing of damaged files at the same time, like the following (I just looked at the first 10 reported files):
  1. replace xlink:href="data:image/jpg;base64, with xlink:href="data:image/jpeg;base64, as in File:CRC_Logo.svg
  2. fake SVGs (SVGs that only contain one raster image) should be exported as JPEG or PNG, as in File:Bristol_Pitbulls_2010_Logo.svg (to File:Bristol_Pitbulls_2010_Logo.png)
B) About the files which didn't work: if there is a viewBox="[-[digit]. ,]+", remove all height="[[digit].]+(px|mm|pt|pc|cm|in)*" and remove as well all width="[[digit].]+(px|mm|pt|pc|cm|in)*", and insert width="www" height="hhh"
In this case it should be changed from
<svg version="1.1" id="Layer_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" wide="500" height="442.4"
         viewBox="0 0 500 442.4" enable-background="new 0 0 500 442.4" xml:space="preserve">
to
<svg version="1.1" id="Layer_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" wide="500"
         viewBox="0 0 500 442.4" width="50" height="44.24" enable-background="new 0 0 500 442.4" xml:space="preserve">
C) Generally, keep the aspect ratio (also when scaling the file). If you set a desired width, I would not round the height to an integer.
D) In my opinion the "target pixel count" is picture-dependent: e.g. File:Cycling_(track)_2018_Commonwealth_Games.svg should have a lower target pixel count than File:The_Guardian_al-Qaeda_recruitment.jpg
PS: I'm quite negative about this because, in my opinion, to meet fair use the precision of SVGs has to be reduced (or they have to be converted to PNG), not the size of the preview. (This is independent of whether you do it manually or with a bot.)
JoKalliauer (talk) 18:19, 6 April 2018 (UTC)
A) Repairing. Nice idea, but I think it is better done with a different bot; this one will only work on currently tagged oversized SVGs and new uploads. There are thousands of existing SVGs which won't be touched. We'd need a bot to run through ALL the files (maybe the next one for me). There are 20,000 SVGs in total on en-wiki; I'm only going to look at 2,500.
B) Agree, but my estimate is that there will be so few files, it's an easy manual task.
C) No problem, I'll check the code so there is no rounding - see User:RonBot/4/DummyRun.
D) Agree, but not something the bot can easily determine.
E) I know that, and I know other editors have the same opinion on how reduction is done, this is just one facet. My suggestion would be to allow editors to apply a (new) tag where they think that the precision of SVGs has to be reduced. However, I know there are SVGs (not oversize) that are currently tagged with {{non-free reduce}} - in the hope that someone would do such a process - I've not seen any movement in that direction, but there are so many big files tagged, that they are rather lost in the big list. Ronhjones  (Talk) 23:59, 6 April 2018 (UTC)
  • @Ronhjones: Trial looks fine. Before this is approved, could you edit the text of {{Non-free reduce}} to give instructions to editors who want their file reduced by a human rather than a bot? In some cases, as noted above, this may be desirable for an SVG (and other images as well). ~ Rob13Talk 13:37, 13 April 2018 (UTC)
  • @BU Rob13: I have...
  1. Adjusted the {{non-free reduce}} to mention vector editing programs, state manual reduction is an allowed option, and explained that svg will only be reduced by changing the nominal size, and not reducing the number of vectors.
  2. Created {{Non-free manual svg reduce}} - as a usable template where editors require a reduction of vectors to conform to "low res" can apply to any image.
  3. Fixed the "switch" to move SVGs back to the parent template (current reducing bots are set to skip SVGs anyway), then this bot can pull the SVGs from Category:Wikipedia non-free file size reduction requests and put the few it has to skip into Category:Wikipedia non-free file size reduction requests for manual processing just as RonBot2 does with GIF files. I just need to insert that bit of code from RonBot2 into this bot.
Hope that sounds OK. Ronhjones  (Talk) 19:40, 13 April 2018 (UTC)
Code changed and saved in the source page Ronhjones  (Talk) 19:53, 13 April 2018 (UTC)
  • All of the original SVG files have been deleted so I can't see what the bot has changed, but the discussion suggests that the bot operator has misunderstood what reducing a file means and that none of the files were in fact reduced. Reducing means removing non-free content, but it looks as if the operator simply changed the <svg> tag. That tag does not contain any non-free content, so changing the tag does not mean reducing or increasing the file, only changing how the non-free content is presented. If the files were too big before the bot edited them, then they are still too big. In other words, {{non-free reduce}} should not have been removed from the files, as it seems that the bot didn't reduce anything at all. --Stefan2 (talk) 19:57, 13 April 2018 (UTC)
I personally tagged the vast majority of these files, many, many months ago. They were tagged because they have a nominal size way in excess of the guideline size, thus making the wiki software generate a png image that appears very large on the image page, which is unacceptable. As I explained in point (2) above, I have made another template ({{Non-free manual svg reduce}}), which can be used to indicate that you (or other editors) think that more editing work is necessary to achieve proper non-free status. However that is not something that can be done by a bot, and needs a personal touch. If you don't like my wording of ({{Non-free manual svg reduce}}), then you are free to alter it. This bot was always about reducing the nominal size. Ronhjones  (Talk) 17:27, 14 April 2018 (UTC)
@Stefan2: I note you tagged one file, after I did the nominal size reduction manually (there were a couple of others, but deleted), so I have changed the template to {{non-free manual svg reduce}} for you (File:Arunachal Pradesh Seal.svg). Ronhjones  (Talk) 22:37, 14 April 2018 (UTC)
Stefan2 might also want to note that the documentation in {{non-free reduce}} says (and I did not write it - it was added 08:16, 17 December 2010‎) For non-free SVG file, add {{Non-free reduce}} if the vector image displayed at excessive nominal size. Ronhjones  (Talk) 23:42, 15 April 2018 (UTC)
The template looks wrong. The text seems to have been added in Special:Diff/343420609 but there's no edit summary so I don't know if there was a discussion about the wording at that point. Maybe someone never discovered it as the type parameter rarely is used. I'm not sure why the template even has a type parameter in the first place since most files are reduced by a bot within a day after the file was tagged, and the bot doesn't pay any attention the parameter.
WP:NFCC#3b is about removing non-free content, not presenting the same non-free content in a different way. The nominal size in the <svg> tag is just an arbitrary number - it can be set to anything and only affects how the file is presented on the file information page. In my opinion, changing this arbitrary number does not reduce or increase the file's "size" but just looks like a redundant edit which produces unnecessary clutter on people's watchlists. For pixel graphics, reducing a file means merging two or more pixels into one single pixel, subject to the condition that you hardly should notice a difference if the image is used as a small illustration for an article. For vector graphics, the logical reduction is then that two or more vectors are merged into a single vector, subject to the same condition that it is hard to notice a difference when the file is used in a typical Wikipedia article. --Stefan2 (talk) 23:08, 20 April 2018 (UTC)
I agree with Stefan2, and have therefore processed the first files in this category using a script (cleanupSVG).
Of course, you could decrease the file size a little bit more (but generally, reducing further would only lead to a negligible file-size reduction compared to the previous one); the values used in my script are a good trade-off between precision and file size. This is comparable to a resolution of 0.1 megapixels, so generally there are only minor visible changes if you zoom the picture to full screen. JoKalliauer (talk) 14:33, 21 April 2018 (UTC)
@JoKalliauer: OK, can I therefore suggest that Stefan2 or yourself take over this BRFA? For several reasons...
  1. I think the bot owner should be able to maintain the code as issues arise; it's clear that that would be impossible for me - it's not written in a language I know, nor do I exactly understand what it actually does.
  2. I'm retiring in six weeks. I intend to move my current bots over to the Wikimedia Toolforge so that they are not relying on my PC, as I may not be at home to fix any issues. I can then use my 12-inch tablet.
Can I also ask what the code will do with an embedded raster graphic? I was considering another bot to highlight those - see Wikipedia:Village_pump_(proposals)#Proposed_Bot_for_tagging_BadSVG_files. Do also remember that there are over 12,000 non-free SVG files below the nominal size of 100,000 pixels - I know some editors have made their complex logo a mere 20x20 in nominal size, knowing it makes zero difference on enlarging. Ronhjones  (Talk) 15:27, 21 April 2018 (UTC)
@Ronhjones: My script basically calls 3 different SVG optimizers (scour, svgo, svgcleaner). If you want to see how to optimize SVGs: SVGOMG is an easy optimiser with a simple graphical interface (works in every common browser). JoKalliauer (talk) 16:03, 21 April 2018 (UTC)
@JoKalliauer: I'm not a computer programmer, just a mad organic safety and development chemist. I cannot implement and maintain your script in my code. If you want this, then someone else will have to run the bot. Ronhjones  (Talk) 17:03, 21 April 2018 (UTC)
I'm a civil engineering PhD student, so I'm not a computer programmer either. I just call three optimisers with options which I think fit Wikipedia. Since they are quite complicated, I think a file-size reduction is too "complicated"/"individual" for a bot, and some optimisations need a lot of computer resources. In my opinion, reducing the precision should not be done with a bot, and reducing the preview size (which can be done with your bot) is in my opinion useless. JoKalliauer (talk) 05:37, 22 April 2018 (UTC)
It's not useless, as it does what {{non-free reduce}} currently specifies (even if Stefan2 does not agree, it's been like that for 8 years). It does stop the wiki software making an unacceptably large PNG (the wiki software assumes that not all browsers will show an SVG, and makes PNG images where they will be shown) - having non-free PNGs of 1000x1000 can never be acceptable. Also, I tagged 90% of the current "oversized" SVGs solely on the basis of the nominal size. If RonBot3 gets approval it will do the same. I agree that a "superbot" to do a full optimisation is very unlikely, but that does not stop this one moving on until someone writes it. Such a "superbot" would have to run through the entire set of SVGs, as there are plenty below the 100,000 nominal size that would need treatment (currently 14,615 SVG files on en-wiki). Ronhjones  (Talk) 15:27, 22 April 2018 (UTC)
Summing up - Stefan2 and JoKalliauer would obviously like the bot to do more, but as they say, it's unlikely to be doable with a bot. So going back to the original idea of doing what {{non-free reduce}} has stated for the last 8 years - For non-free SVG file, add {{Non-free reduce}} if the vector image displayed at excessive nominal size. The trial is OK, we just need some final approval. Thanks. Ronhjones  (Talk) 15:32, 22 April 2018 (UTC)

images should be rescaled as small as possible to still be useful as identified by their rationale, and no larger

It is likewise unlikely that a bot can derive, case by case, the maximum allowed resolution.
Write a guideline for reducing the preview size for SVGs; if it gets into Category:Wikipedia_content_guidelines, then your bot can start immediately. But since SVGs don't have any resolution, Wikipedia:Non-free_content#Image_resolution is not valid for SVGs as long as the preview sizes of vector graphics are not explicitly included in that guideline. Discuss it with the community; if they agree with it, write it explicitly into the guidelines, and then approval would be great. But applying for approval for a bot which is not supported by the community does not make sense. Here is the place for discussing the approval, not how to handle NFC.
Conclusion: the community first has to support the edits; do not run a bot against the community. (So first fix the guidelines.) JoKalliauer (talk) 16:07, 22 April 2018 (UTC)
It is valid (as I said earlier, we are generating unacceptably oversized PNGs - or do you consider that acceptable?). If you wish to change what was set up 8 years ago, then you are free to start a new discussion at Wikipedia talk:Non-free content. I am just following the system that is in place. Ronhjones  (Talk) 22:24, 22 April 2018 (UTC)
  • Pinged on this on my talk page, I think this is a reasonable task, though it's not as critical compared with the other bots. This is basically cutting down the size of the pre-cached PNG previews of an SVG on the appropriate File: page, as well as optimizing the SVGs (cropping away empty space, etc.) We can't stop people from dropping a 10,000 px wide SVG image on a page due to the nature of SVG, but we can minimize the cached image sizes to reasonable small sizes. Or another way to put it - the MediaWiki software appears to require us to generate these PNGs for an SVG, so by minimizing the displayed resolution, we better meet NFC. --Masem (t) 22:33, 23 April 2018 (UTC)

JarBot 3

Operator: جار الله (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 09:14, Thursday, March 29, 2018 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): python

Source code available: Pywikibot

Function overview: Add WikiProject banners to the talk pages of redirects, based on articles that are in a WikiProject and have redirects whose talk pages are not yet in Category:Redirect-Class WikiProjectname articles

Links to relevant discussions (where appropriate):

Edit period(s): maybe daily

Estimated number of pages affected: Depends on the project

Namespace(s): 1

Exclusion compliant (Yes/No): yes

Function details:

Discussion

Add WikiProject banners to the talk pages of redirects, based on articles that are in a WikiProject and have redirects whose talk pages are not yet in Category:Redirect-Class WikiProjectname articles.--جار الله (talk) 09:14, 29 March 2018 (UTC)

{{BotTrial}} I think the description above may need a bit more clarity, but some examples will help. Please do at least 2 different samples and post the results here. — xaosflux Talk 13:32, 2 April 2018 (UTC)
  • edits
  • What the bot will do (a rough sketch follows the list):
  1. Remove #REDIRECT [[thename]] and add {{Talk page of redirect}}
  2. If {{WikiProject banner shell}} is on the page, the banner (for example {{WikiProject Medicine}}) will be placed inside it, like {{WikiProject banner shell|collapsed=yes|1= {{WikiProject Medicine |class=redirect |importance=NA}} }}
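
A hedged sketch of those two steps, not the operator's actual code; the WikiProject Medicine banner is just the example used in the trial edits, and the existing-page check reflects the later discussion.

    import pywikibot

    BANNER = "{{WikiProject Medicine|class=redirect|importance=NA}}"

    def tag_redirect_talk(talk_page):
        """Tag an existing redirect talk page with a WikiProject banner (simplified)."""
        if not talk_page.exists():
            return                                  # do not create pages just to tag them
        text = talk_page.get()
        if "{{wikiproject medicine" in text.lower():
            return                                  # already tagged
        if text.lstrip().lower().startswith("#redirect"):
            # Step 1: replace the talk-page redirect with {{Talk page of redirect}}.
            text = "{{Talk page of redirect}}\n"
        # Step 2 (simplified): a full implementation would nest the banner inside an existing
        # {{WikiProject banner shell}} rather than simply prepending it.
        talk_page.text = BANNER + "\n" + text
        talk_page.save(summary="Adding {{WikiProject Medicine}} banner to redirect talk page")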

{{BotTrialComplete}}.--جار الله (talk) 21:07, 2 April 2018 (UTC)

Hi جار الله, can you point me the diffs for this trial, I'm getting lost in your bot's contributions with other running tasks. — xaosflux Talk 00:59, 7 April 2018 (UTC)
  1. [2]
  2. [3]
  3. [4]
  4. [5]
  5. [6]
  6. [7]
  7. [8]
  8. [9]
  9. [10]
  10. [11]
  11. [12]
  12. [13]
  13. [14]
  14. [15]
  15. [16]
  16. [17]
  17. [18]
  18. [19].--جار الله (talk) 02:19, 7 April 2018 (UTC)
@جار الله: thank you the diffs look fine, as far as your targets - this is very open ended. How do you plan to select projects to do this on? For example, by request of the project, just ones you think should be done, every one you can find, something else? — xaosflux Talk 13:26, 12 April 2018 (UTC)
@Xaosflux: My plan is to work on all projects, but if you think a project is not important, the bot will skip it.--جار الله (talk) 00:32, 13 April 2018 (UTC)
Running without coordinating with the affected wikiprojects could have unwanted impacts. For your example ones, have you asked Wikipedia_talk:WikiProject_Medicine about this or to look at it? At the very least I think a requirement to post at the associated wikiproject for at least a week should be done for any of these runs, with silence indicating consent. What do you think about that, and can you ask the wikimedicine project to look over this as well? — xaosflux Talk 03:14, 13 April 2018 (UTC)
@Xaosflux: Sounds good, I will put a request into the project and wait for a week before running, and I will ask the wikimedicine project to look over, thanks.--جار الله (talk) 08:59, 13 April 2018 (UTC)
I checked the first two examples above and the example posted at WT:MED (Talk:(S)-equol). The documentation at {{Talk page of redirect}} includes "Place this template on the talk pages of redirects when the content and history of the talk pages and/or the history of the subject pages are to be preserved." In the three cases I checked, the bot created the talk page. That seems to contradict the template documentation. I think I have seen cases where talk pages have been created for no reason other than to tag them, and those talk pages were then speedy deleted. Johnuniq (talk) 10:07, 13 April 2018 (UTC)
This might be interesting and desirable for some groups, but WikiProject Medicine discussed this (years ago) and decided not to tag redirects (with the occasional exception). Please don't do this. WhatamIdoing (talk) 16:23, 13 April 2018 (UTC)
@جار الله: this is heading for a deny based on lack of editorial consensus - was there some prior discussion with projects that wanted this function? — xaosflux Talk 22:44, 13 April 2018 (UTC)
@Xaosflux: It seems WikiProject Medicine does not approve of the task. What if the task is approved in general, subject to my putting a request to the project and waiting for a week or 10 days with no objection before running, and the bot not adding {{Talk page of redirect}}?--جار الله (talk) 11:39, 15 April 2018 (UTC)
جار الله in general I'm not seeing a technical issue; however, one of the other concerns brought up above is that there is really no need to create pages for the sole purpose of putting this template on them. So this task really only seems useful when a project wants it, and where the pages already exist for some reason. Can you adjust your code to avoid creations? — xaosflux Talk 15:53, 15 April 2018 (UTC)
@Xaosflux: The code now works on existing pages only, and in that case I think adding {{Talk page of redirect}} is okay.--جار الله (talk) 08:17, 17 April 2018 (UTC)
I think if it's only adding this to talk pages which already exist, it would be OK. Also, make sure the projects you're running this on actually have an article classification called "redirect". For instance, I know WikiProject Libraries does not have one such designation; it would show up as "NA" instead of "Redirect" in the template line, and therefore this bot would be unneeded. I also think the 7-10 day waiting period might run into some problems if the project isn't terribly active, and then it might be a month later when someone goes to update the pages and goes "whoa, when did 15,000 pages get added to this?" Just some thoughts! SEMMENDINGER (talk) 19:27, 17 April 2018 (UTC)
@Semmendinger: How long a wait is enough, in your opinion?--جار الله (talk) 13:12, 21 April 2018 (UTC)
It's really hard to say. While I think the bot is awesome and I wish I had the technical know-how to create and operate a bot that could process tasks like these, I can't see the net positive of tagging tens of thousands of potential pages that don't need strict maintenance on a WikiProject. To answer your question though, if the WikiProject is active (according to its banner on their main page) I'd give it 2 weeks. If it's semi-active I'd give them a month to 6 weeks. If it's inactive then this bot won't serve a purpose there anyway. I really hate to be a bummer, I can only imagine how long it took to create this program, but unless a Project specifically asks for it I don't think it should be employed just because no one voiced an opinion against it in a short period of time. SEMMENDINGER (talk) 13:21, 21 April 2018 (UTC)
I get your point, and I agree with it; in that case I will wait till I get approval from the WikiProject.--جار الله (talk) 15:15, 21 April 2018 (UTC)
@Semmendinger: I'm following your reasoning here - unless a project actually wants this to run, it may not be a net positive. جار الله have any projects asked for this to happen? — xaosflux Talk 15:03, 21 April 2018 (UTC)
@Xaosflux: So far I haven't asked, but within a week I will put several requests to some active projects.--جار الله (talk) 15:15, 21 April 2018 (UTC)
A user has requested the attention of the operator. Once the operator has seen this message and replied, please deactivate this tag. (user notified) OK, come back when you have someone who wants to try a full run on their project, link to the discussion here, and we can approve the trial then. — xaosflux Talk 18:51, 21 April 2018 (UTC)

InfoboxBot

Operator: Garzfoth (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 04:36, Tuesday, October 24, 2017 (UTC)

Automatic, Supervised, or Manual: Supervised

Programming language(s): Python and mwparserfromhell

Source code available: Yes (available at User:InfoboxBot/wikipedia_edit_pages_clean.py). Originally listed as not available, with that page later added as an example of the original functionality.

Function overview: This bot would assist me in fixing various widespread yet minor issues with non-standard infobox parameters in articles (primarily focused on issues with Template:Infobox power station and possibly Template:Infobox dam).

Links to relevant discussions (where appropriate): I do not believe that this bot would be controversial - any changes made by it are going to be uncontroversial minor changes.

Edit period(s): As needed (it'll vary significantly). It will not be anywhere near continuous.

Estimated number of pages affected: There are ~2500 articles using infobox power station and ~3500 articles using infobox dam. The number of articles out of these that would be affected by my bot is unknown. For now, let's call it an absolute upper limit of ~6000 affected articles.

Namespace(s): Mainspace only.

Exclusion compliant (Yes/No): Yes (added Apr 4 2018). Originally No, as in my experience articles with infobox power station or infobox dam never use the bots template in the first place; I was not averse to implementing detection for it in the future, but saw no need for it unless the bot's scope broadened to other infoboxes.


Function details: I have already scraped all articles with infobox power station and infobox dam in them, placed the infobox data from said articles into a MySQL database, and am using analysis of that dataset/database to discover issues that can be fixed via this approach. Here is a good example of what kind of issues this bot can help me fix:

  • For infobox param "th_fuel_primary": There are 153 articles using the term "[[Coal]]", 90 articles using the term "Coal", 80 articles using the term "Coal-fired", and 14 articles using the term "[[Coal]]-fired". This bot can automatically change the value of "th_fuel_primary" to "[[Coal]]" for the 184 articles that use equivalent terms, resulting in 337 articles that all use the same correct, homogeneous terminology and are all wikilinked correctly.

So yeah, this is essentially just a specialized high-speed-editing/assisted-editing tool. As far as I understand, it is still possibly classified as a bot and thus I have to submit it to BRFA as I am doing now. I did run this on my personal account for a single run (on the infobox param "status" - changing the non-standard value "Active" into "O" (expands to "Operational") for 185 articles) before realizing that it may be classifiable as a bot (and that I was also performing operations too fast if the bot action speed limits applied - I had quite a bit of trouble locating the actual documentation on this so I had initially assumed that it was the same as the API itself and set a 1s + overhead delay between requests) and stopping. So if you want a demonstration of what this bot does in the real world, just look at the long string of commits in my history with the edit summary "Automated edit: fixing infobox parameter "status"".
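For illustration only, here is a minimal sketch of the kind of parameter clean-up described above. This is not the operator's published script (see User:InfoboxBot/wikipedia_edit_pages_clean.py for that); it assumes pywikibot for page access, uses a hypothetical normalize_fuel() helper and a placeholder page title, and covers only the coal example:

    # Hypothetical sketch only -- not the operator's actual code.
    import mwparserfromhell
    import pywikibot

    # Known non-standard values mapped to the preferred, wikilinked form.
    FUEL_FIXES = {
        "Coal": "[[Coal]]",
        "Coal-fired": "[[Coal]]",
        "[[Coal]]-fired": "[[Coal]]",
    }

    def normalize_fuel(page_text):
        """Return page_text with |th_fuel_primary= normalised where possible."""
        code = mwparserfromhell.parse(page_text)
        for tpl in code.filter_templates():
            if not tpl.name.matches("Infobox power station"):
                continue
            if tpl.has("th_fuel_primary"):
                value = tpl.get("th_fuel_primary").value.strip()
                if value in FUEL_FIXES:
                    tpl.add("th_fuel_primary", FUEL_FIXES[value])
        return str(code)

    site = pywikibot.Site("en", "wikipedia")
    page = pywikibot.Page(site, "Example power station")  # placeholder title
    new_text = normalize_fuel(page.text)
    if new_text != page.text:
        page.text = new_text
        page.save(summary='Automated edit: fixing infobox parameter "th_fuel_primary"')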

Discussion

Could the bot implement some of User:Headbomb/sandbox (expand collapsed sections)? Headbomb {t · c · p · b} 11:07, 24 October 2017 (UTC)
1.a/1.c crash my scraping script, so I’ve already manually fixed those in all affected articles using either infobox dam or infobox power station. I can look into building a new script to locate and automatically fix those types of issues in other infoboxes, it would be an interesting problem to try to solve automatically, but no promises on that since it might not be doable automatically with high confidence.
For the rest, yes, the bot can do at least some of them if not most or all of them (and in fact I was already planning on implementing a number of those items), although it’s going to require additional work to implement them, and my first priority is still going to be fixing the more substantial issues. Garzfoth (talk) 17:36, 25 October 2017 (UTC)
I would greatly appreciate getting a response to at least the specific question of if this use is classified as a bot or not (i.e. does it actually need approval as a standalone bot through BRFA or can I just run it on my personal (or InfoboxBot?) account(s)?)... I have been waiting two and a half weeks for another response and it's getting a bit frustrating. I would prefer to have an account with the bot flag to run it on simply because of the expanded API limits available in that case (and being able to edit without unnecessarily cluttering up anyone's watchlist, since I could then flag my edits as bot-made which allows them to be easily hidden by users if desired), but I do not by any means need the bot flag to operate the program. Garzfoth (talk) 19:58, 11 November 2017 (UTC)

{{BAGAssistanceNeeded}}

It has been over a month since the last response. I would greatly appreciate a response to at least the question highlighted in bold above (is this use even classifiable as a bot or can I just run this as a script on my personal account without approval required?). Thanks! Garzfoth (talk) 21:17, 26 November 2017 (UTC)

From the BOTPOL definitions, the fact that you aren't personally approving each edit means that this is probably a bot, and would likely need to be approved here. It shouldn't be controversial, though. Going through the edits you made (convenience link!), the random sample that I picked all look good. It would be nice if you had some examples of the Coal change, as opposed to just the "Active" to "O" change, however. Even better would be if the code were somewhere BAG members and others could review it - you don't even have to put it on GitHub, as it's just as readable in the bot's userspace.
One important change you should make is the edit frequency: 1 second between edits is too low. For nonessential maintenance tasks, the usual delay is 10 seconds (source: WP:BOTREQUIRE). I'm not a BAG member myself, so I can't grant a trial; so I'll leave the tag here. You should probably fix the rate thing before the trial, though. Enterprisey (talk!) 13:40, 5 December 2017 (UTC)
Thanks for the feedback! I am aware of the editing frequency issue (it's specifically mentioned in my BRFA if you missed it), I would of course change that to 10 seconds between edits for a production run, as I said I only operated that fast in the first place because I originally could not locate the correct documentation on bot policies and had assumed that the general API rate limits applied.
I can't exactly give more precise examples of changes since I apparently wasn't supposed to be running the bot without BRFA approval in the first place, but I suppose I could manually make some example edits to show what the bot would be capable of doing? My main goal originally was just to homogenize a lot of common simple stuff like the coal example, but then I got branched out and started thinking of wider applications, so my application is admittedly a bit open-ended.
As far as the code goes, I dislike open-sourcing anything I've written for personal use until it's been extensively polished because I keep a lot of debug stuff commented out and don't write my commented notes for a general audience, so it gets more than a bit sloppy/unprofessional and I prefer to only publish very clean code unless absolutely necessary. I guess I could strip the comments entirely and publish it more or less as-is though. I'll think about that.
I'll leave the tag up until someone from BRFA can drop by to discuss a trial. Garzfoth (talk) 03:35, 11 December 2017 (UTC)
I've cleaned up and posted the original code used for the Active => O change run: User:InfoboxBot/wikipedia_edit_pages_clean.py Garzfoth (talk) 03:48, 11 December 2017 (UTC)

@Garzfoth: This request has sat for a very long time. I would like to apologize for that.

Minor code review. This line:

	tpl = next(x for x in templates if x.startswith("{{Infobox power station") or x.startswith("{{infobox power station") or x.startswith("{{Infobox power plant") or x.startswith("{{infobox power plant") or x.startswith("{{Infobox wind farm") or x.startswith("{{infobox wind farm") or x.startswith("{{Infobox nuclear power station") or x.startswith("{{infobox nuclear power station"))

would look better as:

    tpl = next(x for x in templates if x.name.matches(["Infobox power station", "Infobox power plant", "Infobox wind farm", "Infobox nuclear power station"]))

Now, my only real concern here is that certain changes can seem uncontroversial on the surface but are actually not once you do them en masse. The "Active" to "O" thing is surely fine, but whether or not to wikilink "Coal" is something I could see as contentious. How do you determine what the convention is when the most common option is used by only 45% of articles (153/337, per your numbers)? Arguments could exist either way, and it might depend on the article (maybe).

Anyway, let's do a fairly loose trial to get a sense of the kinds of changes you would like to make and how they pan out. If possible, please do a variety of types of fixes, but if you only have a couple in mind right now, that's fine too. Approved for trial (100 edits). — Earwig talk 06:23, 12 December 2017 (UTC)

Thank you for your comments. The code suggestion is extremely helpful, I tested it and subsequently refactored all of my code (including components that have not been published such as the scraping stuff) to incorporate it.
I have thought extensively about the issue of balancing too-minor/controversial changes with real action for a while now. For wikilinking stuff like that I think it's no contest — a wikilink is almost always going to be justified for stuff like that (especially as the infobox is a separate entity and the MOS makes the provision that repeating links in infoboxes is fine if helpful for the readers). For capitalization issues, it's a messier situation, but I think the best approach is to focus on choosing the option that makes the most grammatical sense (something I've tried to clarify with limited research), fits best within the generalized context of an infobox, adheres to the MOS, is the most visually consistent & pleasing with other infobox elements, and corresponds with the established consensus (I can see how popular each option is while analyzing the DB for variables to work on, so that lets me measure the rough level of consensus for existing options). I'm actually really curious if anyone will object to the capitalization standardization I'm using — if it triggers an objection, I'll of course discuss the issue, and if the discussion results are to use non-capitalization for the standard (or whatever else), I can then use the bot to put the articles in line with the outcome of the discussion instead.
I started on the trial run. Here are changes done so far:
IPS parameter (name/key/category) | Original value | Modified value | # of edits
th_technology | steam | [[Steam turbine]] | 2
th_technology | Steam | [[Steam turbine]] | 17
th_technology | [[gas turbine]] | [[Gas turbine]] | 3
th_technology | [[Gas Turbine]] | [[Gas turbine]] | 3
country | United States | [[United States]] | 5[a]
country | England | [[England]] | 5[b]
ps_units_manu_model | Siemens | [[Siemens]] | 3
ps_units_manu_model | Vestas | [[Vestas]] | 2
status | Operating | O (expands to Operational) | 5[c]
status | operational | O (expands to Operational) | 17
status | Baseload | O (expands to Operational) | 6
status | Peak | O (expands to Operational) | 5
th_fuel_primary | Coal | [[Coal]] | 5[d]
th_fuel_primary | Coal-fired | [[Coal]] | 5[e]
th_fuel_primary | [[Natural Gas]] | [[Natural gas]] | 5[f]
th_fuel_primary | [[natural gas]] | [[Natural gas]] | 5[g]
th_fuel_primary | Natural gas | [[Natural gas]] | 5[h]
Total edits made during initial trial: 98
  1–8. ^ There were more instances to correct in each of these categories than the trial allowed (a: 257, b: 105, c: 38, d: 88, e: 72, f: 27, g: 24, h: 23), but due to the 100 edit limit on this trial only 5 representative edits were made per category; the remaining instances were excluded from the trial.
During the run only one edit was reverted (this one), with the reason being "editing tests". The editor in question subsequently thanked the bot's account for a different edit, and I'll be replying to their message on the bot's talk page to explain the matter and see what their views on the capitalization change really are (i.e. did they truly intend to revert or did they simply not notice that the edit actually changed something).
Here is the updated primary bot code, with various improvements made, functionality added, code cleaned up, and most code comments preserved (even the stupid ones): User:InfoboxBot/wikipedia_edit_pages_clean.py
Thanks again! Garzfoth (talk) 14:02, 15 December 2017 (UTC)
WP:OVERLINK applies. You should not be linking countries like the U.S. and England. — JJMC89(T·C) 19:40, 15 December 2017 (UTC)
That seems fair enough for the specific case of countries. Here's a question: if WP:OVERLINK unambiguously applies to the country field, then would it be justified to edit the infobox to remove all country wikilinks for violating WP:OVERLINK? This would mean for example that all instances of country = [[United States]] would be changed to country = United States, and so on and so forth for all the other countries. Garzfoth (talk) 14:25, 19 December 2017 (UTC)
  • Any update on progress for this request? — xaosflux Talk 19:37, 21 March 2018 (UTC)
I've been wondering that myself! Due to personal issues I haven't been able to spend very much time on Wikipedia over the past few months and never got around to re-flagging this page with the BAGAssistanceNeeded tag, but I have been periodically checking this page for updates, wondering why I never got a response. I think I'll have those personal issues finally mostly resolved within the next week or so and should be able to resume active editing again, at which point I'll re-flag this request with BAGAssistanceNeeded if nobody has responded by then. As far as I'm aware I've completely fulfilled the requirements for a post-trial status update and I don't see any remaining issues that should preclude moving onwards to either an extended trial or the final approval/denial stage. Is there a reason why this request has sat like this for so long? If you guys are actually waiting on something from me to proceed, please tell me what that is! Garzfoth (talk) 17:12, 24 March 2018 (UTC)
Looks like you never tagged it {{BotTrialComplete}}, so it may be in the wrong queue of watchers! — xaosflux Talk 03:12, 26 March 2018 (UTC)
Oh dear, I didn't even notice that part at all. What a stupid mistake to make on my part! Thanks for pointing that out! Hmm, in the future, maybe it'd be a good idea to directly/explicitly tell people when their bots get approved for a trial, in the same message as the trial approval message, that when the trial is complete they should tag the page with that particular template? Something along the lines of "[bot trial approval template here] — when this trial is complete, please tag this page with {{BotTrialComplete}}"? Just an idea... Anyways, tagging it now...
Trial complete.
Garzfoth (talk) 13:57, 26 March 2018 (UTC)
@Garzfoth: no big deal, as you can see at Wikipedia:Bots/Requests_for_approval - we are struggling with a big backlog on this page right now. — xaosflux Talk 14:01, 26 March 2018 (UTC)
Got it. I added exclusion compliance to the bot (and in the process of testing, I discovered and fixed a major bug in the example Python code on Template:Bots). Source code has been updated to reflect that change.
Just to be sure that this is prioritized appropriately in the backlog, I'm going to flag this page with {{BAGAssistanceNeeded}} once again, as I fear otherwise that this will end up getting overlooked given the current size of the backlog (which it is kinda buried in):
A user has requested the attention of a member of the Bot Approvals Group. Once assistance has been rendered, please deactivate this tag.
Garzfoth (talk) 21:43, 4 April 2018 (UTC)
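For context on the exclusion compliance added above: it means honouring the {{bots}}/{{nobots}} opt-out templates on a page before editing it. A rough, simplified sketch of such a check (this is not the Template:Bots example code and not this bot's actual implementation) might look like:

    # Simplified sketch of a {{bots}}/{{nobots}} exclusion check.
    # Real implementations handle more cases (e.g. |deny=all with exceptions, |optout=).
    import re

    def allow_bots(text, user):
        """Return False if the wikitext opts out of edits by `user`."""
        if re.search(r"\{\{\s*nobots\s*\}\}", text, re.IGNORECASE):
            return False
        deny = re.search(r"\{\{\s*bots\s*\|[^}]*?deny\s*=\s*([^}|]*)", text, re.IGNORECASE)
        if deny:
            denied = [u.strip().lower() for u in deny.group(1).split(",")]
            if "all" in denied or user.lower() in denied:
                return False
        allow = re.search(r"\{\{\s*bots\s*\|[^}]*?allow\s*=\s*([^}|]*)", text, re.IGNORECASE)
        if allow:
            allowed = [u.strip().lower() for u in allow.group(1).split(",")]
            return "all" in allowed or user.lower() in allowed
        return True

A call such as allow_bots(page.text, "InfoboxBot") would then gate each save.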

DeprecatedFixerBot 2

Operator: TheSandDoctor (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 16:40, Wednesday, March 14, 2018 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): Python

Source code available: https://github.com/TheSandDoctor/DivCol-ColumnsList-Fix

Function overview: Remove deprecated parameters from both Template:Columns-list and Template:Div col transclusions, as found in Category:Pages using Columns-list with deprecated parameters and Category:Pages using div col with deprecated parameters

Links to relevant discussions (where appropriate): Template talk:Div col#Proposal for standardized changes (by bot or AWB)

Edit period(s): Routine runs until categories are cleared, possible maintenance runs in future

Estimated number of pages affected: 57,000+

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): Yes

Function details: The bot would work through one category per run, over a series of subsequent runs.

As described in the linked discussion,

#1 {{div col|2}} → {{div col}}
#2 {{div col|cols=2}} → {{div col}}
#3 {{colbegin|2}} → {{div col}}
#4 {{colbegin|cols=2}} → {{div col}}
   (etc. for other redirects)
#4 {{colbegin|3}} (or cols=3) → {{div col|colwidth=22em}}
#4 {{colbegin|4}} (or cols=4) → {{div col|colwidth=18em}}
#4 {{colbegin|5}} (or cols=5) → {{div col|colwidth=15em}}
#4 {{colbegin|6}} (or cols=6) → {{div col|colwidth=13em}}
#4 {{colbegin|7+}} (or cols=7+) → {{div col|colwidth=10em}} (any number of columns 7 or higher)
#5 {{Columns-list|2| → {{Columns-list|colwidth=30em| (while we're here; also do for all redirects to this one)
#5 {{Columns-list|3| → {{Columns-list|colwidth=22em| (and etc. as below)
#6 {{div col||([0-9]em)}} → {{div col|colwidth=$1}}
#7 {{div col|([0-9]em)}} → {{div col|colwidth=$1}} (as currently coded, this is an error, but fixable)
#8 {{colend}} → {{div col end}} (replace all end template redirects with div col end)

If the bot finds a first unnamed parameter, it assesses its value. If the value is less than 2 (i.e. 1), it removes that parameter so the template falls back to its default. If the value is between 2 and 6, it converts it as shown above. If the value is 7 or greater, the bot sets the column width to 10em.

In offline tests (printing both input and output to individual local files for a random sample of pages), it worked as expected.
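For illustration, a minimal sketch of the conversion logic just described, assuming mwparserfromhell. This is not the code in the linked GitHub repository; it only covers the first unnamed / |cols= parameter of {{div col}} and {{colbegin}}, and omits the {{Columns-list}} and regex cases (#5 to #8):

    # Hypothetical sketch of the cols-to-colwidth mapping; not the bot's actual code.
    import mwparserfromhell

    COLWIDTH = {3: "22em", 4: "18em", 5: "15em", 6: "13em"}

    def fix_div_col(text):
        code = mwparserfromhell.parse(text)
        for tpl in code.filter_templates():
            # In practice the bot would also match the other {{div col}} redirects.
            if not tpl.name.matches(["Div col", "Colbegin"]):
                continue
            if tpl.has("1"):
                param = "1"
            elif tpl.has("cols"):
                param = "cols"
            else:
                continue
            value = tpl.get(param).value.strip()
            if not value.isdigit():
                continue
            cols = int(value)
            tpl.remove(param)                 # drop the deprecated parameter
            if 3 <= cols <= 6:
                tpl.add("colwidth", COLWIDTH[cols])
            elif cols >= 7:
                tpl.add("colwidth", "10em")
            # a value of 1 or 2 falls back to the template default (no colwidth)
            tpl.name = "Div col"              # normalise redirects such as {{colbegin}}
        return str(code)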

Discussion

  • If you would like a dry run with outputted files put in the GitHub repository, I would certainly be open to doing that. --TheSandDoctor (talk) 16:48, 14 March 2018 (UTC)

@TheSandDoctor: Per WP:BOTACC "The account's name should identify the bot function (e.g. <Task>Bot), or the operator's main account (e.g. <Username>Bot)." I would find it extremely confusing if a Twitter-related bot started to do div-related changes. We need either a new bot account, or a rename of the existing bot account (such as TheSandDoctor's Bot / TheSandDoctorBot or similar). This also applies to Wikipedia:Bots/Requests for approval/TweetCiteBot 2. Headbomb {t · c · p · b} 21:33, 17 March 2018 (UTC)

Moved task from TweetCiteBot to DeprecatedFixerBot. --TheSandDoctor (talk) 21:59, 17 March 2018 (UTC)
{{BAGAssistanceNeeded}} --TheSandDoctor Talk 16:41, 21 March 2018 (UTC)

I'm a bit concerned this could fall under WP:CONTEXTBOT. I can conceive of cases where exactly two or exactly three columns is what is meant, rather than a more lax "well on my screen, 3 columns looks nicer so let's go with that". You might argue that those should be done with different templates, but how would the bot deal with something like say

Team A
  • Alice
  • Bob
  • Charles
Team B
  • Denver
  • Emily
  • Fiona
Team C
  • Gilligan
  • Hector
  • Ines

Headbomb {t · c · p · b} 17:08, 21 March 2018 (UTC)

@Headbomb: It would just remove "cols" and use the default, nothing would visibly change (aside from a slight shift left on my screen at least, but barely noticeable).
Team A
  • Alice
  • Bob
  • Charles
Team B
  • Denver
  • Emily
  • Fiona
Team C
  • Gilligan
  • Hector
  • Ines
Like that. --TheSandDoctor Talk 17:23, 21 March 2018 (UTC)
What about the 3 column output then (updated my example)? Also this seems to contradict the listed logic "{{div col|cols=2}} → {{div col}} {{div col|colwidth=30em}}" Headbomb {t · c · p · b} 17:53, 21 March 2018 (UTC)
Headbomb, please read the linked discussion above. Specifying a number of columns is deprecated behavior, for the same reason that the behavior was deprecated (and removed) in {{reflist}}. This is just a continuation of that work. – Jonesey95 (talk) 18:16, 21 March 2018 (UTC)
@Jonesey95: I agree it's deprecated; what I want to know is how the bot deals with cases where this is intended behaviour: does it screw up that intended behaviour, or how does it avoid doing so? The discussion offers no insight on this. Headbomb {t · c · p · b} 19:12, 21 March 2018 (UTC)
@Headbomb: It would look like this
Team A
  • Alice
  • Bob
  • Charles
Team B
  • Denver
  • Emily
  • Fiona
Team C
  • Gilligan
  • Hector
  • Ines
I was hesitant in responding until I figured out the issue. For some reason, when I previously set colwidth to 22em, it just gave a vertical list, yet now it posts normally. Clearly I did something wrong. Either way, works now? (cc @Jonesey95:) --TheSandDoctor Talk 18:20, 21 March 2018 (UTC)
Also, sorry about the contradictions, they have now been resolved. --TheSandDoctor Talk 18:21, 21 March 2018 (UTC)

──────────────────────────────────────────────────────────────────────────────────────────────────── So what's the plan for avoiding the issue of screwing up intentionally specific column numbers? Or dealing with screwups when they happen? Headbomb {t · c · p · b} 19:09, 21 March 2018 (UTC)

Can you please clarify what a "screwup" would look like? If you mean converting "cols=2" to "colwidth=30em", that is not a screwup, that is the consensus of the editors at the linked discussion. Specifying a fixed number of columns using the {{div col}} template is deprecated behavior (see the template's documentation) that is being removed. Since there are 40,000 articles that use deprecated parameters in the template, a bot is the least disruptive way to make that change. – Jonesey95 (talk) 20:20, 21 March 2018 (UTC)
I've already clarified above with the 3 column example above. The 3 columns in that case would be intentional, and it is not something that converted to 22em because it screws up the intended layout (zoom in/out of the page to see that it's messed up). So the question is how does the bot plan on either a) avoiding those sort of edits or b) providing a way for editors to restore the fixed-column format when it was intended / is desired, without the bot edit-warring with them over and over. Headbomb {t · c · p · b} 20:39, 21 March 2018 (UTC)
Thank you. The example was not clear to me, because the "after the bot" example used div col without any parameters, which is not how the bot will work. I have added colwidth=22em to the "after" example, and the columns look the same to me on my normal-sized screen.
I see what you mean about zooming in and out, but that is the best compromise. The cols=3 format fails on mobile (it displays only one column), so it is already broken. It has been deprecated for four years for this and other reasons, so editors have had plenty of time to make other choices. Changing to use colwidth makes div col work better on a wider variety of devices. If people want fixed columns, they need to use one of the other templates listed in the See also section of the documentation. – Jonesey95 (talk) 20:52, 21 March 2018 (UTC)
Then the bot should point to that documentation, either with instructions on the bot page on what to do when the bot touches something people don't want touched, similar to this, or with an edit summary link to the documentation which points to the templates people can use to create fixed columns. Or both. Headbomb {t · c · p · b} 21:02, 21 March 2018 (UTC)
Good idea. I have updated the template documentation to clarify how to get a fixed number of columns. The bot's edit summary should contain a link to the template. – Jonesey95 (talk) 21:15, 21 March 2018 (UTC)
Approved for trial (10 edits for each "type" of edit listed above). That works for me. The edit summary should specifically point to Template:Div col#Usage of "cols" parameter as a resource link for editors. Headbomb {t · c · p · b} 21:21, 21 March 2018 (UTC)
@Headbomb: Could you please clarify "type"? Pages with 1-10 as the cols parameter or? --TheSandDoctor Talk 00:30, 22 March 2018 (UTC)
You mean 10 times the number of "types" specified in the original list in nomination, correct? --TheSandDoctor Talk 01:04, 22 March 2018 (UTC)

Done. I've numbered the 'types' in the original description. Anything with the same # is the same 'type'. Headbomb {t · c · p · b} 01:06, 22 March 2018 (UTC)

Thanks Headbomb. Would it be okay if I (or you or anyone else) set up these situations in a sandbox and just had the bot run over that? It would be simpler/easier and quicker to do. --TheSandDoctor Talk 02:08, 22 March 2018 (UTC)
Sure. But after you've done sandbox testing, do 50 random edits in mainspace. What's in the wild is often not as we expect it to be. Headbomb {t · c · p · b} 02:26, 22 March 2018 (UTC)
@Headbomb: Sandbox test done. I don't think I missed any. If you are happy with that, then I will do the 50 random articles. (If you aren't, bot can easily be tweaked and re-run on sandbox) --TheSandDoctor Talk 03:24, 22 March 2018 (UTC)
@TheSandDoctor: sure, go to mainspace. 50 edits, as stated above. Headbomb {t · c · p · b} 03:25, 22 March 2018 (UTC)
As a note, I'd use "or message" rather than "and message" in the summary, otherwise it makes it seems like people need to do both. Headbomb {t · c · p · b} 03:28, 22 March 2018 (UTC)
Trial complete. @Headbomb: Didn't see that until after the run, but rest assured I just changed it in the code. All 50 edits done, no issues. --TheSandDoctor Talk 03:41, 22 March 2018 (UTC)
Nice work. I inspected all of the edits and saw only one problem, templates being changed from their actual names to lower-case names, in some cases breaking the templates. I see that you reverted that change and it did not happen again. – Jonesey95 (talk) 04:08, 22 March 2018 (UTC)
@Jonesey95: Ah, yes. That problem. TweetCiteBot's rewrite and the first task for this bot had that issue (as they all share some code in common). It was a line left over from testing in dry runs (just outputting text to in and out.txt locally) that was A: redundant and B: didn't work as intended. Because I have had issues like that in the past, I tend to run any live editing task with 1 or 2 edits tops at the beginning to check for errors like that (that way they are easily reverted and corrected). I reran the bot on that page shortly after (1 minute later) to confirm that the issue had been sorted (which it had). --TheSandDoctor Talk 04:18, 22 March 2018 (UTC)

@TheSandDoctor: got a link to trial diffs? Headbomb {t · c · p · b} 16:11, 23 March 2018 (UTC)

@Headbomb: They kind of were buried, weren't they. Starting (and ending) here (about halfway down, or just Ctrl+F for "div col" to find them). Some randomly selected diffs from that page: 1, 2, 3, 4. --TheSandDoctor Talk 16:31, 23 March 2018 (UTC)
@Frietjes: That is unrelated to this bot request, but I accidentally substituted Infobox film as well. Thank you for reverting that, as I had not noticed. I have since gone back and correctly substituted just Infobox Album (which cleans up its params and properly removes deprecated params per Category:Music infoboxes with deprecated parameters, Template:Infobox album#Code, and Wikipedia:Substitution trick). --TheSandDoctor Talk 15:45, 24 March 2018 (UTC)

──────────────────────────────────────────────────────────────────────────────────────────────────── A user has requested the attention of a member of the Bot Approvals Group. Once assistance has been rendered, please deactivate this tag. --TheSandDoctor Talk 23:15, 29 March 2018 (UTC)

Gabrielchihonglee-Bot 3

Operator: Gabrielchihonglee (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 14:40, Monday, January 15, 2018 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): Python (pywikibot)

Source code available: will be given after test run

Function overview: Changing the parameter "symbol" into "symbol_type_article" in the Infobox former country template.

Links to relevant discussions (where appropriate): Wikipedia:Bot_requests#Symbol_parameter_in_Infobox_former_country

Edit period(s): One time run

Estimated number of pages affected: Less than 3000 pages

Namespace(s): Main

Exclusion compliant (Yes/No): yes

Function details: Flow of the bot:

  1. Get all pages with template Infobox former country
  2. Change parameter name from "symbol" into "symbol_type_article"
  3. Save
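A rough sketch of this flow, assuming pywikibot and mwparserfromhell (the operator's actual code is to be published after the test run, so the details below are illustrative only). Renaming the Parameter object in place keeps each parameter in its original position rather than moving it to the end of the template:

    # Hypothetical sketch only -- not the operator's code.
    import mwparserfromhell
    import pywikibot

    site = pywikibot.Site("en", "wikipedia")
    template_page = pywikibot.Page(site, "Template:Infobox former country")

    # 1. Get all mainspace pages transcluding the infobox.
    for page in template_page.getReferences(only_template_inclusion=True, namespaces=[0]):
        code = mwparserfromhell.parse(page.text)
        changed = False
        for tpl in code.filter_templates():
            if not tpl.name.matches("Infobox former country"):
                continue
            # 2. Rename the parameter in place.
            for param in tpl.params:
                if str(param.name).strip() == "symbol":
                    param.name = "symbol_type_article"
                    changed = True
        # 3. Save only if something actually changed.
        if changed:
            page.text = str(code)
            page.save(summary="Renaming |symbol= to |symbol_type_article= per BRFA")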

Discussion

  •  Note: This bot appears to have edited since this BRFA was filed. Bots may not edit outside their own or their operator's userspace unless approved or approved for trial. AnomieBOT 01:02, 16 January 2018 (UTC)
Made 1 manual edit to test the theory works. And yes the theory works. :P --Gabrielchihonglee (talk) 01:16, 16 January 2018 (UTC)
  • Approved for trial (50 edits). In the future, do not make any edits from the bot account until approved for a trial. You can manually test from your main account, if necessary. ~ Rob13Talk 17:37, 24 January 2018 (UTC)
  • @BU Rob13: Done. The first 10 or so edits were trial and error; I found some bugs and improved the code. It has been working fine for the last 35 or so edits, thanks. :) --Gabrielchihonglee (talk) 15:29, 25 January 2018 (UTC)
forgot to add this: Trial complete. --Gabrielchihonglee (talk) 03:58, 2 February 2018 (UTC)
I haven't reviewed the trial yet, but I'm extremely concerned that you continue to make test edits outside of trials with your bot account after being explicitly told not to. This is a bot policy violation. These edits should not have been made from a bot account at all. [20] [21] [22] [23] [24]. This edit represents the second time you've trialed a task from your bot account before being approved to do so. [25] Pinging Xaosflux for his thoughts, but my initial reaction is to decline based on repeated bot policy violations after receiving a warning. ~ Rob13Talk 17:34, 7 February 2018 (UTC)
While not required by overall policy, I prefer that bots with multiple tasks etc include information about which task has approved each edit in its edit summary. I do this with my bot to make sure everything is clear to anyone reviewing its edits (e.g. Special:Contributions/Fluxbot). — xaosflux Talk 18:33, 7 February 2018 (UTC)
@BU Rob13: Would this not be a case of WP:BOTUSERSPACE? Nihlus 18:36, 7 February 2018 (UTC)
I don't read any of those edits as testing a bot process, personally. Perhaps the user talk ones are defensible. The userpage edit noting the bot is in trial definitely was not a test edit. The trial edit in project space is also not defensible. ~ Rob13Talk 18:45, 7 February 2018 (UTC)
@BU Rob13: I made the other edits because another of my bot's requests was approved for trial. I don't see any problems with it. Thanks. --Gabrielchihonglee (talk) 16:39, 9 February 2018 (UTC)
Your first edit of that trial came before the trial was approved. Further, that doesn’t explain the edit to the bot’s user page. ~ Rob13Talk 16:47, 9 February 2018 (UTC)
@BU Rob13: Oh, I finally understand why you said that. I'm very sorry for not following the policies strictly. I do understand that I need to wait for approval before making any edits; an admin friend actually came to me and reminded me of this a few days ago. I realize the problem and I'm really sorry for my mistake. With all due respect, may I know what the next step for this request is? -- Gabrielchihonglee (talk) 23:45, 9 February 2018 (UTC)
  • @Gabrielchihonglee: I had hoped for other BAG members to respond regarding their thoughts on whether to decline for the bot policy violations, but that never happened, and it's been long enough that it's best just to move on. I noticed two issues with the trial. Generally, you moved a lot of parameters to the end of the template. You also introduced extra white-space. Ideally, they should remain in their original place and not introduce new lines. See [26] for examples of both issues. ~ Rob13Talk 15:32, 9 April 2018 (UTC)
@BU Rob13: Will fix soon, thanks! :) --Gabrielchihonglee (talk) 11:17, 10 April 2018 (UTC)
When ready, let us know so we can schedule a trial with your corrected settings, please. — xaosflux Talk 18:56, 21 April 2018 (UTC)
no problem --Gabrielchihonglee (talk) 23:19, 22 April 2018 (UTC)


Approved requests

Bots that have been approved for operations after a successful BRFA will be listed here for informational purposes. No other approval action is required for these bots. Recently approved requests can be found here (edit), while old requests can be found in the archives.


Denied requests

Bots that have been denied for operations will be listed here for informational purposes for at least 7 days before being archived. No other action is required for these bots. Older requests can be found in the Archive.

Expired/withdrawn requests

These requests have either expired, as information required by the operator was not provided, or been withdrawn. These tasks are not authorized to run, but such lack of authorization does not necessarily follow from a finding as to merit. A bot that, having been approved for testing, was not tested by an editor, or one for which the results of testing were not posted, for example, would appear here. Bot requests should not be placed here if there is an active discussion ongoing above. Operators whose requests have expired may reactivate their requests at any time. The following list shows recent requests (if any) that have expired, listed here for informational purposes for at least 7 days before being archived. Older requests can be found in the respective archives: Expired, Withdrawn.

