User talk:GreenC
  (Redirected from User talk:Green Cardamom)
Talk Page 2021

Happy New Year!
Thanks for your contributions to Wikipedia, and a Happy New Year to you and yours! North America1000 05:44, 3 January 2021 (UTC)
– Send New Year cheer by adding {{subst:Happy New Year}} to user talk pages.
Iron Law of Oligarchy
You mention this on your userpage. I got indeffed and IP-banned from Conservapedia for a single well-sourced post noting that molality and molarity are respectively absolute and relative measures. If I hadn't bragged about it on RationalWiki, it'd probably still be there... Narky Blert (talk) 18:24, 3 January 2021 (UTC)
I have to say
"As a proof, they told me that the Keep vote of [GreenC which you see in the deletion discussion is done by them." If that's true, it's actually pretty cunning. Gråbergs Gråa Sång (talk) 20:47, 25 January 2021 (UTC)
Eh, my guess is a high-end fraud lawyer is too cunning to fall for it and then post tears of regret publicly on Wikipedia. The story doesn't add up. -- GreenC 20:56, 25 January 2021 (UTC)
That feels probable, sure. Gråbergs Gråa Sång (talk) 20:59, 25 January 2021 (UTC)
The Wikipedia Motivation Barnstar
You are the true motivator :) Sliekid (talk) 07:16, 27 January 2021 (UTC)
Why don't you put the 2020 import/export figures instead of leaving the 2017 ones? Or is it because the deficit is much higher? — Preceding unsigned comment added by (talk) 19:25, 28 January 2021 (UTC)
Your Opinion Requested at Michael Shellenberger
Hi GreenC,
You've previously weighed-in on the issues of publicity at Michael Shellenberger. I recently tried to clean said page up and add academic literature to the page, and it seems the page's subject has recently taken umbrage with said revisions. If you have the time, do you mind taking a look at the issues that recently occurred at Talk:Michael Shellenberger? --Hobomok (talk) 20:41, 28 January 2021 (UTC)
Edits reverted
Hey GreenC, thanks so much for reviewing Saket Modi. I noticed you reverted all of my edits although they were written in a neutral tone and were supported by third-party, reliable sources. I read WP:LEAD that you highlighted in your comment, and it says "a lead section should contain no more than four well-composed paragraphs and be carefully sourced as appropriate" and "The lead section should briefly summarize the most important points covered in an article." I am not trying to stuff the lead section but rather to update it and add an award. It was hardly a one-line addition, and I provided citations too to back it up. In addition to this, I had made some small changes to the rest of the article with citations, and they were also reverted. You seem to be very well versed in policies and guidelines. I would really appreciate your guidance and help with this. Thanks. 2405:204:C:AD2D:18B4:8F41:A22B:98E2 (talk) 16:05, 1 February 2021 (UTC)
Hi Sir, did you have a chance to look at it?
Scripts++ Newsletter – Issue 20
News and updates associated with user scripts from the past month (January 2021).
Hello everyone and welcome to the 20th issue of the Wikipedia Scripts++ Newsletter:
Scripts Submit your new/improved script here
Ajbura's anrfc-lister assists with listing discussions at WP:ANRFC to request that they receive a formal close
Bradv's endlesscontribs provides for endless scrolling of contributions pages
Evad37's livenotifications displays notification alerts and messages in a little popup box, (almost) live as they happen
Terasail's Edit Request Tool allows users to reply to, close or remove protected edit requests
ToBeFree's clear-watchlist allows emptying your watchlist even if the watchlist is too full to be edited or cleared by conventional methods
Dentonius's GlobalRecentChanges allows you to monitor recent changes across various wikis.
Yahya's SNA (Start New Article) allows you to start a new article or draft from the navigation bar.
  • CatChangeLinker bluelinks the "(diff | hist)" part of category additions/removals on your watchlist.
  • CatChangesViewer lists recent category additions/removals on a category page.
  • Consecudiff adds links to diffs of consecutive edits by the same user on watchlist, history, etc.
  • DiffFontSwitcher allows you to toggle between fonts for diff by clicking a line number.
  • MoveHistory lists the past moves a page has gone through.
Novem Linguae:
User:Evad37/TimestampDiffs – fixed compatibility issues with various scripts/gadgets, including Wikipedia:Comments in Local Time. For discussion archives, now looks for revisions on the base page.
User:Enterprisey/undo-last-edit.js - no longer tries to load on special pages.
User:Ahecht/Scripts/pageswap - version 1.4 fixes reading destination from form field if destination is not in article namespace, and fixes self redirects.
Wikipedia:XFDcloser - version 4 brings a new user interface for dialogs, some preferences for customising XFDcloser, major behind-the-scenes coding changes, and resolves various issues raised on the talkpage. Also, since version 3.16.6 non-admin soft delete closures have been allowed at TfD.

Open tasks
As a reminder, the legacy JavaScript globals (like accessing wgPageName without first assigning it a value) are deprecated. If your user scripts make use of the globals, please update them to use mw.config instead. Some global interface editors or local interface administrators may edit your user script to make these changes if you don't. See phab:T72470 for more.
  • For people interested in creating user scripts or gadgets using TypeScript, a types-mediawiki package (GitHub, NPM) is now available that provides type definitions for the MediaWiki JS interface and the API.
  • A GitHub organization has been created for hosting codebases of gadgets. Users who maintain gadgets using GitHub may choose to move their repos to this organization, to ensure continued maintenance by others even if the original maintainer becomes inactive.
Pending requests
As always, if anyone else would like to contribute, including nominating a featured script, help is appreciated. Stay safe, and happy new year! --DannyS712 (talk) 01:17, 3 February 2021 (UTC)
About the newsletter · Subscription options · Discuss this issue
Signature issue on your comment at AfD
It seems like there was a problem with your signature for your comment (Special:Diff/1004919157/1004929875) on the Jack Schlossberg AfD. It was a good comment and you might want to correct this issue. Cheers! - tucoxn\talk 13:42, 5 February 2021 (UTC)
crowd governance
the external link on your user page no longer works :( I don't want to edit your page, but I was able to enjoy the story at this address: https://web.archive.org/web/20200127053758/http://misrc.umn.edu/wise/2014_Papers/110.pdf Have a good one. Violarulez (talk) 20:39, 11 February 2021 (UTC)
Thank you, added the archive link. Interesting and non-intuitive story. -- GreenC 20:43, 11 February 2021 (UTC)
Speedy deletion nomination of Category:Esperanto literary awards
A tag has been placed on Category:Esperanto literary awards requesting that it be speedily deleted from Wikipedia. This has been done under section C1 of the criteria for speedy deletion, because the category has been empty for seven days or more and is not a disambiguation category, a category redirect, a featured topics category, under discussion at Categories for discussion, or a project category that by its nature may become empty on occasion.
If you think this page should not be deleted for this reason, you may contest the nomination by visiting the page and clicking the button labelled "Contest this speedy deletion". This will give you the opportunity to explain why you believe the page should not be deleted. However, be aware that once a page is tagged for speedy deletion, it may be deleted without delay. Please do not remove the speedy deletion tag from the page yourself, but do not hesitate to add information in line with Wikipedia's policies and guidelines. LizRead! Talk! 15:58, 13 February 2021 (UTC)
@Liz: I didn't create empty categories 11 years ago, I guess whatever was there has been deleted. -- GreenC 16:09, 13 February 2021 (UTC)
TravelMate URL's
Unfortunately I can not find the exact reference for this, but back in June 2020 you stopped bot InternetArchiveBot from archiving links for [1] (or possibly a shorter name), as the archived versions do not work - the only reference I can now find is Wikipedia:Australian_Wikipedians'_notice_board/Archive_57#premierpostal.com. I have now encountered a similar problem with Red Cliffs, Victoria, where a link to http://www.travelmate.com.au/MapMaker/MapMaker.asp, which is dead, is being archived, but the archived versions do not work as the javascript does not operate. The bot has now for the second time archived this link, on both occasions removing the 'dead link' tag. Can you please again help in stopping this bot archiving links for this URL. Fleet Lists (talk) 02:52, 20 February 2021 (UTC)
Fleet Lists, I believe the correct action is to 'whitelist' the URL, which means the bot will always consider it 'alive' and will not try to add an archive. I just did this, which should stop IABot. It could still be a problem with any other bot trying to save dead links in the future due to the {{dead link}} tag. If the link is dead and there is no viable archive, it might be better to convert these to a citation template without a |url=. -- GreenC 03:32, 20 February 2021 (UTC)
Thank you for your reply and update of the Red Cliffs article. However the bot has now revisited and removed the "dead link" tag. So we are back where we started. How can the URL be "whitelisted"? Fleet Lists (talk) 22:00, 23 February 2021 (UTC)
Now I'm not sure what is happening. For the moment, I added {{cbignore}}, which tells the bot to stay off the reference. This is fine, except when there are dozens or 100s of citations, as in this case, as each requires the cbignore. I'm going to ask the developer why the whitelist is not working. -- GreenC 22:45, 23 February 2021 (UTC)
Ah, now I figured it out: at iabot.org set the URL status to blacklist (not whitelist) and also delete the archive URL from the record. This action can only be done by an administrator. Should be set now. -- GreenC 22:51, 23 February 2021 (UTC)
DYK for George Dinning
On 21 February 2021, Did you know was updated with a fact from the article George Dinning, which you recently created, substantially expanded, or brought to good article status. The fact was ... that in 1897, former slave George Dinning was the first black man to successfully sue a mob of the Ku Klux Klan? The nomination discussion and review may be seen at Template:Did you know nominations/George Dinning. You are welcome to check how many pageviews the nominated article or articles got while on the front page (here's how, George Dinning), and if they received a combined total of at least 416.7 views per hour (ie, 5,000 views in 12 hours or 10,000 in 24), the hook may be added to the statistics page. Finally, if you know of an interesting fact from another recently created article, then please feel free to suggest it on the Did you know talk page.
 — Amakuru (talk) 00:02, 21 February 2021 (UTC)
Disambiguation link notification for February 22
An automated process has detected that when you recently edited Brian Nelson (literature professor), you added a link pointing to the disambiguation page Swann in Love.
(Opt-out instructions.) --DPL bot (talk) 06:14, 22 February 2021 (UTC)
there's a mess...
... in this edit.
Trappist the monk (talk) 23:02, 2 March 2021 (UTC)
Bug that caused this fixed. -- GreenC 16:36, 11 April 2021 (UTC)
Removing archived urls
Hi! I'm sure you're doing great work, but not all of it seems to be going well. I've already posted at User talk:GreenC bot to ask why your bot removed an archived link from Louise Blouin. Why did you then again remove this archived url with this edit? Why should that url not be archived in case it ever ceases to be accessible in the future? Are you aware that, because of the General Data Protection Regulation, many North American websites block access for users from Europe? And that archive.org in many cases provides a way of restoring that access? Of course, if we have a policy that links should not be archived unless unavoidably necessary, do please point me to it. Otherwise, can you unconditionally guarantee that neither you nor your bot will again remove a working archived link from Wikipedia? And that you will, as a matter of priority, identify and repair any instance where either you or the bot has done so in the past? Thanks, Justlettersandnumbers (talk) 22:23, 13 March 2021 (UTC)
We don't use archives with the intention of bypassing policy blocks - be it a paywall or government regulation - that is not what archives are meant for, and there is no community consensus for that. There is no problem adding archive URLs as a precaution against link rot, but in this instance it was added directly into the URL with no citation template or {{webarchive}}, thus in effect making the live URL inaccessible - literally deleting it. Now, the bot in this case was doing a URL move of observer.com because a user requested it - changing a dead URL to a live URL (there was a change in schemes at observer.com). During URL moves it does preserve the archive, but only if there is a citation template or {{webarchive}}. I probably could add a feature to add a new {{webarchive}} when it's a square URL with an archive, in order to preserve the archive. -- GreenC 22:48, 13 March 2021 (UTC)
The archived link leads directly to the actual source cited when the content was written (see WP:Text-source integrity). That content may have been changed or completely removed from more recent versions of the external page. There is no obligation that I'm aware of to cite a current link to a page if we already have an archived link; nor is there any obligation to use citation templates or webarchive templates (WP:CITEVAR). Anyway, would you kindly either point me to community consensus that a working archived link may be removed without discussion or unconditionally guarantee that neither you nor your bot will again remove a working archived link from Wikipedia, and that you will, as a matter of priority, identify and repair any instance where either you or the bot has done so in the past? Thank you, Justlettersandnumbers (talk) 11:32, 14 March 2021 (UTC)
I already added the feature. I'll take a look at re-adding old ones. -- GreenC 13:53, 14 March 2021 (UTC)
Incorrect IABot's edit summary in Russian
The current summary "Добавьте № книги для Википедия:Проверяемость" makes no sense in Russian. A correct summary could be "Добавление ссылок на электронные версии книг" or "Добавление ссылок на электронные версии № (plural|книги|книг)". MBH (talk) 14:10, 24 March 2021 (UTC)
@MBH: I don't know which is better so I did the first one. Thank you very much. -- GreenC 14:31, 24 March 2021 (UTC)
Also, I advise you not to use machine translation for translating bot messages into any language you don't know. Maybe machine translation between big Romance and Germanic languages is not very bad, but machine translation from English to Russian is always terrible due to the big difference in the languages' structure. MBH (talk) 14:42, 24 March 2021 (UTC)
Nomination for deletion
An article you created or have significantly contributed to has been nominated for deletion. The article is being discussed at the deletion discussion, located here. North America1000 11:41, 1 April 2021 (UTC)
Hi GreenC! I'm enjoying using the Backlinks functionality - it's been about a year now. I didn't receive any emails today - did your process stop for April Fools' Day?  :-) Thanks! GoingBatty (talk) 13:48, 1 April 2021 (UTC)
It's not that clever :) I checked the logs and it appears to have run and sent emails, the data looks normal. I just sent you a test email from the server can you verify it came through? -- GreenC 15:09, 1 April 2021 (UTC)
I did not receive the test email, and have received emails from other senders. @Certes: Did you receive the Backlinks emails today? GoingBatty (talk) 16:03, 1 April 2021 (UTC)
Hmm, strange. Certes is using a new system that posts results online instead of email. Do you want to use that instead? For example:
Config page: https://en.wikipedia.org/w/index.php?title=User:Certes/Backlinks
Data page: User:Certes/Backlinks/Report
Otherwise I can try to debug why emails are not coming through. -- GreenC 16:08, 1 April 2021 (UTC)
Yes, I'm interested in having the results posted online instead. I've created User:GoingBatty/Backlinks/Report​. For User:GoingBatty/stopbutton​, when stopped, does this mean that results are queued on your side, and then all posted once we set Action=RUN again? If so, I'm interested in using that on the days when I'm away from my computer. Thanks! GoingBatty (talk) 16:38, 1 April 2021 (UTC)
Just ran it, and it worked. I forgot to adjust the filters you wanted to keep out Template, Project and some others; those will be in effect next run. The stop button is a hard stop - the program does not cache results. Useful for extended periods of being disabled. For random days, I recommend viewing the page history, which serves as a cache of prior runs. -- GreenC 19:13, 1 April 2021 (UTC)
There were quite a few links to be fixed in the Template, Project and other spaces, so feel free to keep those coming. Thanks! GoingBatty (talk) 00:53, 2 April 2021 (UTC)
You now have everything except these:
(^Talk:|^Wikipedia:|^Wikipedia talk:|^Template talk:|^Portal talk:|^User:|^User talk:|^File talk:|^MediaWiki:|^MediaWiki talk:|^Help:|^Help talk:|^Category talk:|^Book:|^Book talk:|^Draft:|^Draft talk:|^TimedText:|^TimedText talk:|^Module talk:)
-- GreenC 01:32, 2 April 2021 (UTC)
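For what it's worth, that exclusion list amounts to a title filter applied per page name. A Python sketch (hypothetical illustration, not the bot's actual code) of how such a filter could work:

```python
import re

# The exclusion pattern quoted above, reformatted with a non-capturing group.
# A title is kept (reported) only when it does not start with one of these
# namespace prefixes.
EXCLUDE = re.compile(
    r"^(?:Talk|Wikipedia|Wikipedia talk|Template talk|Portal talk|User|"
    r"User talk|File talk|MediaWiki|MediaWiki talk|Help|Help talk|"
    r"Category talk|Book|Book talk|Draft|Draft talk|TimedText|"
    r"TimedText talk|Module talk):"
)

def keep(title: str) -> bool:
    """True when a backlink title passes the filter and is reported."""
    return not EXCLUDE.match(title)
```

Note that "Template:" and "Portal:" are not in the list, so (per the message above) those namespaces keep coming through.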
My Backlinks appeared on the data page as usual at 10:47 UTC today. It has failed to appear a couple of times over the last few months, but worked fine today. I asked to stop receiving Backlinks by e-mail, as my long list produced lots of e-mails. If I'm away for a few days I'll just catch up using the page history. Certes (talk) 23:47, 1 April 2021 (UTC)
IABot bug - "blocked: You have been blocked from editing." despite not being blocked
Hello! I think phab:T274050 is back to bug us again. I'm getting a "blocked: You have been blocked from editing." error when trying to analyse & edit pages despite not being blocked. I can't seem to make the tool report on the exact API message it's getting (e.g. to see if an autoblock of a Toolforge IP is to blame), could you have a look? Thanks! ƒirefly ( t · c ) 15:36, 3 April 2021 (UTC)
Pages using duplicate arguments in template calls
is it possible to remove User:GreenC/test from Category:Pages using duplicate arguments in template calls (easier to see the actual problems when there aren't user pages in there)? thank you. Frietjes (talk) 16:22, 11 April 2021 (UTC)
Done. -- GreenC 16:33, 11 April 2021 (UTC)
Bot functionality request
Hi GreenC, nice to meet you. I found you while trawling through the bot status report (User:MajavahBot/Bot status report). I was wondering if I could interest you in a relatively simple bot task? That task is: periodically go through the entries in this category: Category:Peer review requests not opened.
For each peer review talk page there will be a template like {{Peer review|archive=X}}. There should be a corresponding peer review page called Wikipedia:Peer review/PAGENAME/archiveX​, but about once a week someone starts the process but doesn't actually create the page, so the template just hangs there. It would be very useful for a bot to remove the template if the peer review wasn't started for, like, a week after the template was placed, as that probably means no review page will be created.
I've had some problems with single functionality bots before so I thought I might ask you because your bot seems unlikely to randomly become inactive :P. Crossing my fingers, Tom (LT) (talk) 10:28, 12 April 2021 (UTC)
Hi Tom (LT) - I can help with this, though it would be a standalone bot, running on Toolforge from cron, ie. servers maintained by Wikimedia in their datacenter, with code accessible to anyone with a Toolforge account. I think once a day it could retrieve the list of page names in the tracking category, along with today's date, and add them to a text file in two columns (page name|added (ie. today's) date). If the page name is already in the text file, don't add it again, but check if it has been more than 7 days since the added date. If so, verify there is a Peer review archive page, and if not then remove the Peer review template and remove the entry from the text file. Likewise, if the page name is in the text file but not in the tracking category, remove the page name from the file. Sound good? -- GreenC 02:19, 14 April 2021 (UTC)
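The daily routine described above could be sketched roughly like this (a hypothetical outline, not the bot's actual code; `page_exists` and `remove_template` stand in for real MediaWiki API calls):

```python
import datetime

def run_once(tracked, today, in_category, page_exists, remove_template):
    """One daily pass over the tracking category.

    `tracked` maps page name -> date first seen (the two-column text file);
    `in_category` is the current set of pages in the tracking category.
    """
    # Add newly seen pages with today's date; don't re-add known ones.
    for name in in_category:
        tracked.setdefault(name, today)
    for name in list(tracked):
        # Page left the category: drop it from the file.
        if name not in in_category:
            del tracked[name]
            continue
        # More than 7 days with no review archive page: pull the template.
        if (today - tracked[name]).days > 7 and not page_exists(name):
            remove_template(name)
            del tracked[name]
    return tracked
```

Each run, the updated `tracked` mapping would be written back to the text file.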
That would be wonderful. It is just one of those small thankless tasks that a bot could do, so I'm very appreciative of this. There are a couple of similar tasks lying around; would it be possible to pester you in the future if something similar arises? Tom (LT) (talk) 07:48, 14 April 2021 (UTC)
Alright. Hopefully I will get to it this week. It depends on how complicated the task is, and how busy I am at the time. There is also BOTREQ. BTW I will need to send this through BRFA, which sometimes can take forever, but I see no trouble in approval given its simplicity and non-controversial nature. -- GreenC 15:30, 14 April 2021 (UTC)
User:GreenC bot/Job 20 & Wikipedia:Bots/Requests for approval/GreenC bot 20 -- GreenC 03:29, 15 April 2021 (UTC)
When you have a moment
Hello Green C. I hope you are well. I asked for a run from Template:Cleanup bare URLs/bot last night that it still hasn't processed. You may already be aware of this but I wanted to let you know just in case. My year and a half long infobox person cleanup project is almost finished so I will have time to use this bot again. Cheers. MarnetteD|Talk 22:24, 17 April 2021 (UTC)
Hello MarnetteD, there was a stuck/zombie process on one of the Toolforge grid computers blocking the spawning of new processes. That can happen; it's beyond my control to prevent, but easily fixed by killing the process (done). If by chance it ever happens again and I am not around for a while, you can request help at Village Pump Technical, who will point you to the right place (probably a Phab ticket); the stuck process will be called "tagbot.awk". As a last resort, waiting for the computer to reboot (every couple of months) would also clear it. You take on big projects :) This one is probably infinite, but every change is a huge help. -- GreenC 02:46, 18 April 2021 (UTC)
You said it :-) Thanks for the info and the fix! MarnetteD|Talk 02:55, 18 April 2021 (UTC)
MarnetteD, looks like it zombied again. If it keeps happening I might need to make another program that monitors for stuck processes. -- GreenC 17:57, 26 April 2021 (UTC)
I'm glad you noticed. I was waiting a bit to see if it would kick in. It is hard to say when this problem crept up since it wasn't getting used as regularly in the last year or so. Thanks for the update. MarnetteD|Talk 18:12, 26 April 2021 (UTC)
InternetArchiveBot in esWiki
Hi, GreenC. Thanks for taking care of this. Can you assure me that, in addition to fixing the duplicates, the bot won't perform inconsequential edits like this (it's difficult to find; it's just an added space)? That's the other half of the complaint. If that's so, I'll lift the block. Thanks. --Angus (talk) 22:07, 18 April 2021 (UTC)
This bot is small and purpose-built; it shouldn't make empty edits. With bigger bots that can happen, as they are doing many functions adding and deleting text. -- GreenC 23:28, 18 April 2021 (UTC)
Sorry, I misunderstood - you mean ensure IABot does not (I was thinking of the smaller fixer bot). I contacted Cyberpower678; this should be an easy bug to detect and avoid by removing all whitespace from the original and new article text, comparing the two strings, and if they are equal aborting the edit. -- GreenC 00:19, 19 April 2021 (UTC)
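For illustration, the whitespace-insensitive comparison described above might look like this in Python (a sketch, not the bot's actual code):

```python
import re

def whitespace_only_change(before: str, after: str) -> bool:
    """True when two revisions differ only in whitespace, in which
    case the bot should abort rather than save an empty edit."""
    collapse = lambda s: re.sub(r"\s+", "", s)
    return collapse(before) == collapse(after)
```

The bot would call this before saving and skip the edit when it returns True.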
GreenC, yes, this will be corrected. I should have a fix for this ready fairly quickly. —CYBERPOWER (Message) 02:51, 19 April 2021 (UTC)
Hi guys, thanks for your cooperation. I unblocked the bot. --Angus (talk) 12:44, 19 April 2021 (UTC) cc user:cyberpower678
Hi Angus, could you recommend wording for a Spanish edit summary equivalent to "Fixing 1 redundant {{wayback}}" and "Fixing 2 redundant {{wayback}}" (plural). Will also need "Fixing 1 redundant archiveurl/urlarchivo argument" and "Fixing 2 redundant archiveurl/urlarchivo arguments". I've learned not to use Google Translate or guess, but to ask a native speaker. Thank you! -- GreenC 14:41, 19 April 2021 (UTC)
--Angus (talk) 14:52, 19 April 2021 (UTC)
Angus, btw, the bot has a run page so it doesn’t need to be blocked to stop it. You can find the run page at https://iabot.toolforge.org/index.php?page=runpages&wiki=eswiki —​CYBERPOWER​(​Around​) 16:27, 19 April 2021 (UTC)
Cyberpower678, unfortunately the "IABot Management Console" wants me to give it unnecessary access to private information, like my email address and who knows what else, before it will show me that page. So it remains inaccessible to me. --Angus (talk) 16:49, 19 April 2021 (UTC)
Angus, as the designer of the bot and the UI, I can assure you that not only is your email address not saved anywhere unless you explicitly tell the tool to, your email address is never passed to the tool on authorization. I have no idea why it says that. All you are giving the tool is your username and publicly accessible data like your registration date, permissions, and block status. —CYBERPOWER (Chat) 17:05, 19 April 2021 (UTC)
Angus, user privacy is taken very seriously and is never leaked. Private data is only stored with the users’ permission and critical data is encrypted to prevent unauthorized access. —​CYBERPOWER​(​Chat​) 17:06, 19 April 2021 (UTC)
Cyberpower678, it's ok, no worries. Maybe the Mediawiki API (or whatever) should be changed so it doesn't request unneeded data...
GreenC, thanks man! Sorry I wasn't there when needed, I'm glad things are fixed now! --Angus (talk) 22:54, 20 April 2021 (UTC)
Hi, I'm not sure if this is the right place to report this bug, but InternetArchiveBot duplicated two articles on esWiki while trying to fix a redundant archive. The first one is es:Anthem Sports (a duplicate of es:Anthem Sports & Entertainment) and the second one is es:Heckler (a duplicate of es:Heckler & Koch MP5). I think these are the only cases so far ([2]). --Soulreaper (talk) 15:08, 20 April 2021 (UTC)
Yes I am aware of this bug in the code and fixed it and had already redirected Heckler but was not aware of Anthem, now also redirected. If you think they should be deleted instead I'll start that process. -- GreenC 16:34, 20 April 2021 (UTC)
GreenC Bot
What does GreenC Bot do? Cookersweet (talk) 11:44, 22 April 2021 (UTC)
Thanks for helping out at peer review!
The Peer Review Barnstar
For your very helpful bot-related contributions to Wikipedia peer review, I present to you the peer review barnstar. Nice work! Tom (LT) (talk) 07:11, 5 May 2021 (UTC)
Tom (LT) (talk) 07:11, 5 May 2021 (UTC)
No problem! At this rate it will be the longest trial period for 25 edits in history :) -- GreenC 01:15, 6 May 2021 (UTC)
Transclusion of deleted template
I've nowiki'ed a transclusion of the now-baleeted {{Wayback}} from a subpage in your userspace, but I will let you know here for the sake of visibility (since I don't know if you're going to see an edit on some random userspace page). jp×g 17:11, 17 May 2021 (UTC)
@JPxG: thank you, I just pre'd the whole page for now. -- GreenC 18:21, 17 May 2021 (UTC)
Wikipedia:Link rot/Templates
Just wanted to tell you about a project I've started recently. Wikipedia:Link rot/Templates is intended to list all our external link templates on one page along with the status of the links to more quickly catch when links go down. I hope to get all templates with over 1000 transclusions on there within a few weeks.
If a bot could assist with detecting dead links, that would be great. If the links were checked weekly by a bot to confirm they're working, that would make the page a lot more useful. Is that plausible or not? I'm sadly completely out of my depth with that kind of bot and cannot answer even simple questions like that on my own. --Trialpears (talk) 23:24, 22 May 2021 (UTC)
Disambiguation link notification for May 25
An automated process has detected that when you recently edited Lionel Terray, you added a link pointing to the disambiguation page Mount Huntington.
(Opt-out instructions.) --DPL bot (talk) 06:01, 25 May 2021 (UTC)
Dead link
Hi GreenC, I noticed you marked a link I added as dead. Thanks for pinging me. I added it today, and just checked again, and the link is definitely not dead. ― Tartan357 Talk 21:46, 7 June 2021 (UTC)
Never mind, I figured it out. The link is uniquely-generated and has a short expiration. I'll just link to the index. ― Tartan357 Talk 22:06, 7 June 2021 (UTC)
Hi GreenC! I've been enjoying the daily updates posted User:GoingBatty/Backlinks/Report and fixing the appropriate articles. I noticed that the bot didn't post an update today. Could you please check on it? Thanks! GoingBatty (talk) 01:55, 10 June 2021 (UTC)
It ran and generated the table, which it keeps on hand, but it didn't post for some reason. Maybe a network transient? I just posted it manually. Good thing you asked as it only keeps it for up to the next batch run. -- GreenC 02:06, 10 June 2021 (UTC)
Thank you for the manual list. The bot worked fine today, as usual. Happy editing! GoingBatty (talk) 22:37, 10 June 2021 (UTC)
Hi again! Unfortunately, your bot did not post a new version of User:GoingBatty/Backlinks/Report today. Could you please check on it? Thanks! GoingBatty (talk) 13:57, 6 July 2021 (UTC)
OK, I just added a loop: it will try 10 times with 30-second pauses to account for timeouts. After 10 failures it will email me. I believe this will solve it. -- GreenC 15:33, 6 July 2021 (UTC)
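A retry loop of that shape might look like the following Python sketch (hypothetical; `post` and `notify` stand in for the actual report-posting and email steps):

```python
import time

def post_with_retries(post, attempts=10, delay=30, notify=None):
    """Try to post the report up to `attempts` times, pausing `delay`
    seconds between tries; notify the maintainer (e.g. by email) only
    after every attempt has failed."""
    for i in range(attempts):
        try:
            post()
            return True
        except Exception:
            # Transient failure (e.g. network timeout): wait and retry.
            if i < attempts - 1:
                time.sleep(delay)
    if notify:
        notify()
    return False
```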
Thank you for manually posting an update for today, but that update contains many items not on User:GoingBatty/Backlinks​. Did you accidentally provide me someone else's list? Thanks! GoingBatty (talk) 16:56, 6 July 2021 (UTC)
Oi, that's my list! Certes (talk) 17:31, 6 July 2021 (UTC)
lol, yeah, sorry about that - the procs are called "bw" and "bw2" on the server and I got confused about which is GoingBatty (bw) and which is Certes (bw2). Should be corrected now. -- GreenC 17:34, 6 July 2021 (UTC)
Shadows Commons bot
User:GreenC bot/Job 10 hasn't tagged anything as {{ShadowsCommons}} since 4 May. The bot page says it uses Quarry 18894, but that query doesn't work due to [3]. It seems unlikely that absolutely nothing shadowed Commons after 4 May. — Alexis Jazz (talk or ping me) 22:16, 24 June 2021 (UTC)
@Alexis Jazz: Ah. Shoot. It was discussed in this Phab a while back and the WMF sysadmins didn't come up with a viable alternative. I just posted an alternative idea but it would take some time to develop, assuming it can even be made to work. The basic issue is that Commons has 60+ million titles and downloading that list takes a very long time, meanwhile ShadowBot needs to run daily. So my idea was to break the problem down into sub-lists; and scrap using database queries which can't deal with this problem effectively. It's an ugly problem. -- GreenC 00:02, 25 June 2021 (UTC)
GreenC, MGA73, how do I get a list of files on enwiki? As in, without the local description pages. I already have a list that includes those. It seems technically SELECT img_name FROM image should work, but it looks like it'll take about half an hour? — Alexis Jazz (talk or ping me) 09:51, 25 June 2021 (UTC)
@Alexis Jazz: I have no good solution atm. It seems that it does take a long time to run a quarry. --MGA73 (talk) 11:21, 25 June 2021 (UTC)
MGA73, 1298.78 seconds to return 894832 rows to be exact. — Alexis Jazz (talk or ping me) 11:31, 25 June 2021 (UTC)
MGA73, User:Alexis Reggae/The Real Slim ShadyCommons something something — Alexis Jazz (talk or ping me) 14:24, 25 June 2021 (UTC)
Alexis Jazz hi, sorry, not sure what we are looking at, were you able to devise a working query? -- GreenC 18:49, 25 June 2021 (UTC)
Not really, just wanted to get a list once to prevent the backlog from getting bigger. It actually returned more than expected, but many are problem files, so I'm keeping the list. The Real Slim ShadyCommons is essentially a list of files on enwiki (SELECT img_name FROM image) that shadow a Commons file (index from dump) or redirect and don't have the {{keeplocal}} template. It includes what the bot would have tagged, but also much more, so it's not as simple as this. — Alexis Jazz (talk or ping me) 19:00, 25 June 2021 (UTC)
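The core comparison described in this thread (local enwiki file titles checked against the much larger Commons title list, skipping exempted files) boils down to a set operation. The sketch below is purely illustrative: the function name, toy title lists, and exemption set are invented, and the real bot works from dumps and sub-lists rather than in-memory Python.

```python
# Hypothetical sketch of shadow detection: a local file "shadows" a
# Commons file when the same title exists in both places. Names and
# data below are invented examples, not the bot's actual code.

def find_shadows(local_titles, commons_titles, keep_local=frozenset()):
    """Return local titles that also exist on Commons, minus exemptions."""
    commons = set(commons_titles)  # build the set once for O(1) membership tests
    return sorted(
        t for t in local_titles
        if t in commons and t not in keep_local
    )

# Toy data standing in for the real enwiki image list and Commons dump:
local = ["Sunset.jpg", "Map_of_X.png", "Logo.svg"]
commons = ["Sunset.jpg", "Logo.svg", "Other.jpg"]
print(find_shadows(local, commons, keep_local={"Logo.svg"}))  # → ['Sunset.jpg']
```

The expensive part in practice is not this comparison but obtaining the 60+ million Commons titles in the first place, which is why the thread discusses dumps and sub-lists instead of live queries.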
Who cares?
Moved to Talk:Christopher_C._Horner#Who_cares?
Moved to WP:URLREQ#Reuters (again)
Backlinks and common words
I'm thinking of adding selected common words (maybe 20) to my Backlinks list. Of course, a search for "The" would match almost everything and time out, but a search for linksto:"The" is fast. Would these additions be safe, or would they slow things down in an antisocial way? I already have A listed (to catch jokers who link each letter), so you could check whether that runs noticeably slower than less common words. Certes (talk) 01:06, 15 July 2021 (UTC)
Certes, top 50 by size from your list
Extended content
3449577 china
2551179 London
925721 Boston
899215 Sydney
822928 National Football League
787712 Jazz
742641 Melbourne
637715 The Daily Telegraph
602020 Luxembourg
545562 Athens
470567 Manchester
417654 Liverpool
414820 Birmingham
358707 Perth
298299 Naples
287433 Edmonton
284325 Hollywood
262800 New Brunswick
244875 surrey
244846 Surrey
242423 Oxford
241038 guinea
227722 Havana
224517 Blues
224517 blues
219266 Wellington
208890 National League
198799 Oxygen
197470 Butterfly
197470 butterfly
197311 Cambridge
197190 Norfolk
194592 Hyderabad
191996 Christchurch
190041 ABC News
183295 The Observer
182475 Country
182475 country
179265 Madonna
179265 madonna
174391 Alexandria
164651 Portsmouth
160175 Sculpture
160175 sculpture
143481 The Sunday Times
139840 York
138493 Hanover
135255 Company
135255 company
134502 Stream
That's file size in bytes (each file contains a list of article names), but it gives a relative sense of which ones are the largest. The system was never designed with this many in mind but it seems to be holding up fine. One reason it might have trouble is if it takes > 24 hrs to run, at which point we increase the time period between runs. The last run took 2.5 hours so you're 10% of the way there ;) Or if the number of links is in the millions, like the
{{cite web}}
template, but it's hard to imagine linked terms much more common than china or london. Gandhi? Jesus? China has 133,717 backlinks. 'The' has fewer than a thousand. -- GreenC 02:31, 15 July 2021 (UTC)
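The "top 50 by size" ranking above can be reproduced in a few lines, assuming (as GreenC describes) one file of matching article names per Backlinks term, with file size as a rough proxy for link count. The directory layout and function name here are hypothetical, not the actual server setup.

```python
# Sketch: rank per-term result files by byte size, largest first.
# Directory contents and names are invented for illustration.
import os

def top_by_size(directory, n=50):
    """Return the n largest files in directory as (size_bytes, name) pairs."""
    entries = (
        (os.path.getsize(os.path.join(directory, name)), name)
        for name in os.listdir(directory)
    )
    return sorted(entries, reverse=True)[:n]
```

Since each file holds one article title per line, dividing the size by an average title length gives only a relative sense of link counts, which matches the caveat in the reply above.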
Thanks for the list. I'd already removed London, Boston, Sydney, Melbourne, National Football League, Luxembourg, Manchester and others for producing too many false positives. I removed Jazz and Athens last night, so that's most of the top ten gone. china (lower case for pottery) can also go now as it appears rarely. Of the top ten, that just leaves The Daily Telegraph. I still get plenty of links for The Daily Telegraph (Sydney) in semi-automated citations; I think it's linked to the wrong WP article in Trove. However, it would be perfect for a variant which limits the search to articles which also mention Australia (or perhaps NSW, Brisbane, etc.) Here's another Telegraph error today: David Storey (politician). Certes (talk) 13:00, 15 July 2021 (UTC)
User:Certes, you may already know about this, but in case not, I thought you might be interested in Zipf's law (second paragraph of the lead section). Its External links section has a Zipf list for English. Those lists might contain frequent disambiguation problems. -- GreenC 17:15, 19 July 2021 (UTC)
I vaguely remember Zipf's law but it had slipped my mind. I just checked a couple of lists and the only word I'd missed was "information", which seems a legitimate enough target for the false positives to dominate the errors. I've deliberately omitted words such as Be, which are or redirect to dabs and will be picked up by WikiProject Disambiguation. The words are now being checked: just one today; no false positives. Certes (talk) 17:39, 19 July 2021 (UTC)
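For anyone following the Zipf's law tangent: the law says the frequency of the rank-r word is roughly proportional to 1/r, so the second most common word appears about half as often as the first, the third about a third as often, and so on. The word counts below are invented round numbers for illustration, not real corpus data.

```python
# A small illustration of Zipf's law: ideal frequency at rank r is
# (frequency of rank 1) / r. The counts are made-up example values.

def zipf_predictions(top_freq, n):
    """Predicted frequencies of ranks 1..n under an ideal Zipf distribution."""
    return [top_freq / r for r in range(1, n + 1)]

counts = {"the": 22000000, "of": 11500000, "and": 7600000, "to": 5400000}
ranked = sorted(counts.values(), reverse=True)
for observed, ideal in zip(ranked, zipf_predictions(ranked[0], len(ranked))):
    print(observed, round(ideal))  # observed count vs ideal-Zipf prediction
```

This is why a short list of the most common words (and the most linked titles) covers a disproportionate share of all occurrences: the distribution's head is extremely heavy.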
Precious anniversary
Five years!
--Gerda Arendt (talk) 07:45, 4 August 2021 (UTC)
Orphaned non-free image File:Sheikh Zayed Book Award medal.jpg
Thanks for uploading File:Sheikh Zayed Book Award medal.jpg. The image description page currently specifies that the image is non-free and may only be used on Wikipedia under a claim of fair use. However, the image is currently not used in any articles on Wikipedia. If the image was previously in an article, please go to the article and see why it was removed. You may add it back if you think that that will be useful. However, please note that images for which a replacement could be created are not acceptable for use on Wikipedia (see our policy for non-free media).
Note that any non-free images not used in any articles will be deleted after seven days, as described in section F5 of the criteria for speedy deletion. Thank you. --B-bot (talk) 17:38, 8 August 2021 (UTC)
GreenC bot seems to have blacklisted all links to https://www.independent.co.uk/​, causing IABot to repair all links to that domain, despite most still being alive. See e.g. this. Could this be "fixed" somehow? Jonatan Svensson Glad (talk) 01:26, 9 August 2021 (UTC)
This is mostly undone; it will be finished in a few hours. It started rolling back yesterday after the problem was noticed. (It's not all links, but a lot.) It was caused by a wrong header: the site was returning 406s to the bot even though the page was fine (200 for a web browser). Whatever caused it has since stopped; they now return 200 correctly. Some live URLs may have gotten archived by IABot; I don't have a way to determine which. -- GreenC 01:45, 9 August 2021 (UTC)
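The failure mode described above, a site returning HTTP 406 to a bot's client while serving the same page normally to browsers, suggests one defensive pattern: retry with a browser-like User-Agent before declaring a link dead. The sketch below is a hypothetical illustration of that pattern, not how GreenC bot or IABot actually work; `fetch` is a stand-in callable you would supply.

```python
# Hypothetical dead-link check that guards against bot-only errors such
# as the 406 incident above. `fetch(url, user_agent)` returns an HTTP
# status code; both the function and the UA strings are invented here.

BROWSER_UA = "Mozilla/5.0 (X11; Linux x86_64)"

def looks_dead(url, fetch):
    """True only if the URL fails for both the bot UA and a browser-like UA."""
    if fetch(url, "MyArchiveBot/1.0") < 400:
        return False  # fine for the bot: not dead
    # Bot got an error; some sites block only bots (e.g. via 406),
    # so double-check with a browser-style User-Agent before flagging.
    return fetch(url, BROWSER_UA) >= 400

# Simulated server that 406s bots but serves browsers, as in the incident:
def fake_fetch(url, ua):
    return 406 if "Bot" in ua else 200

print(looks_dead("https://www.independent.co.uk/", fake_fetch))  # → False
```

A check like this would not have prevented the archiving entirely, but it narrows false "dead" verdicts to sites that fail for every client.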
Not related to your bot, but do you know how to unmark the domain gothamist.com as dead on IABot? The websites are alive, but it seems the domain is blacklisted and being tagged as dead. Jonatan Svensson Glad (talk) 20:40, 12 August 2021 (UTC)
Yup, that's an IABot admin action, which I am. Done. It was blacklisted since 2017. -- GreenC 22:16, 12 August 2021 (UTC)
Your RFC
I'm pretty annoyed that you've chosen to misrepresent me and ask a completely pointless straw man question in your RFC. Whatever the "result", it will have no bearing on the edit I made. Wtqf (talk) 23:23, 12 August 2021 (UTC)
I could tell right away there was something wrong with you: belligerence, anger, obvious deep knowledge of Wikipedia but a blank user page and a new account. I figured it was only a matter of time before you would be blocked, for more than 30 days, so I had no choice but to start the RfC. Well, it only took 24 hrs. You had "block permanently" written all over you. Next time, check your attitude and maybe your sock won't be so obvious. -- GreenC 14:15, 13 August 2021 (UTC)
Could I interest you in some more...
Could I interest you in one more peer review related task...? (​Wikipedia:Bot_requests#Bot_to_repair_broken_peer_review_links​)
Summary: a nearly completed bot exists but the owner went away. Old peer reviews didn't contain a fixed link to the peer review page, which means that over time, as pages are moved, the links get broken. The bot was designed to fix those links. There was one outstanding issue: sometimes it would include a link twice in the output. Once that's fixed, it can repair the rest of the 680 outstanding broken links. Would I be able to interest you in picking up and finishing this task...? :D Tom (LT) (talk) 01:31, 9 May 2021 (UTC)
Tom, do you know if the source is available somewhere so I could take a look? -- GreenC 16:59, 9 May 2021 (UTC)
Ah, looks like the owner has resurfaced and there is a new bot RfA in the works (​Wikipedia:Bots/Requests for approval/AWMBot 2). Hurray, and ignore my request! Tom (LT) (talk) 04:08, 10 May 2021 (UTC)
Ok good! -- GreenC 15:22, 10 May 2021 (UTC)
Never mind, the original creator has since retired. The bot's code is contained as a link in the bot request. I would be super grateful if you'd be able to take up the baton here - there are still 679 broken links (Category:Pages_using_Template:Old_peer_review_with_broken_archive_link) and having a functioning bot makes it much easier to maintain and repair them. Tom (LT) (talk) 01:34, 17 August 2021 (UTC)
Tom, I looked at it. JavaScript is not a language I know and could not follow what it is doing. It might be good to first see if any JS programmers want to adopt it. A place they hang out is WP:SCRIPTREQ - it's scripts not bots but both are JS. Also, DannyS712 has made tons of JS bots and runs the scripts newsletter. -- GreenC 14:44, 18 August 2021 (UTC)
No problem, thanks for your help to date. Tom (LT) (talk) 07:17, 19 August 2021 (UTC)
WikiProject SpaceX
Hi. Would you be interested in joining a WikiProject SpaceX? If you are, can you please make a WikiProject proposal for it? (As an IP, I cannot make the proposal because I would be stopped when trying to create the proposal page.) @GreenC:
Signing up for an account would be best. -- GreenC 03:00, 20 August 2021 (UTC)
Manual conversions?
Regarding your posts on converting archive references, such as [4]: did you mean "manual conversions"? isaacl (talk) 16:04, 22 August 2021 (UTC)
Oh yeah, my ability to type lately has really gone downhill; one word is thought and another comes out. I'll just blame the spell checker. -- GreenC 16:09, 22 August 2021 (UTC)

What should we do for cites from punesite dot com?
Moved to Wikipedia:Link_rot/URL_change_requests#What_should_we_do_for_cites_from_punesite_dot_com?
Internet Archive bot froze
It seems that no jobs are running. Normal single page things are working, but no jobs are progressing or even starting. AManWithNoPlan (talk) 20:45, 26 August 2021 (UTC)
Aware, thanks. -- GreenC 21:11, 26 August 2021 (UTC)
thanks, did it
Hi, GreenC. Thanks for your help here https://en.wikipedia.org/wiki/Wikipedia:Reliable_sources/Noticeboard/Archive_350#Can_plausible%2C_unsourced_statement_rely_on_sourced_statement%3F​. Did go ahead and add the citation needed notation. Felt pretty good like the section is stronger now. Greg Dahlen (talk) 19:54, 31 August 2021 (UTC)
Master Reference sourcer
Hi, you seem to be a person that I could ask...
Has anyone ever discussed trying to create a "single" reference place? Such that articles would not have "individual" references, but would be able to find, then tag them from a reference repository? (One that would also mark "BAD"/invalid references.) A centralized (maybe project-based) place to put standard "named" references: a template '<nref name="Green" />' would "look up" a named reference. Though I am loath to create a "reference editor" permission, maybe limit who is allowed to change the "master reference"? (Maybe based on the number of pages affected.)
I am wondering if the "Cite" function of the WikiText Editor could:
1a) look up the url, (in the "repository")
1b) display a warning if "designated invalid",
1c) if valid, then insert an "nref" (with shortname already on file)
2a) if "new" URL, place it in the repository,
2b) request a "shortname", checking that it does not already exist;
2c) then insert the nref
3) a "dead" URL would still be searchable, but could be tagged in the "repo" with a "substitute"?
A side benefit would be to make a LOT of articles "much" easier to edit (no long references interrupting the text), and "much" easier to "alter" a reference when broken (or archived). Mjquinn_id (talk) 13:34, 8 September 2021 (UTC)
Yeah, that's basically the vision of WikiCite, and recently there was a proposal to build such a database, but the Wikimedia Foundation declined to fund the project. -- GreenC 15:53, 8 September 2021 (UTC)
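As a thought experiment, the centralized-reference idea floated above (a repository keyed by short name, which a hypothetical '<nref name="..." />' tag could resolve at render time) can be sketched as a simple lookup. Everything here is invented for illustration: the field names, the deprecation flag, and the output format; it is not WikiCite's or MediaWiki's actual design.

```python
# Toy sketch of a shared reference repository. A hypothetical nref tag
# resolves to full citation markup, and entries marked invalid refuse
# to expand. All names and fields below are made up.

repo = {
    "Green": {
        "url": "https://example.org/source",
        "title": "Example Source",
        "deprecated": False,
    },
}

def expand_nref(name):
    """Resolve a named reference; refuse unknown or deprecated entries."""
    ref = repo.get(name)
    if ref is None:
        return f'<error>unknown reference "{name}"</error>'
    if ref["deprecated"]:
        return f'<error>reference "{name}" marked invalid</error>'
    return f'<ref name="{name}">{ref["title"]}, {ref["url"]}</ref>'

print(expand_nref("Green"))
```

One appeal of the design, as the thread notes, is that fixing a broken or archived URL in the repository would fix it everywhere the short name is used.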
Thanks for responding! "What is the Best AI model for Content Moderation on Wikipedia?"
Dear @GreenC: Thanks so much for your participation in our discussion post "What is the Best AI model for Content Moderation on Wikipedia?" in Village Pump! Your insights are really appreciated! I wonder if you would have time to engage a bit more with us and other interested editors on this topic? We plan to host an online zoom discussion session soon to invite editors to discuss further with us and with each other -- of course you can turn your camera off if you want :) If you're interested, please get in touch and I will send you (and others) a WhenToMeet link to schedule the session! Thanks so much again for your time! Bobo.03 (talk) 03:32, 16 September 2021 (UTC)