Wikipedia talk:Double redirects


MediaWiki deficient?

If Wikipedia terminates a redirect as a precaution against infinite loops, they should come up with a better plan than that.

My plan is to improve the Wikiware to automatically detect a self-redirect to stop an infinite loop but allow a sequence to other redirects until it reaches the terminus. --SuperDude 05:54, 14 May 2005 (UTC)Reply

Is there anyone working on removing the restrictions on double-redirects? Where might I find more information? Ewlyahoocom 22:14, 8 February 2006 (UTC)Reply

Er, yes, that's what I thought. Loop detection is a solved problem. All you do is remember every page you've redirected through and see if you hit one again. As a back-end check you would have a maximum hop count of some reasonable value (like 20)? Easy-peazy. Why should we have to compromise the logical structure of Wikipedia (or any wiki) for the sake of a few lines of code and a few CPU cycles? Duckbill 09:40, 20 March 2006 (UTC)Reply
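The loop check described above really is only a few lines. The following is a hypothetical illustration in Python, not MediaWiki code: redirects are modelled as a plain dict from page title to target title, visited pages are remembered in a set, and a maximum hop count serves as the back-end check.

```python
# Sketch of the loop-detection idea above (hypothetical data model:
# redirects as a dict mapping page title -> target title).
def resolve(redirects, page, max_hops=20):
    """Follow redirects until a non-redirect page, a detected loop,
    or the hop limit is reached; return the page to display."""
    seen = {page}
    for _ in range(max_hops):
        target = redirects.get(page)
        if target is None:       # not a redirect: terminus reached
            return page
        if target in seen:       # loop detected: stop at current page
            return page
        seen.add(target)
        page = target
    return page                  # hop limit reached: give up here
```

For example, `resolve({"A": "B", "B": "C"}, "A")` follows the chain to `"C"`, while a loop such as `{"A": "B", "B": "A"}` stops as soon as a page would repeat.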

Functioning redirect chains are expected default behaviour and are sometimes the correct way to set up complex redirects. The MediaWiki software is currently broken in this regard. The fact that the breakage seems to be a deliberate act on the part of the programmers doesn't make it any less frustrating (or any less broken).
If Wikipedia doesn't want to allow multiple redirects as a matter of site policy then that's fine, but those of us trying to use the software for other wikis shouldn't have our hands tied. If the programmers are worried about infinite redirect loops then, at the very least, there should be a wiki-level system setting for "max number of redirects = n", where n=1 gives Wikipedia's behaviour, n=2 allows double redirects but blocks triples, n=3 allows up to three redirect stages, etc. I don't expect I'll need more than two on my project, but to be artificially limited to just one is silly. ErkDemon (talk) 00:28, 1 July 2011 (UTC)Reply
I may be wrong, but I thought the developers had introduced such a variable. (The reason it's not set to a higher value on WP is that there is then no system for generating a list of overlong chains similar to Special:DoubleRedirects; and no-one seems to be in a hurry to solve this problem.)--Kotniski (talk) 09:46, 1 July 2011 (UTC)Reply
Yaaay! Thanks Kotniski! Found it, and it seems to work (http://www.mediawiki.org/wiki/Manual:$wgMaxRedirects, since v1.15, apparently). And my apologies to the MediaWiki programmy people. ErkDemon (talk) 19:00, 12 July 2011 (UTC)Reply
Somebody somehow probably accidentally didn't notice this section and continued this discussion at #...in order to prevent infinite loops. Blackbombchu (talk) 01:44, 1 May 2014 (UTC)Reply

Page move

I propose that this be moved to Wikipedia:Double redirects. To my recollection, I never even saw the term "multiple redirect" before I stumbled onto this page a bit earlier. (Well, maybe I saw it once.) Even the page itself consistently uses the term "double redirect", except for the initial defining sentence. Eric119 04:59, 24 Jun 2005 (UTC)

  • I will implement this move. Eric119 16:50, 11 July 2005 (UTC)Reply

prevention is better than cure

Whenever we move a page, we are reminded to check for double re-directs and fix them, if any. However, whenever we merge two articles, and convert one of them into a re-direct, we do not get reminded to check for double re-directs. We should probably be reminded whenever a re-direct is made so that this problem can be remedied by the initiator of the re-direct himself. --Gurubrahma 10:36, 1 November 2005 (UTC)Reply

...in order to prevent infinite loops.

I believe the phrase "...in order to prevent infinite loops" should be removed from this description. The limitation is simply a limitation (and there are better ways to avoid infinite loops than this kind of deficiency -- I even believe following double-, triple-, or whatever-number-of-redirects could be done within a single existing SQL statement without adding any additional looping in the software). Ewlyahoocom 22:27, 21 May 2006 (UTC)Reply

I prefer to keep the explanation about infinite loops. It's only a few words. I prefer to see an explanation of why a limitation is there, rather than just be told that there's this limitation. There may be better ways to do it; maybe the development team will change it when they have time. You could, if you want, start up a discussion perhaps at the technical section of the village pump or on Meta, or perhaps put in a bugzilla feature request, though I'm guessing the development team is already aware of the problem. If it's done with a single SQL statement there would be implicit or explicit looping within the SQL statement; perhaps the SQL interpreter automatically checks for and avoids infinite loops. I don't know the details and I'm guessing there's some reason the development team doesn't find it easy to change this, or they would have done it already. --Coppertwig 13:27, 18 February 2007 (UTC)Reply
As of now, infinite loops don't cause the browser to freeze because whenever someone gets redirected to another article, the URL is the URL of the redirecting article, not the URL of the article they're redirected to. That prevents a double redirect from taking somebody to the article they're double redirected to. If that problem can be fixed, a double redirect could be very useful. For example, the double redirect from Coza! to Cara Operations might be useful to indicate that Coza tuscan grill should have a stand-alone article but doesn't have one yet. The problem can be fixed as follows: the article Cara Operations should have the URL https://en.wikipedia.org/wiki/Cara_Operations regardless of whether it says 'Redirected from Coza tuscan grill', and Coza tuscan grill should have the URL https://en.wikipedia.org/wiki/Coza_tuscan_grill, not https://en.wikipedia.org/w/index.php?title=Coza_tuscan_grill&redirect=no, if it still had the double redirect. Since it's possible for a YouTube video to have the same URL whether or not you're signed in and whether or not it contains a comment, it should also be possible for a Wikipedia article to have the same URL regardless of whether it says which article it's redirected from. In addition to that, each article should keep an invisible record of a set of other articles in the following way: for any 2 articles, article 1 and article 2, article 2 should be removed from the set of articles article 1 contains in its record when article 1, or an article in its record that contains article 2 in its record, stops having a redirect, gets deleted, or changes the article it redirects to to an article that doesn't contain article 2 in its record. Article 2 should be added to the set of articles article 1 contains in its record when article 1, or an article in its record, adds or changes a redirect to article 2 or to an article that contains article 2 in its record.
After that change gets made, a robot should make a null edit to every redirect page that doesn't store the article it redirects to in its record, to get it in its record. Infinite loops can be prevented by having an error message appear whenever somebody starts adding a redirect from an article to itself or to an article that contains it in its record. Each article should also contain a secondary record that contains at most 1 article and works in the following way: for any 2 articles, article 1 and article 2, article 2 should become article 1's secondary record if any of the following happens: adding a redirect to an article whose secondary record is article 2 from article 1 or an article in article 1's primary record; adding a redirect from article 2 to a non-redirect article; or removing a redirect from the article article 2 redirects to. Every time somebody clicks a link to an article that has a redirect, it should take them to the article that its secondary record redirects to and say 'Redirected from [its secondary record].' Furthermore, if Coza! still had its double redirect, then clicking a link to Coza! should bring you to Cara Operations, and that article should say 'Redirected from Coza tuscan grill'; clicking Coza tuscan grill at the top of the article should then bring you to the Coza tuscan grill article, and that article should say 'Redirected from Coza!.' Blackbombchu (talk) 21:53, 30 April 2014 (UTC)Reply
I don't understand what you are trying to solve. Let's assume that MediaWiki is configured to follow redirect chains of length up to 2. If there is a double redirect X → Y → Z and you visit https://en.wikipedia.org/wiki/X, your browser will not be redirected anywhere – the server will follow the redirect internally and serve you the contents of page Z with a "Redirected from X" notice. The browser address bar will still read https://en.wikipedia.org/wiki/X. If there is a loop A → B → A and you visit page A, the server will stop after two hops and you will see page A saying that it is a redirect to B. Nothing bad can ever happen. — Petr Matas 07:18, 1 May 2014 (UTC)Reply
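The server-side behaviour described in the comment above can be modelled in a few lines. This is a simplified, hypothetical sketch (not actual MediaWiki code), with the redirect table as a plain dict and the hop limit playing the role of $wgMaxRedirects:

```python
# Simplified model of server-side redirect resolution with a hop limit
# (the role played by $wgMaxRedirects); hypothetical, not MediaWiki code.
def serve(redirects, title, max_redirects=2):
    """Return (title whose content is served, notice shown to the reader)."""
    start, hops = title, 0
    while title in redirects and hops < max_redirects:
        title = redirects[title]
        hops += 1
    if title in redirects:
        # hop limit hit while still on a redirect: show the redirect page
        return title, "This page is a redirect to " + redirects[title]
    notice = "Redirected from " + start if hops else None
    return title, notice
```

For the chain X → Y → Z, `serve(r, "X")` serves page Z with a "Redirected from X" notice; for the loop A → B → A it serves page A with a note that it is a redirect to B, matching the description above. Nothing loops forever.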

Fix double redirects automatically

Would it be possible for the database to fix double redirects by itself, without relying on us to find them and edit them? --Smack (talk) 18:16, 25 January 2007 (UTC)Reply

Ask it on Bugzilla. Melsaran (formerly Salaskan) 13:55, 4 August 2007 (UTC)Reply

Added section "Checking for double redirects"

I just added instructions on how to check for double redirects. These instructions may seem obvious and unnecessary; but let's just say that some of us are the type of people that if we write a program that handles linked lists we need to draw little diagrams with boxes and arrows, and if we write another similar program a few weeks later we need to draw the diagrams all over again. In other words, it may be obvious to some people but confusing for others. I hope these instructions will remain here so I can re-read them next time I move a page. --Coppertwig 13:19, 18 February 2007 (UTC)Reply

Contested move request

The following request to move a page has been added to Wikipedia:Requested moves as an uncontroversial move, but this has been contested by one or more people. Any discussion on the issue should continue here. If a full request is not lodged within five days of this request being contested, the request will be removed from WP:RM. —Stemonitis 06:47, 4 August 2007 (UTC)Reply

  • It is unclear to me that pages in the Wikipedia namespace must be bound by the same rules as article titles. If there is thus little point in making the move, then the disruption it might cause is reason enough not to move the page. --Stemonitis 06:47, 4 August 2007 (UTC)Reply
  • Opposed this isn't an article, WP:NC is for articles. Even the WP:NC page name is pluralized. 132.205.44.5 23:19, 6 August 2007 (UTC)Reply
  • Yes, because there are multiple naming conventions on it. Melsaran (formerly Salaskan) 04:24, 7 August 2007 (UTC)Reply

Byrial method?

The article mentions that the "Byrial method" builds on the redirect (Topbanana) code. What is the Byrial method, and why is it not explained or linked? Cww 03:29, 19 September 2007 (UTC)Reply

redirects to singulars with possibilities

I'm being told that under this policy, pattern-avoiding permutations must not redirect to pattern-avoiding permutation, which is tagged as a redirect with possibilities. If the latter becomes an article, then pattern-avoiding permutations would be left redirecting to something other than pattern-avoiding permutation, and those clicking on "what links here" at the latter page would never find out. That is absurd. At least one bot that fixes double redirects will NOT "fix" them in cases where there's a "redirect with possibilities" tag, but there's another that does (user:computer) and its owner refuses to override it in such cases as this. Michael Hardy 20:54, 19 September 2007 (UTC)Reply

I've already provided Michael Hardy an alternative that accomplishes what he wants without having to have broken redirects. It is trivial to add an HTML comment (or simply text as Wikipedia will ignore anything after the redirect) after the redirect that mentions the others. Example:
#redirect [[Stanley-Wilf conjecture]]
<!-- If you change this to an article, please be sure to modify [[Pattern-avoiding permutation]] & [[Pattern-avoiding permutations]] to redirect to here -->
It is easy to solve these concerns without having to have double redirects that confuse our readers. -- JLaTondre 21:07, 19 September 2007 (UTC)Reply
Michael Hardy chose to place a text with links after the redirect code.[1][2] The text is not displayed and the redirect works so our readers should not be confused, but Special:Whatlinkshere/Stanley-Wilf conjecture may confuse editors (at least it confused me). It falsely looks like there are double redirects having to be fixed. PrimeHunter 01:57, 20 September 2007 (UTC)Reply
If this is really necessary (and it may be) he should be adding something like this {{R from alternate name|Pattern-avoiding permutation}}, which would at least "look" right (even if it doesn't do anything). I would prefer if the Wiki software could be modified to at least handle just 1 extra level of redirection, which would settle this and a lot of other issues. Ewlyahoocom 02:07, 20 September 2007 (UTC)Reply

bots are often idiots; they should be supervised by thinking humans

The redirect page titled "ML Inequality" redirects to "estimation lemma". Note that it has a capital initial "I". If "ML inequality" (with a lower-case initial "i") should ever become an article rather than a redirect that points (like "ML Inequality") to "estimation lemma", then "ML Inequality" (with a capital "I") should be edited to redirect to "ML inequality" (with a lower-case "i"). Therefore, this redirect page reads as follows:


#REDIRECT[[estimation lemma]]

If [[ML inequality]] ever becomes an article rather than a redirect, then this page should be changed to redirect to that.


It works fine as a redirect page. But this bot called Computer changed it to read as follows:


#REDIRECT[[estimation lemma]]

If [[estimation lemma]] ever becomes an article rather than a redirect, then this page should be changed to redirect to that.


Now think about that. The owner of the bot doesn't think that's a problem. And he cites THIS style page, on double redirects, to justify his bot's behavior. That's the most unreasonable view on design of Wikipedia that I've seen in a long time. If THIS page requires that behavior of that bot, I propose to alter it so that it still forbids double redirects but doesn't require that idiotic edit. Michael Hardy 23:28, 7 October 2007 (UTC)Reply

Is this done automatically?

Special:DoubleRedirects says "It is not necessary to fix these by hand. Bots will go through the entire list periodically and fix all of the double redirects." Is this true? If so, why do the move pages give stern warnings that fixing double redirects is your responsibility?--Yannick (talk) 15:52, 24 November 2007 (UTC)Reply

Bots will fix double redirects on that list. However, that list is generated periodically, which means that a double redirect can exist for some time before it gets fixed. The move page directions are to avoid readers being confused by double redirects in the period between the move and a bot being able to fix them. The bots are to fix ones that are missed; they shouldn't be used as a shortcut, as that would be a disservice to our readers. -- JLaTondre 15:08, 26 November 2007 (UTC)Reply
Some of these page moves can produce hundreds of double redirects. That's a lot of tedious work to do by hand, something of a disservice to our editors to put this responsibility on them. How often do those bots run?--Yannick (talk) 02:05, 28 November 2007 (UTC)Reply
It's not an issue of how often the bots run, but rather how often the double redirects page is generated. That's generated by the Wikipedia software. I think it's done daily, but I'm not positive. I gather that it is database intensive. I doubt there are many moves that "produce hundreds". The ones that are overwhelming can obviously be left to the bots. However, the nominal case of a handful should just be taken care of by the editor. -- JLaTondre 13:20, 28 November 2007 (UTC)Reply

MediaWiki software automatically fixes things

bugzilla:4578, "Page moves should not create double redirects", has been implemented, per Wikipedia:Wikipedia Signpost/2008-07-28/Technology report#New features and this posting at Wikitech-L. That will sharply reduce the number of double redirects. (It's possible to tell the software to let a double redirect be created, by unchecking a box before doing a move, but it seems fairly obvious that few editors will do that.) In particular, it's now unnecessary for editors, after moving a page, to bother checking for double redirects that were created by the move. -- John Broughton (♫♫) 17:32, 7 August 2008 (UTC)Reply
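The effect of the behaviour described above can be sketched as follows. This is a hypothetical model (a dict of redirect → target, not MediaWiki internals): when a page is moved, every redirect that pointed at the old title is retargeted, so the move leaves no double redirects behind unless the mover opts out.

```python
# Hypothetical model of "page moves should not create double redirects":
# retarget incoming redirects when a page is moved.
def move_page(redirects, old_title, new_title, fix_redirects=True):
    if fix_redirects:  # corresponds to the checkbox mentioned above
        for src, tgt in list(redirects.items()):
            if tgt == old_title:
                redirects[src] = new_title
    redirects[old_title] = new_title  # the move leaves a redirect behind
    return redirects
```

For instance, moving "Foo" to "Bar" while "Alias" redirects to "Foo" retargets "Alias" straight to "Bar"; with the fix disabled, "Alias" → "Foo" → "Bar" would be a double redirect.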

Many double redirects are good

At Wikipedia:Village_pump_(proposals)/Archive_44#Double_redirects, Kotniski, Happy-melon and others provided examples for good double redirects, which all seem to fit the following formula:

  • X' → X → Y

In a common situation ("type 1"), X' is an alternate spelling of X, and Y is an article that, for some reason, currently is the best we have on X. This could be because it contains something about X, among other topics.

A related situation ("type 2") that also fits the above formula is Kotniski's example:

  • Mosial → Mósiał → Jan Mósiał

Here, X is a more general title than Y that might need to be disambiguated in the future.

A majority seemed convinced by that, and agreed with allowing longer redirect chains. Despite that agreement, nobody thought of changing this page then. It would be bad enough if this were only a question of wrong recommendations we give to our editors, but that could be mitigated by the fact that most users can use their common sense. The situation has become worse because we now have bots which are busily and indiscriminately "fixing" any and all double redirects. I recently became aware of this problem (see User talk:Xqt#Xqbot is edit warring), and did a little bit of research: I counted 20 double redirects "fixed" by the bot. Out of these, I found seven - or 35% - that fit type 1.

The gory details

| line | original | intermediate | target | type | by | diff |
|------|----------|--------------|--------|------|----|------|
| 1 | Ihsan Sencer Horasan | İhsan Sencer Horasan | Turkey at the 2005 Mediterranean Games#Equestrian | 1 | Xqbot | [3] |
| 2 | Sencer horasan | İhsan Sencer Horasan | Turkey at the 2005 Mediterranean Games#Equestrian | 1 | Xqbot | [4] |
| 3 | Kurenai Tsubasa | Tsubasa Kurenai | List of Ranma ½ characters#Tsubasa Kurenai | 1 | Xqbot | [5] |
| 4 | Tsubasa Kuenai | Tsubasa Kurenai | List of Ranma ½ characters#Tsubasa Kurenai | 1 | Xqbot | [6] |
| 5 | Tobacco keeper | Tobacco Keeper | Ali Bader | 1 | Xqbot | [7] |
| 6 | Yan Sun | Sun Yan | Fanqie | 1 | Xqbot | [8] |
| 7 | Stephen "Doc" Kupka | Stephen Kupka | Tower of Power#Stephen Kupka | 1 | Xqbot | [9] |
| 8 | Mo jing | Mó jìng | Tribadism | 2 | Eubot | [10] |

Note: Some of these are from unlikely typos (line 4) or from unnecessary case distinctions (line 5), which means that they should instead have been deleted. Editing the redirect target makes even less sense in these cases. (Per Wikipedia:Redirect: "No redirect to Francis Ford Coppola is needed because the "Go" command is case-insensitive for an article whose title is all initial caps.")

We need to take that reality into account and do the following:

  1. Change the wording of this page to clarify that not all double redirects need to be "fixed".
  2. Fix the bots so that they stop "fixing" double redirects indiscriminately.

— Sebastian 00:32, 24 August 2009 (UTC)Reply

Unless something has recently changed in how Mediawiki works (as configured for EN Wikipedia), the suggested actions seem pointless because double-redirects do not work. Unless and until the software changes, there is little point suggesting that some double redirects should not be fixed. older ≠ wiser 01:31, 24 August 2009 (UTC)Reply
There was a recent period during which double redirects did work, and there was a discussion that seemed to produce consensus to keep that, but nothing ever happened. --NE2 02:01, 24 August 2009 (UTC)Reply
It is very unfortunate that the software has this problem, and I admit that this situation makes my proposed changes somewhat less opportune now. (I wasn't aware that the software lapsed back.) It is not nice when a user has to click twice to get the page he or she wants. However, the problems we're creating are bigger than the problems we're fixing: For every 2 correct fixes, we have 1 where the logical connection is mindlessly destroyed. I don't think it is wise to automatically and indiscriminately destroy what people created for a reason, just to work around the momentary quirks of the software. We need to stop this destructive hack. — Sebastian 04:01, 24 August 2009 (UTC)Reply
I honestly believe that double redirects should work, but as long as they don't, we need to live with what we have. It can't hurt to start making a list of wanted double redirects, or a potential policy/guideline for when to use them. עוד מישהו Od Mishehu 07:09, 24 August 2009 (UTC)Reply
We should keep badgering the developers to fix this (there is a bug report at bugzilla somewhere). The thing that needs fixing, AIUI, is not the doube redirects themselves (these could easily be enabled at the flick of a switch), but the reporting of overlong chains of redirects. At the moment we have Special:DoubleRedirects to spot and fix them; if the maximum chain length were increased (to 3 or 10 or whatever value of n) then we would need an equivalent Special page to spot and fix chains of length n+1. It's that functionality that's lacking.--Kotniski (talk) 08:55, 24 August 2009 (UTC)Reply
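The missing report described above is conceptually simple. A hypothetical sketch (plain-dict data model, not an actual Special page implementation) that lists every redirect whose chain exceeds a permitted length n:

```python
# Hypothetical sketch of a Special:DoubleRedirects-style report for an
# arbitrary maximum chain length n.
def overlong_chains(redirects, n):
    """Return redirect titles whose chains are longer than n hops."""
    report = []
    for start in redirects:
        title, hops = start, 0
        while title in redirects and hops <= n:
            title = redirects[title]
            hops += 1
        if hops > n:          # chain longer than allowed (or a loop)
            report.append(start)
    return sorted(report)
```

With the chain A → B → C → D, `overlong_chains(r, 1)` reports A and B (the current double-redirect report), while `overlong_chains(r, 2)` reports only A, i.e. the report needed if the maximum chain length were raised to 2.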
bugzilla:17888, as mentioned in the original discussion, and it's still open. Amalthea 11:14, 24 August 2009 (UTC)Reply
I don't think living with what one has is a very helpful metaphor in an environment that's built on change. You can't take that literally, unless you include our intelligence and ingenuity among the things we have. Certainly we do not need to keep the wording of this page, nor do we need to keep a horde of bots running amok. The edit war which I had with one bot, which I thought was over once and for all, now got taken up by the next.[11] This is just like the science fiction nightmare of robots taking over the world, albeit only on our little world of wikipedia.org. — Sebastian 13:58, 24 August 2009 (UTC)Reply
I'm not sure if "broken" double redirects are preferable though, speaking from a reader's point of view. Amalthea 14:57, 24 August 2009 (UTC)Reply
Ah, that was an eye-opening remark! It seems we're caught in a self-perpetuating meme here. People have come to call double redirects "broken links" - something every responsible web developer hates! I think people here talked each other into a panic that makes one problem appear exceedingly big, and lets them ignore the other problem completely. But look at "broken link" - that's not what we're talking about. Even the worst case, when a reader has to click one more time, is not "broken" by any stretch of the imagination. And it may not even be necessary to click the second time. Just look at an example from my list above: Reader sees Kurenai Tsubasa, doesn't know what that means, and clicks on it. Up comes a page that says "→ List of Ranma ½ characters#Tsubasa Kurenai". That's already good information, that may even be all the reader wanted to know. Is avoiding such "problems" really worth destroying our logical link structure? — Sebastian 15:21, 24 August 2009 (UTC)Reply
I'm afraid you've lost me. Are you now saying that it is a good thing to have Kurenai Tsubasa be a simple redirect to meaningful content rather than a broken redirect to Tsubasa Kurenai (which itself redirects to the same content)? That seems to contradict what you've been saying before. And to be clear, no one is talking about broken links -- I think everyone commenting here understands what a broken redirect is. older ≠ wiser 15:57, 24 August 2009 (UTC)Reply
No, what I'm saying is that calling double redirects "broken redirects" is making mountains out of molehills. I understand that it's hard to break free of such a view, especially when, as you say "everyone ... understands" it that way. Maybe I'm a just bit more aware of what language can do to us, having grown up in a country that had a history of abusing it, and where it was a vital part of the curriculum to make us aware of the mass hysteria that can be caused by emotional buzzwords. — Sebastian 16:27, 24 August 2009 (UTC)Reply
A broken redirect is a redirect that does not function. Where are the pernicious effects of misusing language? Your opinion that forcing readers to make extra clicks through such broken redirects is acceptable is not uncontroversial, and although there is significant support for enabling double- (or multi-) redirect functionality, forcing readers to use broken redirects is a separate question which does not appear to have the same level of support. And in any case, you didn't really address my question above. Your recent example appears to contradict your earlier statements. That is rather confusing. older ≠ wiser 16:35, 24 August 2009 (UTC)Reply
Sorry, I didn't mean to say that some evil person intentionally coined a misleading term. I understand how one can get to that term. But it is misleading, because the redirect does function, albeit not as well as we would want it to. Calling it "broken" is adding unwarranted emotional ballast. If anything is broken, it is not the redirect, it is our software. (Maybe it helps to illustrate this with an example: Imagine you're driving a car with a broken gear shift, and you have to always stay in first gear. If you took the highway, you would get into trouble, so you take a detour instead. Calling the redirect broken is like calling the highway broken, instead of the car.) And while I'm trying to open your eyes about one such emotionally loaded word, you're introducing another. The word "forcing". Again, I understand how you can feel this way, but please let's keep things in proportion. — Sebastian 16:59, 24 August 2009 (UTC)Reply
So, now it appears that you are unnecessarily distorting the language in order to make a point. A double-redirect at present does not function as expected. The simplest sense of that is that it is broken. I'm sorry that that doesn't support your opinion, but to describe it as anything other than broken seems like Orwellian double-speak. older ≠ wiser 17:09, 24 August 2009 (UTC)Reply
OK, the redirect also breaks "related changes". Even one level of redirect does this AFAIK. Now back to the original point. If we think information is being lost, all we need to do is ask the double-redirect bots nicely to populate a suitable field, e.g. {{R from double}}. Then when double (triple, whatever) redirects become available again we can unwind the redirects. Or indeed if MΘΝΠ becomes an article or a dab. Rich Farmbrough, 04:22, 30 August 2009 (UTC).Reply
Thank you for your post and for understanding my concern. Template:R from double is a good idea; that could also contain a {{nobots}}.
I hadn't thought about the "related changes"; why do you say it breaks them? Do you mean that if we have X' → X → Y, and you watch related changes for Y, you won't see a change in X'? This is not necessarily a problem, since X is not the same as Y. Conceivably, you may be interested in changes related to Y, without being interested in changes related to X. Admittedly though, if you're using "related changes" to watch for vandalism, then it wouldn't hurt to be alerted to changes in X'. But since that's only a redirect and unlikely to incur a lot of changes, it is also much less likely to be changed, and much less of a problem if you miss a change. — Sebastian 22:19, 31 August 2009 (UTC)Reply
Broken related changes mean the following: If X is an article, which contains a link to Y, and you watch for related changes of X, you will see changes in Y. However, if there is a redirect Y → Z, you will only see changes in Y, not in Z. This has nothing to do with double redirects. — Petr Matas 15:23, 29 April 2014 (UTC)Reply

Relying on bots

I've noticed that bots now appear to be more reliable and frequent (once per day) in fixing double redirects. Should I now rely on bots to fix multiple double redirects, or should I continue to correct multiple double redirects myself out of Wiki courtesy? Tinlinkin (talk) 09:08, 18 May 2010 (UTC)Reply

I followed the posted advice that appears after I move a page, to fix only the most important double redirects and let the bot take care of the rest later. That must be a relatively new message, and I'm relieved by it. Tinlinkin (talk) 04:38, 1 June 2010 (UTC)Reply

WikiGnomes

I think that a WikiGnome would fix double redirects (I don't know, but it seems like it would be in their nature.) If they do, it should be noted. — Preceding unsigned comment added by 74.99.167.211 (talk) 01:29, 4 October 2011 (UTC)Reply

Noting...

30xelawalex03 (talk) 23:38, 11 March 2012 (UTC)Reply

Special page

  A redirect....special page....

This could be misleading. They are pages that are special, but calling one a "special page" could be misleading. It is edited, has wikitext, etc. So it is not a special page on Wikipedia.

Walex & 03. A Life together. (talk) —Preceding undated comment added 23:10, 26 February 2012 (UTC).Reply

Original redirects

An example of a double redirect is an editor redirecting to a page he has not read, unaware that it is a redirect. Therefore the double redirect could be caused by unawareness, and the only thing I can say is: Do the check, then the redirect!

Walex03. Talking, working, friending. 20:41, 22 March 2012 (UTC)Reply

Wikipedia:Example of a double redirect

Shouldn't Wikipedia:Example of a double redirect be listed under see also? Emmette Hernandez Coleman (talk) 17:08, 9 October 2012 (UTC)Reply

No, it should be listed under a new section labelled: "Example of a double redirect". Otherwise, under "see also", one could assume that the link had not been updated since its target page had been moved. --Funandtrvl (talk) 17:24, 9 October 2012 (UTC)Reply

Are double redirects still being fixed by bots?

Since Special:DoubleRedirects hasn't been working for weeks, are bots actually still fixing double redirects? Kaldari (talk) 02:52, 9 December 2012 (UTC)Reply

Looks like AvocatoBot is still working at least. I guess it must be using the move logs. Kaldari (talk) 03:00, 9 December 2012 (UTC)Reply
Reply - I've noticed as well that the redirect bots are working a lot more slowly than usual, if they are working at all. This could become extremely problematic in the event of page moves when disambiguation pages are created. Since I do not have the skill set to create a redirect bot, should this be posted in some more highly visible forum, such as WP:AN? --Jax 0677 (talk) 14:55, 25 September 2013 (UTC)Reply

Proposal to increase $wgMaxRedirects

It seems there is a consensus that double redirects should be permitted. MediaWiki will already follow longer redirect chains if $wgMaxRedirects is set to a value greater than 1. Even $wgMaxRedirects = 2 would make the tedious manual fixing of double redirects from page moves unnecessary. A possible dispute whether double redirects should be discouraged does not matter, because bots can continue fixing double redirects even if MediaWiki is able to follow them. So, why is $wgMaxRedirects still set to 1? — Petr Matas 22:09, 25 March 2014 (UTC)Reply

  • Reply - I don't have the skill set to know whether infinite loops would be created by setting $wgMaxRedirects = 2. All I know is that bots are taking days instead of hours to fix double redirects. This also happened in September 2013. --Jax 0677 (talk) 22:26, 25 March 2014 (UTC)Reply
No $wgMaxRedirects value can create infinite loops, because MediaWiki will always stop after at most $wgMaxRedirects hops. Let's assume for example, that $wgMaxRedirects = 3. If you create a cyclic chain "A → B → A" and visit page A, MediaWiki will stop after traversing the path "A, B, A, B", so you will see page B saying that it is a redirect to page A. — Petr Matas 22:48, 25 March 2014 (UTC)Reply
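Petr's description can be modelled in a few lines (a hypothetical sketch of the resolver's hop-limit logic, not MediaWiki's actual PHP code; the function and argument names are invented):

```python
def resolve(page, redirects, max_redirects=1):
    """Follow at most max_redirects hops through the redirects mapping
    and return the page that ends up being displayed."""
    current = page
    for _ in range(max_redirects):
        if current not in redirects:
            break  # reached an ordinary (non-redirect) page
        current = redirects[current]
    return current

# Cycle A -> B -> A with max_redirects=3: resolution stops on B, which
# is then displayed as "a redirect to A" instead of looping forever.
```

With the cyclic mapping {"A": "B", "B": "A"} and max_redirects=3, resolve("A", ...) returns "B", matching the behaviour Petr describes: the hop cap alone rules out infinite loops for any setting.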

Some double redirects are good or MDRAG Redux edit

I just ran into a situation similar to some of what User:Sebastian was addressing previously above in Many double redirects are good.

Specifically to the form of:

  • X′ → X → Y


The rationale
I created a redirect (X) to an existing article's subsection (Y); however, said redirect may well grow into an article page of its own at some point. I then created a redirect for an acronym of the preceding redirect (X′).

I initially directed X′ to X in anticipation of some future point where things would progress to:

  • X′ → X ← Y′
(with Y′ being a {{main|X}} link at the head of section Y)

before finding that at present redirects actually work in steps:

  • X′ → X, X → Y


Anyway my rationale was/is that:

  • X′ → X → Y

takes fewer steps to get to:

  • X′ → X ← Y′

than does:

  • X′ → Y, X → Y

And much less than:

  • X′ → Y, X″ → Y, X‴ → Y, ... X → Y

takes to get to:

  • X′ → X, X″ → X, X‴ → X, ... , X ← Y′
(with X″, X‴, etc. being additional alternate terms for X)


--Kevjonesin (talk) 09:37, 29 April 2014 (UTC)Reply


The goal edit

The goal would be to allow multiple sub-redirects to simply continue pointing to the same target when a primary redirect (to another article's subsection) becomes a mainspace article and supersedes a preexisting subsection which had been 'filling in' by receiving links in the interim.

--Kevjonesin (talk) 09:37, 29 April 2014 (UTC)Reply

The proposal edit

The proposal is not to change the default settings, which are optimized to handle unintentional double redirects incurred by moving pages and such. Instead I propose—if technically feasible—implementing an optional manual exception that people can intentionally apply to tell the 'bots to leave well enough alone and allow a specifically indicated double redirect, preferably with an option to have it seamlessly carry through the double to the destination target so as to be largely transparent to readers, while retaining a notice at the top of the page noting the chain of redirects—like what is currently in use.

--Kevjonesin (talk) 09:37, 29 April 2014 (UTC)Reply

  • Support. This is an interesting idea and I like it. Let us have redirects X1 → X, X2 → X, X → Y, where X1 and X2 are alternate spellings of X, and Y is not equivalent to X, but it is the best we have on X. In this case, something like a {{Do not fix double redirects}} template call should be added to the source of X. Note that to reflect the logical structure, X should be marked with the template, which should prevent the bots from modifying X1 and X2. A {{nobots}} template with similar functionality already exists, but it has some drawbacks: It would have to be placed to X1 and X2 instead of X, and not all bots obey it. Concerning transparent following of double redirects, this is already implemented, but disabled, see #Proposal to increase $wgMaxRedirects. — Petr Matas 11:49, 29 April 2014 (UTC)Reply
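The bot-side check Petr describes might be sketched like this (the {{Do not fix double redirects}} template is hypothetical, as is the function; a real bot would fetch wikitext via the API and would also need to honour {{nobots}}):

```python
# Hypothetical opt-out template; no such template exists as of this discussion.
DO_NOT_FIX = "{{Do not fix double redirects}}"

def should_fix(intermediate_wikitexts):
    """Given the wikitext of each intermediate redirect in a chain
    X1 -> X -> Y, return False if any of them opts out of fixing."""
    return all(DO_NOT_FIX not in text for text in intermediate_wikitexts)
```

Placing the marker on X (the intermediate redirect) rather than on X1 and X2 matches the logical structure: one tag protects every redirect that points at X.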
Petr, I've a question regarding your edit summary linked here. I'm not seeing the specific #Special:MovePage contains a useless double redirect caveat thread on Wikipedia:Village pump (technical); has it been buried in the archive? If you come across it again, please leave a URL link to it. I'd like to check it out. --Kevjonesin (talk) 13:47, 29 April 2014 (UTC)Reply
p.s.— Disregard the previous; I found the thread. It was a couple pages into the archive. --Kevjonesin (talk) 13:54, 29 April 2014 (UTC)Reply
  • Comment: After looking at comments on this bug page, it occurs to me that I hadn't been giving thought to intentional misuse. Some sort of oversight or permission consideration may be worth considering to avoid prank redirect chains. Or perhaps not ... How is this handled with current single redirects? Are they formally vetted in any way? Not just upon initial creation, but if changed as well? I guess doubles are no more subject to abuse than singles. Both have potential for surprise endings. The template which initiates the override to allow a double redirect exception could likely trigger addition to a patrolled list as well if so desired. --Kevjonesin (talk) 14:23, 29 April 2014 (UTC)Reply
I think that the discussion you are talking about does not apply to your original proposal. The discussion dealt with the following: For some time in the past, there was a check box in the Move page form, which caused all redirects to the original name to be updated to the new name immediately after moving the page. Page move vandals enabled this often, which caused troubles. That's why the check box has been removed. — Petr Matas 16:06, 29 April 2014 (UTC)Reply
It appears to me, that the only way of watching the redirects to page X is to find them using Special:WhatLinksHere/X with hidden links and add them to your watch list manually. — Petr Matas 16:06, 29 April 2014 (UTC)Reply
I don't understand your proposal for automated addition to the watch list. When you edit a page, there is a check box Watch this page above the save button. — Petr Matas 16:13, 29 April 2014 (UTC)Reply
I wasn't intending to suggest automatic addition to an individual's watchlist, but rather to a category's list. Similar to how using {{R with possibilities}} adds the page on which it's used to Category:Redirects with possibilities. Not just to keep an eye out for misuse but also as a general administrative reference tool. --Kevjonesin (talk) 00:26, 30 April 2014 (UTC)Reply
I see. That will be useful. It appears to me that this {{R with possibilities}} template is exactly what we need. — Petr Matas 01:42, 30 April 2014 (UTC)Reply
As to misuse concerns, I think I mostly just felt naive that I hadn't given consideration to such previously. After giving it more thought, I now think potential for abuse of double redirects is fairly trivial. At least no worse than for normal redirects. --Kevjonesin (talk) 00:26, 30 April 2014 (UTC)Reply
Right on, y'all. Attaching the functionality to the existing template does seem to make sense. And makes the idea easier to state — i.e. "Please allow redirect pages targeting a page tagged with {{R with possibilities}} to be redirected through to the next targeted page." --Kevjonesin (talk) 07:27, 1 May 2014 (UTC)Reply
  • Support per Kevjonesin. {{R with possibilities}} already states "Do not replace links to this redirect with a link directly to the target page", which could be easily changed to "Do not replace links or redirects to this redirect with a link or redirect directly to the target page" (with the appropriate code changes to bots and $wgMaxRedirects). --Ahecht (TALK
    PAGE
    ) 14:25, 2 May 2014 (UTC)Reply
  • Support. I have run into this situation quite often already, therefore I very much support the proposal to allow double redirects, if it can be implemented in a seamless manner for users. So far, my approach to the problem was to add an HTML comment to would-be double redirects to point future editors at a possibly more suitable, but not currently possible, redirect target (together with a context-specific set of Rcat templates) as follows (assuming this would be self-explanatory):

X1:

#redirect [[Y]] {{R to related topic}}
<!-- #redirect [[X]] {{R from alternative name}} -->

In some cases, I even added multiple such HTML comments to indicate perceived priorities or to make other editors aware of "parallel" redirects which may need "parallel" maintenance if something is changed on this redirect page.
However, I like the idea of using {{R with possibilities}} even more. --Matthiaspaul (talk) 09:21, 5 May 2014 (UTC)Reply

Remark: The proposal is ready. Please vote! Petr Matas 11:03, 7 May 2014 (UTC)Reply

URL edit

When one visits a double redirect, the URL now changes to the URL containing w/index.php?title=target&redirect=no rather than wiki/target. If one then refreshes the page, it remains the same instead of redirecting one more time. GeoffreyT2000 (talk) 17:01, 24 May 2015 (UTC)Reply

Let's make sure we both understand what a "double redirect" is, GeoffreyT2000. Let's say that there is a page called Article that has a redirect to it called Redirect. I decide to move Article to a different title called New article, which means that Article will be left as a redirect to New article. Now Redirect is a "double redirect" because it targets another redirect (Article). This isn't allowed by the software, so I must then go to the Redirect page and change it to target the New article page. (If I don't do this, a bot will soon find the double redirect and make it right.)
What you may be experiencing is a direct link to a redirect, either from the TOP of an article page just under the title, or from a link like this: {{-r|Redirection}}. All redirects are designed to go immediately to their targets – there is no pause. If you go to a redirect page by a direct link, and then you refresh the page, it will not usually change the page. Hope this helps! – Paine  17:55, 29 May 2015 (UTC)Reply
Upon re-reading what you wrote above, I may have misunderstood you, GeoffreyT2000. It appears that what you describe is exactly how a double redirect works until it is retargeted away from the second redirect. Refreshing the page won't cause the second redirect to go to its target, and that is normal. Due to the software limitation, your choices would be to either click the link to the target or to go ahead and fix the double redirect, so that what happened to you won't happen to others. Sometimes the server load is high and it takes more time for the bot to find and fix any double redirects. That is why, for example, it is still important to manually fix any double redirects that result from a page move/rename. – Paine  14:10, 30 May 2015 (UTC)Reply

Edit request on 30 May 2015 edit

A protected redirect, Wikipedia:Example of a double redirect, needs updated information and redirect category (rcat) templates added. Please modify it as follows:

  • from this:
#REDIRECT [[Wikipedia:Double redirect]]
<!-- Please do not "fix" this double redirect. It exists to be a real example of what a double-redirect is for the ultimate target page. -->
{{nobots}}

If you can see this text and you are not looking at the edit or search page this page is not working as an example of a double redirect.
  • to this:
#REDIRECT [[Wikipedia:Double redirect]]

{{Notice|'''''Note:'''  The target of this redirect is also a redirect, which makes this an example of a "double redirect".  '''Please do not "fix" this double redirect.''' It exists to be a real example of what a double redirect is for the ultimate target page.  It is linked as an example from the lead of the project page, '''[[Wikipedia:Double redirects]]''' (note the plural as opposed to the singular target above).''}}

{{Redr|to project namespace|with history|for convenience}}
{{nobots}}
  • WHEN YOU COPY & PASTE, PLEASE LEAVE THE SKIPPED LINES BLANK FOR READABILITY.

Template Redr is an alias for the {{This is a redirect}} template, which is used to sort redirects into one or more categories. No protection rcat is needed, and if {{pp-protected}} and/or {{pp-move}} suffice, the This is a redirect template will detect the protection level(s) and categorize the redirect automatically. (The categories will also be automatically removed when and if protection is lifted.) Thank you in advance! – Paine  16:19, 30 May 2015 (UTC)Reply

  Done — Mr. Stradivarius ♪ talk ♪ 03:26, 31 May 2015 (UTC)Reply
Many thanks, Mr. S ! – Paine  12:19, 31 May 2015 (UTC)Reply

Double redirect bots are working more slowly as of late edit

There is a discussion about "Double redirect bots working more slowly as of late" at Wikipedia:Village_pump_(technical)#Double_redirect_bots. Please contribute to the discussion at this location. --Jax 0677 (talk) 18:44, 5 February 2016 (UTC)Reply

STATICREDIRECT edit

See Help talk:Magic words#STATICREDIRECT. Do any bots or other tools look for and honor STATICREDIRECT on English Wikipedia? wbm1058 (talk) 20:28, 3 May 2016 (UTC)Reply

Soft redirects edit

Green-ink letter is currently a soft redirect to Wiktionary. Should Green ink letter be a redirect (ie a double redirect?) to Green-ink letter, or should it be a soft redirect to Wiktionary? 81.141.56.172 (talk) 02:30, 25 December 2016 (UTC)Reply

I see no reason that soft redirects cannot be linked to by hard redirects. It does not appear that hard redirects are considered "double redirects" when they target soft redirects. I could be wrong, however I cannot find a policy nor a guideline that advises differently.  Paine Ellsworth  u/c 16:14, 18 January 2017 (UTC)Reply

The bots should operate with a delay edit

The bots don't seem to take more than a couple of minutes to notice when a double redirect has turned up and proceed to fix it. If the editor action that led to the existence of a double redirect is such that it is soon reverted (say, a bad move), then this otherwise commendable rapidity leads to two kinds of undesirable and unintended consequences:

1) This occurs in a small but still significant number of cases across different types of scenarios. For example, A is moved (say, by a POV-pusher) to the new title B, the redirects to A are promptly retargeted to B, but then it's quickly realised that B actually refers to something else entirely, so the article is moved back to A, while B is redirected to C. The result is that all former redirects to A are now pointing to the unrelated article C. Or a different situation: article P gets redirected to Q, then all the redirects to P are retargeted to Q, but before long P is restored as an article; the net effect is that now all its redirects have ended up leading to Q. In both scenarios, the bad edit that caused them is not vandalism, it's content-related, and the time this takes to be noticed by the article watchers and get reverted is of the magnitude of hours or occasionally days, which is quite different from the rapid timescale at which the bots seem to operate.

2) This is less pernicious, almost harmless, but it occurs every time a move is reversed: X is moved to Y, but then Y is moved back to X. In the meantime, the redirects have been repointed first to Y and then back to X, where they belong, so that's fine, everything is as before. What has changed is the number of edits in the histories of those redirects: before that, the history is most likely to have contained only the edit that created the redirect. Afterwards, all the redirects have edit histories containing at least three edits each, so if at some point in the future it's decided to move the article over any of those redirects, the move will only be executable by an admin or a page mover. Again, this is a really small effect but it's cumulative and in the long run it reduces the number of page moves that can be carried out by regular editors.

These effects can be prevented if the bots that fix double redirects are set to operate with a certain delay, which should be roughly equal to the time it takes for most questionable edits to get noticed and reverted. Of course, this remedy will itself have the negative consequence of creating inconvenience for readers who will be more likely to encounter double redirects. Given the transient and minor nature of this inconvenience, I think it's outweighed by the benefits of avoiding wrong redirects and reducing the long-term maintenance burden.

Situations similar to the ones in 1) were noticed by Ivanvector in 2015, but I'm not aware of any previous proposals to the same effect. Any thoughts anyone? – Uanfala 21:29, 8 October 2017 (UTC)Reply

Hm, nobody had any input when I started that thread two years ago, and this issue is just a little bit different but I kinda like the solution of programming in a delay. It probably doesn't even need to be that long of a delay, maybe minutes or hours, but I'm guessing. Maybe the bots could be programmed with logic to determine the "quality" of fixing a double redirect, based on finding the text of the redirect within the body of the new target, or based on the similarity of other incoming links, or based on the activity of the account which made the edit triggering the double redirect, or something, and if the quality is low it will populate a list on a subpage at WP:RFD or something like that, rather than fixing it right away. Of course, I don't know how to do any of that. Ivanvector (Talk/Edits) 12:21, 9 October 2017 (UTC)Reply
I too like the idea of a delay, but more thought would be needed as to how long it should be and if it should be different for different types. If it is to be one-size-fits-all then something on the order of 30 minutes is probably going to be the best compromise. For a more fine-grained approach, some thoughts:
  • user levels may be useful - a move by an admin should be presumed to be correct much sooner than one by a user who is not yet autoconfirmed.
  • If a requested move is mentioned in the move summary then it should be fixed after only a few minutes (say 5, to allow for typos to be spotted, etc).
  • If a page has been moved many times previously then wait longer as it's more likely to be controversial. Doubly so if a previous move to this title was reverted.
I think the bots should watch for reverted moves and undo/reverse any edits they made as a result of the first move (but only if the page has not been edited by a human in the meanwhile). I don't have any idea how easy any of this would be to do though, so I'll leave a message for some bot operators. Thryduulf (talk) 16:37, 9 October 2017 (UTC)Reply
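The fine-grained approach above could be sketched roughly as follows (every threshold and parameter name here is invented for illustration; a real bot would pull these signals from the move log and user rights):

```python
def fix_delay_minutes(mover_is_admin, summary_mentions_rm,
                      prior_moves, prior_move_reverted):
    """Return how long (in minutes) a bot might wait before fixing
    the double redirects created by a page move."""
    if summary_mentions_rm:
        return 5                 # closed requested move: fix after a few minutes
    delay = 5 if mover_is_admin else 30
    if prior_moves > 3:
        delay *= 2               # much-moved titles are likelier to be contested
    if prior_move_reverted:
        delay *= 2               # a previously reverted move, doubly so
    return delay
```

So a non-admin move of a title that has been moved many times and once reverted would wait two hours, while a requested-move closure would be fixed almost immediately.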

If some sort of delay gets accepted, then I think it needs to be much more substantial than half an hour. A blatantly bad move like the last one I've had to deal with (and which prompted me to start this thread): West Azerbaijan Province -> West Azerbaijan is Kurdistan [12] took about 80 minutes to get noticed. Most bad moves are neither so blatant nor occurring on such well-watched pages, so the average time it takes to notice and revert them is higher.
An alternative solution would be to allow the existence of double redirects; from the little I've seen so far, it appears that all this depends on is a switch in the server software (currently set to '1') that tells how many levels are followed. There was a proposal to precisely the same effect a few threads above: #Proposal to increase $wgMaxRedirects (which was followed at Wikipedia:Village pump (proposals)/Archive 112#Allow some double redirects). Ultimately, this is probably a better solution, as it would reduce the redirect maintenance needed when restructuring topics (e.g. splitting or moving articles, creating dab pages, etc.) and it would make for more sensible rcatting (think of the {{R from misspelling}} of an {{R from related word}}, etc.). – Uanfala 22:52, 10 October 2017 (UTC)Reply

One related solution: bots should add {{R avoided double redirect}} tags when fixing a double redirect (AvicBot does not do this, I'm not sure if other bots do). That way the information about the former target is preserved in the Wikitext (not just the edit summary), and better yet that template catches the A-B-C situation you describe: if D redirected to A, and was tagged as {{R avoided double redirect|A}} when the bot updated D to A's new bad target, then once A is reverted to being an article the template would put D into the maintenance category Category:Avoided double redirects to be updated. (I'm not sure if the template also catches the P-Q situation.) 59.149.124.29 (talk) 02:17, 12 October 2017 (UTC)Reply
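The tagging step could be sketched like this (the function is illustrative; {{R avoided double redirect}} is a real template, but the exact wikitext a bot should emit may differ):

```python
def retarget_with_tag(new_target, old_target):
    """Build the wikitext for redirect D after a bot repoints it from
    old_target to new_target, recording the former target in an rcat
    so the A-B-C reversion case stays traceable."""
    return (f"#REDIRECT [[{new_target}]]\n"
            f"{{{{R avoided double redirect|{old_target}}}}}")
```

For the scenario above, retarget_with_tag("C", "A") yields a redirect to C that still records A as the avoided target, so the template can flag it for review once A is an article again.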
Seconded. I recently came across a case where a page was redirected without discussion, and reverted.[13] However, the bot had already changed the redirect, and this wasn't caught by the reverting editor, leaving the redirect pointing to the wrong target for almost a year.[14] Tagging such redirected pages with {{R avoided double redirect}} would help address the issue. --Paul_012 (talk) 15:25, 16 November 2017 (UTC)Reply
Proposal filed at WP:PROPS § Bots fixing double redirects should tag them with rcat. Petr Matas 06:47, 1 August 2022 (UTC)Reply

Within a couple (of) days of creation edit

Should the sentence about double redirects and bots read:

  1. "Double redirects are easily and automatically fixed by bots, and most are fixed within a couple days of creation."; or
  2. "Double redirects are easily and automatically fixed by bots, and most are fixed within a couple of days of creation." (bolding for emphasis here only)

Bkonrad has reverted an ip user, myself and Redrose64 in favour of option 1, citing the version with two "of"s as "bad grammar" - directly contradictory to what I've always been taught and my experience as a native English speaker. Before this gets into edit warring territory, please could we have additional input. Thryduulf (talk) 12:43, 2 October 2018 (UTC)Reply

Clearly (2). "couple days" sounds like an event for married people only! If you want to avoid the "of ... of", rephrase it as: "most are fixed within two days of creation". Martin of Sheffield (talk) 14:23, 2 October 2018 (UTC)Reply
Agree. "Couple days" is valid only for informal spoken English. The proper form in more formal written English is "couple of days". But there's a different problem: "couple" gives an ambiguous count. Some people think it should mean exactly two, others that it is more broad in meaning. I think "a few days" would probably be a better choice in this case. (And not "a few of days"!) —David Eppstein (talk) 17:18, 2 October 2018 (UTC)Reply
That must be a dialect difference then. I've only ever heard "a couple of days"; although the "of" is sometimes reduced to just a schwa in informal speech, it is always there. Thryduulf (talk) 18:37, 2 October 2018 (UTC)Reply
Personally I think reduction to a schwa (or even a couple schwa) isn't much of a reduction. EEng 21:34, 2 October 2018 (UTC)Reply
It's certainly not a reduction that is usually (ever?) reflected in writing (other than eye-dialect and phonemic transcription). Thryduulf (talk) 22:23, 2 October 2018 (UTC)Reply
Well, that's one joke that certainly fell flat. EEng 00:03, 3 October 2018 (UTC)Reply
One might more easily say "most are fixed within days of creation". -- GreenC 18:09, 2 October 2018 (UTC)Reply
I've always taken "a couple of days" in this context to mean "approximately two"; "a few days" would mean to me "between about 2 and 5 days". If we're rephrasing it (to which I have no objection) then I'd prefer "within days of creation". Thryduulf (talk) 18:37, 2 October 2018 (UTC)Reply
If the word "couple" is being used then the phrase needs to be "a couple of days". If having two "of"s is problematic then an overall re-wording could be in order. Also, "couple" refers to specifically two, especially in terms of formal writing. It sounds to me like this is not actually guaranteed, and that "few" is probably more appropriate. - adamstom97 (talk) 20:05, 2 October 2018 (UTC)Reply
Brought here by Eeng's ringing endorsement. It needs to be 'a couple of'. If repetition is a problem (I don't think it is), reword to say 'a few'. Usage example. GirthSummit (blether) 20:42, 2 October 2018 (UTC)Reply
Must be something of a regional thing. "Within a couple of days of" sounds rather peculiar to my ear. There are many examples in the wild of both formations, (1) (2) so I don't think either can be said to be incorrect grammar (or "nonsense" as Thryduulf claims). And for the record, Thryduulf quite incorrectly states that I said (2) was "bad grammar".
Also, the language has been part of the guideline since August 2012, so if the grammar was so bad, it certainly was overlooked by many. I support the proposal to rephrase as "a few". older ≠ wiser 01:17, 3 October 2018 (UTC)Reply
Much bad phrasing in the English WP does go overlooked by many, sometimes for many years - there's a lot here to read. I can't imagine "A couple days" ever being used in Brit Eng, either spoken or written. I suspect that it began in North America - compare e.g. informal "A hundred twenty" where Brit Eng, and surely all formal English, would always be "A hundred and twenty". There is no reasonable objection to two "of"s separated by a word as a wording; such semi-repeats of other words occur in English from time to time, with no objections. With regard to the remedy, there is (as ever) no point in being unnecessarily imprecise in meaning. Either "two" or "a few", depending on the intended meaning, seems appropriate to me. Harfarhs (talk) 07:47, 3 October 2018 (UTC)Reply
@Bkonrad: Fair enough, you didn't use those exact words, but it is, I feel, a not-unfair summary of your various stated objections. As for "within a couple days" being "nonsense": to a native (British) English speaker, it is. While I see there are examples, they're much rarer and seem to be almost entirely in informal contexts (raw Google numbers: "within a couple days" 2.8 million hits, "within a couple of days" 15.7 million hits). Thryduulf (talk) 10:09, 3 October 2018 (UTC)Reply
"Within a couple of days of" sounds like pedantic fuddyduddyism to my ear. When I said it "seems like worse English" in response to Redrose's calling the other "bad English", that doesn't say anything about grammar -- it was about whether the "of" is necessary. older ≠ wiser 10:35, 3 October 2018 (UTC)Reply
In British English, and all forms of formal English, the "of" is necessary and anything else is ungrammatical. Thryduulf (talk) 10:46, 3 October 2018 (UTC)Reply
Seems not everyone in Britain agrees with that prescriptivist edict. older ≠ wiser 11:17, 3 October 2018 (UTC)Reply
OK, I'll spell it out: it was bad grammar, but bad English is quicker to type. This is needless nitpicking that seems to be a poor attempt to justify sloppy writing. It's not been pointed out so far, but Bkonrad's edits were tantamount to WP:EW. --Redrose64 🌹 (talk) 18:53, 3 October 2018 (UTC)Reply
You're entitled to have prescriptivist predilections. Fact is, the usage is commonplace in both written and spoken English. older ≠ wiser 19:02, 3 October 2018 (UTC)Reply

Assuming the goodest of faith in all parties concerned, the way it is in the present moment, "within a few days", is more than acceptable. Keep this up and all of you will receive a docking of your pay!  Paine Ellsworth  put'r there  18:59, 4 October 2018 (UTC)Reply

This is getting far too silly. The Colonel (talk) 15:16, 5 October 2018 (UTC)

The following discussion is an example of how The Department of Artificially Induced Reactions Stimulated by Substantial Amounts of Downright Hilarity Resulting in an Excess Amount of Endorphins can be the BADDEST!!! Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.

Goodest of faith edit

Let's switch the debate to whether goodest of faith is acceptable. EEng 21:48, 4 October 2018 (UTC)Reply

Come on, you all. Don't we have gooder things to do? – Uanfala (talk) 22:11, 4 October 2018 (UTC)Reply
Did grapple a bit betwyxt "goodest" and "bestest"; however, the "G" in "AGF" won the day!  Paine Ellsworth  put'r there  01:31, 5 October 2018 (UTC)Reply
It's not "goodest", but "doubleplusgood". --Redrose64 🌹 (talk) 09:23, 5 October 2018 (UTC)Reply
I prefer Time to Newsweek. EEng 11:57, 5 October 2018 (UTC)Reply
Yes, or well, in my humble opinion, 2 + 2 = EEng's user page.  Paine Ellsworth  put'r there  15:10, 5 October 2018 (UTC)Reply

grapple a bit betwyxt edit

Now let's switch the debate to whether grapple a bit betwyxt is acceptable. EEng 02:00, 5 October 2018 (UTC)Reply

"Why's everybody always pickin' on me?" – Charlie Brown  Paine Ellsworth  put'r there  14:41, 5 October 2018 (UTC)Reply
The discussion above is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.

Fun Police edit

Looks like de Fun Police has arrived!  Paine Ellsworth  put'r there  15:54, 5 October 2018 (UTC)Reply

Protected edit request on 9 July 2020 edit

Please add Wikipedia:Example of a double redirect to Category:Example. WT79 (speak to me | editing patterns | what I been doing) 15:41, 9 July 2020 (UTC)Reply

Not done This page is an informational page about double redirects, not an example of a double redirect. Thryduulf (talk) 15:51, 9 July 2020 (UTC)Reply

Some double redirects edit

There are some double redirects that can't be identified by the software, when there is non-obvious content after the redirect. Thingofme (talk) 14:44, 2 March 2022 (UTC)Reply
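For context, MediaWiki decides whether a page is a redirect from its leading #REDIRECT line; a rough approximation of that check (the regex simplifies the real parser, which also honours localised keywords):

```python
import re

# Simplified: matches "#REDIRECT [[Target]]" at the start of the page.
# Content after the first line does not change the page's redirect status.
REDIRECT_RE = re.compile(r"^\s*#REDIRECT\s*\[\[([^\]|#]+)", re.IGNORECASE)

def redirect_target(wikitext):
    """Return the target page name, or None if the page is not a redirect."""
    m = REDIRECT_RE.match(wikitext)
    return m.group(1).strip() if m else None
```

Under this model, extra wikitext after the #REDIRECT line (rcats, comments) is invisible to the redirect check itself, though tools that scan whole pages naively might still mishandle it.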

@Thingofme: For example? --Redrose64 🌹 (talk) 21:51, 2 March 2022 (UTC)Reply
This link. Thingofme (talk) 00:51, 3 March 2022 (UTC)Reply
It's no longer a double redirect. Are there any existing examples? --Redrose64 🌹 (talk) 08:31, 3 March 2022 (UTC)Reply

Move discussion in progress edit

There is a move discussion in progress on Wikipedia talk:RR (disambiguation) which affects this page. Please participate on that page and not in this talk page section. Thank you. BilledMammal (talk) 05:50, 29 May 2022 (UTC)Reply

@BilledMammal: Why does it affect this page? --Redrose64 🌹 (talk) 19:54, 29 May 2022 (UTC)Reply
The disambiguation page includes Wikipedia:Double redirect. I think someone believed that RR was a reasonable possible link to this page. BilledMammal (talk) 23:05, 29 May 2022 (UTC)Reply

Template:R avoided double redirect edit

{{R avoided double redirect}} allows pre-emptive linking of redirect X to redirect Y, where both X and Y point to the same target. The benefit is that if Y is refactored into an article, then a bot will update X to point to Y. Useful where Y is (or ought to be) a {{R with possibilities}}, enough I think to be worth mentioning on this page. But any explanation would need to be worded carefully, to make sense to intermediate-level editors and so that beginners could ignore it rather than be frightened by it. To begin with, maybe just add a mention in the See-also section? jnestorius(talk) 23:23, 9 May 2023 (UTC)Reply

Bots edit

I created 41 double redirects via a page move. This article says that there are bots which correct double redirects. That's cool and all but it says nothing about how they work. I know it says "automatically" but is that "automatic" after I've made a request somewhere? Please advise. Dennis C. Abrams (talk) 14:07, 14 March 2024 (UTC)Reply

@Denniscabrams all you need to do is wait and they will be fixed. AIUI they read the move log and check whether any of them resulted in a double redirect, and fix any they find. It can sometimes take a few hours though (it seems to depend on when the operators are awake), so if only one or two pages, or very high-traffic pages, are impacted it is worth fixing them manually, but otherwise it's not worth your time. Thryduulf (talk) 18:58, 14 March 2024 (UTC)Reply
Thanks. And they've already been fixed! Dennis C. Abrams (talk) 19:08, 14 March 2024 (UTC)Reply