User talk:Ladsgroup
Previous discussion was archived at User talk:Ladsgroup/Archive 8 on 2015-09-02.
Question about deletion criteria
7 comments • 5 days ago
Bovlb
Hi! I recently discovered Wikidata:Requests for permissions/Bot/Dexbot 13 and I had a question: how does it handle the case where a user deletes the claims and sitelinks from an item, but the former sitelink targets still exist on the client project?
10 days ago
Ladsgroup
Hi, no, it doesn't delete unless the sitelink was removed because of a deletion. Have you seen the bot doing otherwise?
8 days ago
Bovlb
No. I just learnt about this bot as part of processing a request for undeletion, and I couldn't see any test for that specific condition in the code, just:
    if item.sitelinks:
        continue
8 days ago
Ladsgroup
That code is extremely old. I just double-checked, and the new one checks for the existence of the page; if it exists, it skips deletion.
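For illustration, a minimal sketch of what such an existence check could look like with pywikibot; the names here are hypothetical, not Dexbot's actual code:

    # Hypothetical sketch, not the actual Dexbot code: before deleting an item
    # whose last sitelink was removed, verify that the former target page really
    # is gone on the client wiki.
    import pywikibot

    def former_page_is_gone(dbname: str, title: str) -> bool:
        """Return True only if the page behind a removed sitelink no longer exists."""
        client = pywikibot.site.APISite.fromDBName(dbname)  # e.g. 'fawiki'
        return not pywikibot.Page(client, title).exists()

Anything still existing on the client wiki would then be skipped rather than deleted.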
7 days ago
Bovlb
OK, thanks. Good to know.
I don't know how to find the current code for your bot, so I was relying on the link in the RFP.
7 days ago
Ladsgroup
You are absolutely right. I just updated the gist.
5 days ago
Bovlb
Thanks. One case that especially concerns me is where an editor finds a duplicate item and, instead of merging, blanks it, transferring the sitelinks to another item. I'm happy to know that such items aren't quietly deleted by a bot, but instead have some chance of being found and fixed.
5 days ago
Correct references through bot
34 comments • 23 days ago
Epìdosis
Hi! I remember you run a very efficient bot, and in the past I asked you for some fixes which worked very well. Nowadays I mostly do fixes through QuickStatements, which is a very good tool, but it still isn't able to fix references while leaving the statements unchanged. I sometimes notice big groups of items (thousands, even tens of thousands) having references which are imprecise or wrong, and I don't know who to ask for corrections. Could I slowly report some notable cases of references to be fixed to you, so that we can deal with them through your bot? I think having exactly correct references is crucial for our data quality, while at the moment that often isn't the case. Thank you very much in advance!
1 year ago
Ladsgroup
Hey, sure. I'll try to write something, but I want to know the exact framework so I don't need to write similar code every time; I'd rather write something general and reuse it each time.
Can you give me a couple of examples?
1 year ago
Epìdosis
OK, great! So, here is a detailed overview of the situation. I see three main types of errors to be corrected:
For any question, just ask me! When you have the bot ready, please start with some test edits, so that I can have a look. Thank you very much in advance!
Edited 1 year ago
Ladsgroup
Thanks. I'll try to tackle it next weekend. This weekend I'm drowning in something personal.
1 year ago
Epìdosis
Hi! Any updates? Obviously no urgency, as I said; just a little reminder so that I don't forget the issue myself :)
1 year ago
Ladsgroup
Hey, sorry. I have been doing a million things and drowning in work, but I will get to it ASAP. I took some vacation for volunteer work :)
1 year ago
Ladsgroup
But it's on my radar, always has been. Don't worry.
1 year ago
Ladsgroup
Again: I have not forgotten about this. One day I will get it done. It's just that there are so many things to do :(
Edited 1 year ago
Ladsgroup
Okay, one part is done: the bot now takes a SPARQL query and removes references that are exact duplicates; here's an example. I will write more in the coming weekends.
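A rough sketch of that shape of bot, assuming pywikibot; the SPARQL query and edit summary below are placeholders, not Dexbot's real ones:

    # Hypothetical sketch: feed a SPARQL query into pywikibot and drop reference
    # groups whose snaks exactly duplicate an earlier reference on the same
    # statement. QUERY and the edit summary are placeholders.
    import json
    import pywikibot
    from pywikibot import pagegenerators

    QUERY = 'SELECT ?item WHERE { ... }'  # placeholder for the real selection query
    site = pywikibot.Site('wikidata', 'wikidata')

    def dedupe_references(item):
        item.get()
        claims = item.toJSON().get('claims', {})
        changed = False
        for statements in claims.values():
            for statement in statements:
                seen, kept = set(), []
                for ref in statement.get('references', []):
                    key = json.dumps(ref['snaks'], sort_keys=True)
                    if key in seen:
                        changed = True  # exact duplicate: drop it
                    else:
                        seen.add(key)
                        kept.append(ref)
                statement['references'] = kept
        if changed:
            item.editEntity({'claims': claims}, summary='remove duplicate references')

    for item in pagegenerators.WikidataSPARQLPageGenerator(QUERY, site=site):
        dedupe_references(item)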
1 year ago
Epìdosis
Very good, thanks!
1 year ago
Ladsgroup
And the second type is done. Let me know if we want to clean up more. The first type is very similar to the second one, so consider that done as well. Let's do this then.
1 year ago
Epìdosis
Very good fixes for HDS ID (P902), great work! Could you also link me examples for InPhO ID (P863) and Spanish Biographical Dictionary ID (P4459)? After these two, the second type is surely OK.
1 year ago
Ladsgroup
I'm doing them one by one because there are so many of them; for example, P902 took a day to finish. P863 is underway.
1 year ago
Epìdosis
P863 looks OK!
1 year ago
Epìdosis
A little case related to the third type: Benezit ID (P2843), which had been inserted as a reference in two different ways, the older one with reference URL (P854) and the more recent one with Benezit ID (P2843).
11 months ago
Ladsgroup
Right now I'm cleaning up the third part of type two, but I will get to the others soon.
11 months ago
Epìdosis
Very good P4459!
11 months ago
Ladsgroup
Done now. Gosh, it took days :))) Let me fix type one now.
11 months ago
Ladsgroup
Can you give me a SPARQL query for the first type? I'm not good at queries involving refs :(
11 months ago
Epìdosis
Use https://w.wiki/iNn; it contains both the date of birth (P569) and the date of death (P570) cases.
11 months ago
Ladsgroup
Started
11 months ago
Epìdosis
Very good. While waiting for part 3, which is obviously the most difficult, I have another task: all uses of described by source (P1343) in references (thousands of them) should be replaced with stated in (P248), in order to avoid scope-constraint violations.
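One way such a substitution could be sketched, operating on the item JSON with pywikibot (illustrative only; since both P1343 and P248 take item values, the targets can carry over unchanged):

    # Hypothetical sketch: move reference snaks from described by source (P1343)
    # to stated in (P248), leaving the referenced values untouched.
    import pywikibot

    OLD, NEW = 'P1343', 'P248'

    def fix_reference_property(item):
        item.get()
        claims = item.toJSON().get('claims', {})
        changed = False
        for statements in claims.values():
            for statement in statements:
                for ref in statement.get('references', []):
                    if OLD not in ref['snaks']:
                        continue
                    for snak in ref['snaks'][OLD]:
                        snak['property'] = NEW  # each snak names its own property
                    ref['snaks'].setdefault(NEW, []).extend(ref['snaks'].pop(OLD))
                    ref['snaks-order'] = list(dict.fromkeys(
                        NEW if p == OLD else p for p in ref.get('snaks-order', [])))
                    ref.pop('hash', None)  # old hash no longer matches the edited snaks
                    changed = True
        if changed:
            item.editEntity({'claims': claims},
                            summary='use stated in (P248) instead of P1343 in references')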
10 months ago
Ladsgroup
Fixing.
The third type is not that hard. I thought it was done. Let me double-check and clean up the mess.
10 months ago
Ladsgroup
I re-read what you wrote for the third type a couple of times, and now I get what you want, but it's pretty complex. I'll try to see what I can do about it next weekend.
10 months ago
Epìdosis
Hi! When you have time, could you have a look at these three?
They are probably less difficult than point 3 above, which I understand is quite difficult. See you soon!
Edited 8 months ago
Ladsgroup
Hey, sure. Just give me a week or two.
Edited 8 months ago
Ladsgroup
I wrote something that can clean up duplicates and subsets (e.g. when a reference is fully covered by another reference, and similar cases). I already started the bot and it's cleaning. I'll continue, but I don't think I can clean up more than that, as it gets really complicated.
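The subset rule described above could look roughly like this as a sketch over raw reference JSON (assumed structure, not the bot's real code):

    # Hypothetical sketch of the subset rule: a reference is redundant when all
    # of its snaks also appear in another, strictly larger reference on the
    # same statement.
    import json

    def snak_set(ref):
        """Flatten a reference group into comparable (property, value) pairs."""
        return frozenset(
            (pid, json.dumps(snak.get('datavalue'), sort_keys=True))
            for pid, snaks in ref['snaks'].items()
            for snak in snaks)

    def is_redundant(ref, references):
        s = snak_set(ref)
        return any(s < snak_set(other)  # strict subset of a bigger reference
                   for other in references if other is not ref)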
7 months ago
Epìdosis
Perfect! When it finishes, could you schedule it as periodic maintenance (e.g. once a month)? That would help keep the quality stable.
7 months ago
Ladsgroup
It works based on SPARQL queries. Which queries do you want me to run regularly?
7 months ago
Epìdosis
Maybe after the cleanup Dexbot is doing now it won't be necessary anymore; I think these redundant references were inserted due to an error by Reinheitsgebot, so maybe that error has been fixed and the cases won't come up again. However, I may give you other queries (of the third type) in the future if I find similar problems with different properties.
7 months ago
Epìdosis
Just two more tasks when you have time: Wikidata:Bot requests#Accademia delle Scienze di Torino multiple references and Wikidata:Bot requests#Fix values of P248 in references (2021-06-13). Thanks!
4 months ago
Epìdosis
When you have time, could you have a look at Wikidata:Bot requests#Accademia delle Scienze di Torino multiple references? Thanks as always!
1 month ago
Ladsgroup
Hi, do you mean the Czech part? I just fixed it and am running it again. Everything else has been done for a really long time now.
23 days ago
Epìdosis
No, I mean Wikidata:Bot_requests/Archive/2020/12#Accademia_delle_Scienze_di_Torino_multiple_references (I don't know why it was archived!); it would be very useful.
23 days ago
Machine learning
3 comments • 1 month ago
Mike Peel
Hi, thanks for the talk yesterday on ORES, and I hope you didn't mind my questions/comments. :-)
I've been working on adding interwikis from new articles on some Wikipedias (and also Commons!) to Wikidata items, but I'm wondering if there are better ways of doing it (currently I just auto-search for matches and manually say yes/no to add them, within a Python script). I've just proposed a potential Outreachy project to improve the current code I'm using, see https://phabricator.wikimedia.org/T290718 . As part of that, I'm wondering if machine learning might be applicable here: it feels like there's a great training set in all of the other articles that already have sitelinks, which could be used to assess how good potential matches are; the highest-confidence matches could then be added automatically, so only lower-confidence ones need manual checking. I know of machine learning, but not how to actually do it!
If you think this might be possible, would you be interested in being a co-mentor for the Outreachy project, and we can make it a bit more ML-focused?
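The triage described here (auto-add high-confidence matches, hand-check the rest) comes down to two thresholds; a toy sketch, with the scoring function standing in for the hypothetical ML model:

    # Toy sketch of the triage; score_match() stands in for whatever model would
    # be trained on articles that already have sitelinks. Names and thresholds
    # are illustrative.
    AUTO_ACCEPT = 0.95  # add the sitelink without asking
    AUTO_REJECT = 0.30  # discard silently

    def triage(candidates, score_match):
        to_add, to_review = [], []
        for article, item in candidates:
            score = score_match(article, item)
            if score >= AUTO_ACCEPT:
                to_add.append((article, item))
            elif score > AUTO_REJECT:
                to_review.append((article, item, score))  # manual yes/no, as today
        return to_add, to_review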
1 month ago
Ladsgroup
Thanks. It is a great idea, and I've added it to my list of work that could benefit from AI. There are several ways to attack the problem, but building a machine learning system would definitely help. I suggest not adding it to this Outreachy project; instead, part of that Outreachy work could be to make the code pluggable, so that later a service/API can be built and your code can easily use its recommendations. How does that sound? cc @Lydia Pintscher (WMDE)
1 month ago
Mike Peel
Thanks for the reply. I can't see a clear way to make the code 'pluggable'; I think that if we're going to use ML, it has to be built in, or perhaps there has to be a clear way to query it for a yes/no answer, which could maybe be added as a proceed/stop check.
1 month ago
Two questions about archiving
One comment • 2 months ago
Aram
Hi Ladsgroup, I have two questions about archiving sections using ckb:بەکارھێنەر:Dexbot/Archivebot. First, as far as I understand, the bot archives level-two sections only (by default). Is there a way to set the section level ourselves? For example, I want it to archive level-four sections on this page. Second, isn't this template supposed to work with the bot to delay the archive time? Thank you in advance!
Edited 2 months ago
Deletion on fa.wikipedia ineffective on Wikidata
2 comments • 6 months ago
Epìdosis
Hi! I've just noticed in the page history of Theophilos of Athens (Q12874082) that the article on fa.wikipedia was deleted in 2018 but remained present on Wikidata. Is this a known problem, or has it already been solved? Could a bot clean up any similar cases? Thank you as always!
6 months ago
Ladsgroup
Hey, stuff like that can happen for various reasons: network partition, lag in the database, bugs in the code, permission issues (the IP being blocked on Wikidata, maybe?), etc. If it happens all the time, then we should look into it, but there will always be cases that fail. Maybe we should have a way to spot and clean them up.
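A sketch of the kind of spot-and-clean pass mentioned here, assuming pywikibot (illustrative names; not an existing script):

    # Hypothetical sketch: report sitelinks whose target pages no longer exist
    # on the client wiki, so failed deletion syncs can be cleaned up later.
    import pywikibot

    def find_stale_sitelinks(item):
        item.get()
        for dbname, sitelink in item.sitelinks.items():  # e.g. 'fawiki' -> SiteLink
            client = pywikibot.site.APISite.fromDBName(dbname)
            if not pywikibot.Page(client, sitelink.title).exists():
                yield dbname, sitelink.title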
6 months ago
Sockpuppets of the sanctioned user (Mr,p_balçi), using both accounts and IPs
2 comments • 6 months ago
Yoelimo
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/Mr,p_balçi
Hi, the mentioned user, who was banned from Wikipedia, has once again attempted to sabotage articles and play sockpuppet games with multiple IPs and accounts.
Some of their sockpuppets were blocked in the same month, and many others are not. They have bypassed the block more than ten times. I request that their main account be blocked indefinitely and globally.
My guess is that if an inspection is done, a lot more will be discovered.
In several cases, accounts have been blocked for obscenity.
Blocked:
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/Abolfazlyashar
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/قلج_ارسلان‎
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/بهنام_قاراداغلی‎
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/Ешшак_тат
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/Nima.game1
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/Wiki.pedya17
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/Wiki.toran
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/Pedi_wiki7
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/Yašasin_turan
Unblocked:
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/Abix018
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/Aztap
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/Öç_pöçmaq
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/Arslan_urmulu
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/احمد_جلیل_دوگونچی‎
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/Aydin_turk
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/Afshar70
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/Original_Balish
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/5.115.222.138‎
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/5.208.106.76‎
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/37.120.244.122‎
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/Leytili
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/95.162.139.215‎
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/37.202.241.97‎
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/89.196.178.165‎
https://fa.wikipedia.org/wiki/ویژه:مشارکت‌ها/5.125.231.13‎
Several of their accounts have been locked globally (due to obscenity):
https://meta.wikimedia.org/wiki/Special:CentralAuth/Luckie_Luke_kharkosde_kuni
https://meta.wikimedia.org/wiki/Special:CentralAuth/Luckie_Luke_kharkosde
https://meta.wikimedia.org/wiki/Special:CentralAuth/Huji_kharkosde
https://meta.wikimedia.org/wiki/Special:CentralAuth/Huji_kuni_binamus
Please handle this. Thanks.
6 months ago
Ladsgroup
Hello, please report these cases at fawiki's وپ:دبک. They don't belong here.
6 months ago
user:Quakewoody
2 comments • 6 months ago
5.125.91.181
Hello. @ZEP55 is a sockpuppet of user:Quakewoody, whom you blocked. Please close the sockpuppet account.
6 months ago
ZEP55
Hello, what? I joined recently and I haven't edited here.
6 months ago
Help for دبک‎
One comment • 7 months ago
Asw!1!1
Long-term troll
Greetings,
Does this user have permission to do this?
https://fa.wikipedia.org/w/index.php?title=%DA%A9%D8%A7%D8%B1%D8%A8%D8%B1%3AUltramolt&type=revision&diff=31295731&oldid=29737395
while this sockpuppet itself has an open checkuser case:
https://fa.wikipedia.org/wiki/%D9%88%DB%8C%DA%A9%DB%8C%E2%80%8C%D9%BE%D8%AF%DB%8C%D8%A7:%D8%AF%D8%B1%D8%AE%D9%88%D8%A7%D8%B3%D8%AA_%D8%A8%D8%A7%D8%B2%D8%B1%D8%B3%DB%8C_%DA%A9%D8%A7%D8%B1%D8%A8%D8%B1/Mr,p_bal%C3%A7i
and at least in the following cases it has been confirmed that these were their sockpuppets:
https://fa.wikipedia.org/wiki/%D9%88%DB%8C%DA%98%D9%87:%D9%85%D8%B4%D8%A7%D8%B1%DA%A9%D8%AA%E2%80%8C%D9%87%D8%A7/Elmm_99
https://fa.wikipedia.org/wiki/%D9%88%DB%8C%DA%98%D9%87:%D9%85%D8%B4%D8%A7%D8%B1%DA%A9%D8%AA%E2%80%8C%D9%87%D8%A7/Turboratur
And why, after three months, has their vandalism not been dealt with? Just today several new sockpuppets of theirs were blocked indefinitely.
https://fa.wikipedia.org/wiki/%D9%88%DB%8C%DA%98%D9%87:%D9%85%D8%B4%D8%A7%D8%B1%DA%A9%D8%AA%E2%80%8C%D9%87%D8%A7/Abjadhavaz
This is one example.
Is there no will to put an end to their disruption?
Honestly, apart from you I have not seen a user so hardworking and trustworthy; the gentlemen do not care about users' nerves.
This person has a large number of globally and locally blocked troll accounts. They repeatedly insult users from the north, Kurds, Armenians, and others, cause disruption using IPs, and insult users on smaller wikis.
Would it be possible for you to take up a checkuser investigation of them? Thanks.
7 months ago
Solomon Hill (Ilam Province) (Q15975213)
7 comments • 7 months ago
Multichill
Hi Amir, can you have a look at Solomon Hill (Ilam Province) (Q15975213)? It seems to be completely wrong. I noticed it while working on heritage photos on Commons.
7 months ago
Ladsgroup
Thanks. It looks complicated. It's Iran's first registered national heritage site, and it used to be within Iran's borders (or parts of it still are? I'll check), but now it's mostly in Iraq. I'll ask people who know this better than I do.
7 months ago
Ladsgroup
People say its country should be Iraq, but technically it's still an Iranian national heritage site that ended up in Iraq after border changes in the mid-20th century.
7 months ago
Multichill
The ID seems to be invalid? It should be at least two digits according to the constraint on Iranian National Heritage registration number (P1369). If it used to be in Iran, you should add the Iran info too, qualify both statements with start/end times, and make the current one preferred.
7 months ago
Ladsgroup
Done. I think the constraint regex is wrong; I changed it to [0-9] so it accepts 01 too (maybe it should accept single digits instead? I don't know).
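For illustration only, since the actual constraint patterns aren't quoted in the thread: assuming the old pattern required a leading non-zero digit, the change keeps the two-digit minimum while letting zero-padded IDs through:

    # Illustration with assumed patterns; the real constraint isn't quoted here.
    import re

    old_pattern = r'[1-9]\d+'  # hypothetical old form: rejects '01'
    new_pattern = r'[0-9]\d+'  # first class relaxed: accepts '01', still 2+ digits

    for value in ('1', '01', '42'):
        print(value,
              bool(re.fullmatch(old_pattern, value)),
              bool(re.fullmatch(new_pattern, value)))
    # 1  False False  (a single digit fails both)
    # 01 False True
    # 42 True  True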
7 months ago
Multichill
I think you messed up the ranks. Was your intention to state that this item was never in Iran? I currently ignore the one-digit entries on Commons completely, because they will mostly be mistakes.
7 months ago
Ladsgroup
Fixed the rank.
7 months ago
Quakewoody
3 comments • 7 months ago
Lymantria
Can you explain to me why you gave them an indef block? I seem to have missed a recent discussion that might be considered harassment/intimidating behaviour. But I am probably missing something?
7 months ago
Ladsgroup
Hey, the one that triggered it was Wikidata:Requests_for_deletions/Archive/2021/02/20#Q105443300 (the anti-LGBT behavior), but there were two other reasons: the user is indef-blocked on five other wikis as well for harassment, and the user has a history of being blocked for vandalism/edit warring here. If you feel it's too much, feel free to reduce it to some other duration.
Edited 7 months ago
Lymantria
Okay, that is malicious indeed. Indef is long, but let them start an unblocking procedure themselves if they want to proceed here.
7 months ago