Hi, thanks for the talk yesterday on ORES, and I hope you didn't mind my questions/comments. :-)
I've been working on adding interwiki links from new articles on some Wikipedias (and also Commons!) to their Wikidata items, but I'm wondering if there are better ways of doing it (currently I just auto-search for matches and manually say yes/no to add them, within a Python script). I've just proposed a potential Outreachy project to improve the current code I'm using, see https://phabricator.wikimedia.org/T290718 . As part of that, I'm wondering if machine learning might be applicable here - it feels like there's a great training set in all of the articles that already have sitelinks, which could be used to assess how good potential matches are; maybe the highest-confidence matches could then be added automatically, so only the lower-confidence ones would need manual checking. I know of machine learning, but not how to actually do it!
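To make the idea concrete, here's a rough sketch of the triage step I have in mind. Everything here is made up for illustration - the function names, the string-similarity scoring, and the thresholds are all mine, and a real version would presumably learn the scoring from the existing sitelinks rather than use a hand-written heuristic:

```python
from difflib import SequenceMatcher

def match_confidence(article_title, item_label, item_aliases=()):
    """Score how plausible it is that an article matches a Wikidata item (0..1).

    This placeholder just takes the best string similarity between the
    article title and the item's label/aliases; a trained model would
    combine more features (descriptions, links, categories, etc.).
    """
    candidates = [item_label, *item_aliases]
    return max(
        SequenceMatcher(None, article_title.lower(), c.lower()).ratio()
        for c in candidates
    )

def triage(article_title, candidate_items, auto_threshold=0.95, review_threshold=0.6):
    """Split candidate items into auto-add and manual-review buckets.

    `candidate_items` is a list of (item_id, label, aliases) tuples.
    The thresholds are illustrative and would need tuning against
    held-out sitelinks; anything below review_threshold is discarded.
    """
    auto, review = [], []
    for item_id, label, aliases in candidate_items:
        score = match_confidence(article_title, label, aliases)
        if score >= auto_threshold:
            auto.append((item_id, score))
        elif score >= review_threshold:
            review.append((item_id, score))
    return auto, review
```

So an exact-title match would go straight into the auto-add bucket, while fuzzier matches would still get the manual yes/no treatment - the interesting ML part would be replacing `match_confidence` with something trained on the articles that already have sitelinks.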
If you think this might be possible, would you be interested in co-mentoring the Outreachy project, so we could make it a bit more ML-focused?