Flickr has always been a global community of photographers, and we remain committed to supporting artistic expression through photography in all its forms, including explicit photos. To serve many different audiences around the world with different standards for explicit content, we worked with our community from the beginning to develop moderation guidelines. We rely on those guidelines to make sure people only see the type of content they want to see.
In that spirit, we’re introducing new tools and technologies that will help to ensure the content community members see on Flickr meets their expectations.
Because so many photos are uploaded to Flickr every day, we are introducing the Flickr Moderation Bot. Moderation Bot detects explicit content in new uploads and automatically updates mis-moderated content to the correct moderation level according to our established policies. If the system ever detects mis-moderated content in your account, you will always receive a private notification under the bell icon that explains the mismatch and directs you to the photo in question.
We’ll begin by ramping up auto-moderation on new uploads and monitoring a number of factors, including possible false positives. As with any large-scale machine learning system, adjustments will be needed as we dial in the technology. Moderating uploaded content will always be the Flickr member’s responsibility, and you must not rely on Moderation Bot to do the job for you. Still, we hope this tool will help prevent accidental mis-moderation and make the Flickr experience better for everyone. Eventually, we also plan to backfill auto-moderation for a number of existing Flickr photos, such as when members update cover photos and avatars.
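For readers curious what an auto-moderation pass like this might look like, here is a minimal sketch. Everything in it is a simplifying assumption: the function names (`classify_safety`, `auto_moderate`), the three-level safety scale, and the rule that the bot only escalates content to a stricter level are all illustrative, not Flickr's actual implementation.

```python
# Hypothetical sketch of an auto-moderation pass; names and levels are
# illustrative assumptions, not Flickr's real API or policy.

SAFETY_LEVELS = ["safe", "moderate", "restricted"]  # least to most explicit

def classify_safety(photo):
    """Stand-in for an ML classifier; a real system would run an
    image model here and return one of SAFETY_LEVELS."""
    return photo.get("predicted_level", "safe")

def auto_moderate(photo):
    """Compare the member-set level with the predicted level. If the
    prediction is stricter, correct the level and return a notification
    payload; otherwise leave the photo alone and return None."""
    predicted = classify_safety(photo)
    current = photo.get("member_level", "safe")
    if SAFETY_LEVELS.index(predicted) > SAFETY_LEVELS.index(current):
        photo["member_level"] = predicted
        return {"photo_id": photo["id"], "new_level": predicted}
    return None
```

In this sketch the bot never loosens a member's own setting, which mirrors the post's point that moderation remains the member's responsibility and the bot only catches mismatches.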
Additionally, in the new year, we’ll overhaul our Report Abuse flows to bring greater flexibility and specificity to reporting. Instead of a few broad categories, we’ll introduce a tiered system that guides you to be as precise as possible with your report. We’ll expand several of the highest-priority reporting categories so that our community can continue to help us eliminate spam, improperly moderated content, and illegal content from our site. Many community members are dedicated to finding and reporting content that violates our Community Guidelines, and we want to save their valuable time, as well as the hours spent by staff who follow up on those reports.
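A tiered reporting system is essentially a small category tree: a top-level choice narrows into more precise sub-options. The sketch below illustrates the idea; the category names shown are invented examples, not Flickr's actual reporting taxonomy.

```python
# Illustrative tiered report-category tree; the categories are examples
# chosen to match the post's themes, not Flickr's real taxonomy.
REPORT_CATEGORIES = {
    "spam": ["comment spam", "tag spam"],
    "moderation": ["should be marked moderate", "should be marked restricted"],
    "illegal content": ["copyright violation", "other illegal content"],
}

def leaf_categories(tree):
    """Flatten the tiered tree into the precise options a reporter
    would ultimately pick from."""
    return [f"{top} / {leaf}" for top, leaves in tree.items() for leaf in leaves]
```

Routing each report to a precise leaf is what lets follow-up staff triage without re-reading every report from scratch.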
You’ll still be able to report abuse from the link in the footer of every Flickr page, but we’ll also bring the same tools to the Flag Photo feature on every photo page. By adding these entry points, we hope that we can facilitate quality community interactions while limiting the disruptions of bad-faith actors.
We’ll share any relevant updates with you, and we would love to hear from you if you have any questions or encounter any issues.
Matthew is a Product Manager for Flickr and an editor on the Flickr Blog.