
Instagram is giving you more control over ‘sensitive content.’ Here’s how to turn it off — or dial it up.

Instagram is juggling the needs of creators, parents and brands, and making everybody mad in the process.

July 23, 2021 at 11:52 a.m. EDT
The new “limit sensitive content” setting has three levels: allow, limit and limit even more. It is not totally clear what each setting will filter and permit. (Source: Instagram)

Some people on Instagram want to see fewer images of bare bodies, violence, drugs and firearms. Others make money sharing those images.

Instagram and its parent company Facebook, meanwhile, are stuck in the middle.

This week, Instagram rolled out a “limit sensitive content” setting that gives users some control over how much content is filtered out of their explore tabs, where the app recommends content from accounts they do not already follow. But some advocates on both sides of the content-moderation debate feel the change does not go far enough.

Phillip Miner, a Brooklyn-based art promoter, found the new setting by accident.

Like all Instagram users, Miner found his account automatically set to the default: to “limit” so-called sensitive content, rather than “allow” or “limit even more.” Because he works specifically with LGBTQ artists, he says, he was immediately concerned about how the setting would affect his account’s reach in the explore tab. He created and shared an infographic showing his followers how to turn off the limits, and it received more than 45,000 likes in 24 hours.

Instagram announced the change on its website, but not in the app itself, and many of the comments on Miner’s post accused the app of trying to sneak in the change.

“I think the reason my post took off and resonated with people is because this decision was made without them being told,” Miner said.

But these limits on content that Instagram defines as sensitive have been running in the background the whole time, as outlined in the app’s “recommendation guidelines,” Instagram spokeswoman Raki Wane said. Now, for the first time, people can turn off the filtering or make it more intense, she said.

Unless you go in and change the setting, your explore tab will look exactly like it did before, according to Instagram. To change the setting, go to your Instagram profile, tap the three lines in the top right corner, then select Settings -> Account -> Sensitive Content Control. There, you will see three options: “allow,” “limit” and “limit even more.” People under 18 will not have the “allow” option, Instagram says.

When it comes to what each option filters and allows, the details get fuzzy. Instagram’s Wane said each option uses the app’s recommendation guidelines to evaluate content.

But the “limit even more” setting may even filter out pictures of women in bikinis, Wane said, which do not violate the recommendation guidelines. Wane could not explain this discrepancy.

“It really just depends on the image in question and what it contains specifically, but our north star will always be the recommendation guidelines themselves,” she said.

Whether content makes it through the “limit even more” filter depends on “the precision with which” it abides by the recommendation guidelines, she said. A bikini photo abides by those guidelines, but it still might not show up for people who select “limit even more.”

As Facebook and its Instagram unit come under fire for failing to curb misinformation about public health, and even boosting it, Instagram’s new setting addresses a different content-moderation challenge. Some groups, like parents and religious people, want the option to see less content that does not align with their values or taste, Wane said.

Ariel Fox Johnson, senior counsel for global policy at the parenting organization Common Sense Media, said that even with the new setting, Instagram is not going far enough.

“While the ability to choose to ‘limit even more’ sensitive content is better than nothing, it is not enough for young people,” Johnson said. “Kids and teens should have the strongest default protections, not have to wade through multiple settings to get them. And it should not all be on individuals to have to filter harmful content — Instagram should not be promoting such content to young people in the first place.”

Misinformation is a matter of fact, but so-called sensitive content is, in many situations, a matter of opinion. That puts Instagram in a tight spot. Allow images of bodies or legal gun use in the explore function, and Instagram makes some users uncomfortable. Perhaps more importantly, it might make brands nervous to advertise on the platform.

But limit the reach of that content, and Instagram drives away artists, content creators and marginalized communities who rely on the app for exposure, education and money.

Miner says he has experienced this firsthand. In 2019, when he was director of grants and communications for Apicha Community Health Center in New York City, the primary care center had multiple ads promoting LGBTQ community health rejected by Instagram for what it called sexual or political content. Women, queer people and people of color have claimed they are disproportionately targeted by Instagram’s content moderation; the company even changed its nudity rules in 2020 after model Nyome Nicholas-Williams accused it of moderating Black women’s photos differently than White women’s, as reported in The Guardian.

Wane said Instagram has a separate equity team working to address those concerns. She said the new “limit sensitive content” setting might actually help the reach of users whose content gets flagged as sensitive: If people choose the “allow” setting, fewer posts will, in theory, be sifted out of their explore tabs.

It is tough to discern how often the app unfairly censors content, or even why Instagram takes down some of the content it does. This new setting will not change that. Users cannot see what they are limiting, nor can they distinguish between types of sensitive content by, say, limiting violent content but not sexually suggestive content.

For now, “sensitive” content is lumped into one big bucket. Wane said that is because this is the first time Instagram has experimented with user-controlled content limits, and the company needs more time to understand whether users want more granular controls.

“We are balancing the needs of one community over another, but it is never our intention to prioritize the needs of one community over another,” she said.