In an attempt to make Instagram a friendlier place, the company has introduced a new policy on account disabling. From now on, Instagram will detect when you violate its policies and warn you before disabling your account. This means that more accounts will be subject to removal. But on the other hand, you’ll be notified if you’re at risk of being banned, so you can keep your future posts appropriate.
At the moment, Instagram only removes accounts where a certain percentage of content violates its policies. Under the new policy, it will also remove accounts with a certain number of violations within a specific timeframe. According to Instagram, this will allow it not only to enforce its rules more consistently, but also to hold people accountable for what they post on the platform.
However, as I mentioned, you’ll be notified before your account gets disabled, which is fair. I think it’s in a way similar to the new “guilt trip” feature, which notifies you when your comment seems rude so you can reflect and undo it. The new notification system will tell you that the offensive post has been deleted and that your account is at risk of being disabled. You’ll also have a chance to appeal deleted content.
“To start, appeals will be available for content deleted for violations of our nudity and pornography, bullying and harassment, hate speech, drug sales, and counter-terrorism policies, but we’ll be expanding appeals in the coming months,” Instagram writes. Additionally, if your post turns out to have been removed by mistake, Instagram will restore it and remove the violation from your record.
Instagram writes that the new update is “an important step in improving [their] policies and keeping [their] platforms a safe and supportive place.” Along with shadow bans and AI moderation for comments and photos, it could be another (small) step toward a safer and friendlier online place.