Instagram is now up to 800 million users, and is growing rapidly, with additions like Instagram Stories helping to boost usage and diversify its audience. But a larger audience brings a wider range of interactions, and with it, a greater need for community safety options.
Instagram has always been fairly active on this front. While Twitter, most notably, has struggled to deal with trolls and abuse, Instagram – which has traditionally focused on images rather than exchanges – has worked to stay ahead of the curve, introducing advanced reporting tools and features that give users more control over what they see on the platform.
And now, Instagram’s adding three new updates designed to improve community safety.
Comment Controls
The first addition is the option to choose who can comment on your posts.
As explained by Instagram:
“Starting today, if your account is public, you’ll see a new way to choose who can comment on your posts – from everyone to just groups of people, like people you follow or your followers. Also, whether your account is public or private, you’ll be able to block other accounts from commenting on your posts.”
Creating more enclosed groups has actually become something of a focus for Instagram. Back in June, reports surfaced that Instagram was developing a new private lists feature, which would enable users to share certain content with specific users, and it’s also been developing its DM tools to facilitate increased use. According to Instagram, direct messages on the platform are shared between the same three friends 85% of the time, which further underlines that more enclosed group use case.
Sharing in more private groups also likely harks back to Snapchat, and how that app rose as an alternative to Facebook, where your parents and relatives were increasingly present. While in this application the focus is more on avoiding unwanted outside comments, the option may also help further facilitate that enclosed community feel, and foster stronger connections through the app.
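To make the mechanics of the new control concrete, here’s a minimal sketch in Python of how such a rule might be evaluated. Instagram hasn’t published its implementation, so the names, tiers, and evaluation order below are assumptions for illustration only:

    from enum import Enum

    class CommentAudience(Enum):
        # The three audience tiers Instagram describes for public accounts
        EVERYONE = "everyone"
        PEOPLE_YOU_FOLLOW = "people_you_follow"
        YOUR_FOLLOWERS = "your_followers"

    def can_comment(commenter, audience, following, followers, blocked):
        """Per-account blocks apply first (on public and private accounts
        alike), then the chosen audience tier."""
        if commenter in blocked:
            return False
        if audience is CommentAudience.EVERYONE:
            return True
        if audience is CommentAudience.PEOPLE_YOU_FOLLOW:
            return commenter in following
        return commenter in followers  # CommentAudience.YOUR_FOLLOWERS

    # Example: a public account limiting comments to people it follows
    print(can_comment("dana", CommentAudience.PEOPLE_YOU_FOLLOW,
                      following={"dana"}, followers={"sam"}, blocked=set()))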
Expanded Comment Block
Back in June, Instagram added the capacity to block offensive comments via Facebook’s DeepText AI classification system, which detects and removes any comments it deems to be in violation of Instagram’s Community Guidelines.
Initially, that option was only available in English, but now Instagram’s expanding the filter to Arabic, French, German and Portuguese, enabling more users to improve their on-platform experience.
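For a sense of what comment classification involves, here’s a toy Python sketch. It is purely illustrative – Instagram hasn’t published DeepText’s internals, which rely on learned neural models rather than keyword lists, and every term, weight, and threshold below is invented for the example:

    # Toy stand-in for a comment classifier; not Instagram's actual system.
    OFFENSIVE_TERMS = {"idiot": 0.9, "loser": 0.7}  # assumed example weights
    HIDE_THRESHOLD = 0.8                            # assumed cutoff

    def offense_score(comment):
        """Sum the weights of flagged terms found in the comment."""
        return sum(OFFENSIVE_TERMS.get(word, 0.0)
                   for word in comment.lower().split())

    def should_hide(comment):
        """Hide the comment once its score crosses the threshold."""
        return offense_score(comment) >= HIDE_THRESHOLD

    for c in ["great shot!", "what an idiot"]:
        print(c, "->", "hidden" if should_hide(c) else "shown")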
Reporting for Live Video
And the last of the new elements is an option to report people who appear to be in need of support during a live broadcast.
Instagram added a similar option back in December, but this new variation enables users to make such reports during a live video, as opposed to after the fact. Facebook has a comparable process in place for Facebook Live.
Reports are submitted anonymously – when you submit a concern, the broadcaster will be shown a message offering help, with options to talk to a helpline, reach out to a friend, or get other tips and support. Instagram says it has teams working 24 hours a day, seven days a week, around the world to respond to such concerns.
Cyberbullying is a significant issue, and the need to provide tools to assist those most at risk has to be a key priority for social networks. More generally, suicide is one of our biggest societal concerns – globally, another person is lost to suicide every 40 seconds, and it’s the second leading cause of death among 15-29 year olds, the key social media demographic.
As such, it’s good to see the major social networks put more focus on tools to help. Even in its simplest form, the capacity to control who sees your content, and who can comment, can make a major difference for those in need.