December 10, 2019
Technology

Instagram under fire for not quickly removing harmful content

news.sky: A young woman who says graphic images of self-harm on social media encouraged her to make multiple suicide attempts has criticised Instagram’s reporting process.

She has reported hundreds of images she felt were harmful or triggering to the picture-sharing site, and estimates that only 30% were ever taken down.

At least one of these images was reported after Instagram’s pledge last month to remove all graphic self-harm content from its platform.

Following the pledge, Sky News reported a number of very graphic images to the site.

A month later, the content still had not been taken down.

It raises questions about the scale of the challenge facing Instagram: if reported content isn’t coming down swiftly, how difficult will it be for the company to find and remove content that is harder to surface?

Anna Hodge is now 18 years old. She was first diagnosed with OCD at the age of 11 and became increasingly depressed as she entered her teenage years. As well as struggling with anorexia she began self-harming and was eventually diagnosed with borderline personality disorder and depression.

“You feel such a build-up you need to do something and self-harm was that something for me,” says Anna.

“It was a way of punishing myself and it felt like a good relief at the time, but looking back it didn’t help at all.”

At first she found the mental health community online a helpful source of support. But there was a darker side to some of the accounts she was following.

“The more people you follow, the more chances there are of seeing things that aren’t appropriate, and Instagram suggests accounts to follow and you go down a bit of a rabbit hole. Once you’ve talked to these people and you feel like you’ve bonded, you don’t want to unfollow when they post harmful things because you don’t want to upset them.

“Once you’re reading about all these people thinking such negative things and telling you what they are, if you’ve had some of those thoughts too it reinforces that pattern of negative thinking in your head, it kind of encourages it.”

After a number of years in hospital Anna is now back at home and studying for a degree.

She also runs a recovery account on Instagram encouraging body positivity and good mental health.

As part of her recovery she regularly reports content to the site and is frustrated by how often things aren’t removed.

“They just say ‘we’ve reviewed your content and it’s not against our guidelines’,” says Anna.

“I think they need to listen to people who have been in that vulnerable space and know how easy it is to access that content. They need to listen to people with these issues, listen to what they think is harmful and not what the big Instagram bosses think is harmful.”

The social media companies have faced criticism over the secrecy that surrounds their reporting and monitoring process.

The images Sky News reported to Instagram showed graphic self-harm as well as quotes encouraging suicide.

The site’s policy has always been that it does not allow content that encourages or promotes self-harm or suicide. Despite this, the content had not been removed a month after it was reported.

For charities like Samaritans, who are helping advise Instagram, those goals will be hard to achieve unless the platform becomes more open.

“We definitely need it to be much more transparent,” says Harriet Edwards, Samaritans policy manager.

“Critically they need to ensure they’re bringing people with them, making sure the public know these improvements are being made and making sure they know why these changes are being made.”

 


 

♡ I know that when you’re in a dark place you probably won’t believe this – you can’t see the point of anything, including living, when we’re just going to die one day anyway.

♡ But that’s focusing on the wrong thing. We are all going to die, it’s inevitable, and that can be really scary – but we’re forgetting that there’s a whole lot of time in between. There are hopefully many years in between you being born and you dying in which you get to LIVE.

♡ Maybe you don’t believe that we’re all here for a reason, but just think of all the billions of coincidences that had to occur in order for you to be here today. We may as well make the most of this miracle we’ve been given.

♡ Think of all the laughs, the tears, the hugs, the kisses, the smiles, the friends, the family, the falling in love, the exhilaration, the things we get to see and do…think of all the living. And then vow to do it the best we can, to eke out every ounce of life we can, while we can. Love, Anna x

A post shared by Anna Zoe (she/her) (@hopingforhappy)

In a statement, Tara Hopkins, Head of Public Policy at Instagram EMEA, said:

“Nothing is more important to us than the safety of the people who use Instagram.

“We were clear that it would take time to build new technology and train teams to enforce the new policy, and people posting or searching for this content would be sent resources.

“This week, the policy came into effect.

“Now, when people report this content, it should be removed and the person who posted it will be sent resources.

“We are committed to doing everything we can to remove this type of content from Instagram, but it will take time.”

Ahead of running this story, we pointed Instagram towards the images we reported. A couple of them have now been taken down.

Source: Published by news.sky
