Suicide Instagram Group For Teenagers Unveiled By The Police

There is a secret Instagram group named ‘suicide’ involving twelve teenage girls between the ages of 12 and 16.

The Instagram group was uncovered by a police force in southern England this month.

The group was used to plan suicide attempts and serious self-harm.

According to BBC News, police discovered the Instagram group during an investigation after some of the girls went missing in London.

Some of them were found, but in a critical condition. The missing girls who were located are currently receiving treatment in hospital.

Once a month, the teenagers met at a secret spot in London.


One of the girls said that they first met online and discussed only suicide.


Seven of the 12 girls have been found; police are still tracking the locations of the other five. Following the incident, children’s social care services from seven different local authorities have become involved to help educate teenagers on the dangers of suicide.

Police said in a statement to the BBC that “peer-to-peer influence increased suicidal ideation amongst the children, to the extent that several escalated to suicidal crises and serious self-harm.”

Instagram says it has nothing to do with this, as the platform uses artificial intelligence (AI) to block violent and self-harm posts.

Most of the teenagers had met on other social media platforms such as Facebook and Snapchat, where they were part of self-harm groups. Some of the group titles were “suicide” and “missing”.

Facebook does not deny that the name of the closed group referenced “suicide”, but indicated it wasn’t removed from the platform because the content of the messages did not break Facebook and Instagram rules.

This was the response received from Facebook:

“We reviewed reports and found no evidence of the people involved breaking our rules around suicide and self-harm.

“We don’t allow graphic content, or content that promotes or encourages suicide or self-harm, and will remove it when we find it.

“We’ll continue to support the police and will respond to any valid legal request for information.”

Updated: April 3, 2021 — 11:02 am
