Algorithm Death: As we head towards an immersive life in the Metaverse, alarms are blaring. Tech companies need to get their algorithms in order, or the future could be a risky one for their most vulnerable users.
We aren’t at Web3 yet, but the problems with Web2 are already deeply concerning. Some algorithms built by big tech can serve society, such as those used to predict crime, while others are a disaster.
Coroner Andrew Walker ruled that 14-year-old Molly Rose Russell died from an act of self-harm, brought on by depression and the negative effects of online content.
Walker said that “it would not be safe for us to acknowledge suicide as the cause of death.”
Algorithm Death: Instagram and Pinterest
Walker said that Instagram and Pinterest used algorithms that selected and pushed inappropriate content to the teenager without her ever seeking it out. “Some content romanticized young people’s acts of self-harm,” he said, “while others promoted isolation and made it impossible to discuss the problem with people who might help.”
According to The Guardian, Russell saved, liked, or posted more than 2,000 Instagram posts in 2017, in the period leading up to her death, all of them related to suicide, depression, or self-harm. She also watched 138 videos of a similar nature, among them episodes of the series 13 Reasons Why rated 15+ and 18+.
A child psychiatrist told the hearing that, after reviewing what Russell had watched shortly before her death, he was unable to sleep soundly for weeks.
“10 Depression Pins You May Like”
Investigators found hundreds of photos of self-harm and suicide on the girl’s Pinterest account.