Last month, a glitch in Instagram’s search algorithm caused it to automatically recommend terms such as “fasting” and “appetite suppressants” to some people, including those recovering from eating disorders. Facebook, which owns Instagram, fixed it, but the lapse raised questions about the company’s ability to create a safe online space for those who are most vulnerable. It alarmed a group of over 40 attorneys general in the US enough that they mentioned it in a letter to Facebook CEO Mark Zuckerberg as an instance of why the company’s plan to create a version of Instagram for children below the age of 13 must be abandoned.
Instagram currently does not allow children under 13 to use the platform, and the new product is being planned so that those who want to can use it safely. But in the big, bad world of the internet, even spaces designed to be safe for children are not necessarily so. Take the case of Facebook’s own Messenger Kids, where a design flaw — which has since been fixed — allowed strangers to infiltrate group chats. And “stranger danger” is just one aspect of a dodgy and unnecessary product: Multiple studies have shown the deleterious effect social networking platforms have on the mental health of users. Instagram, in particular, was flagged in a 2017 study by the UK’s Royal Society for Public Health as the “worst” for young people’s mental health, affecting sleep quality and contributing to bullying and body image issues.
The responsible thing for Facebook to do now is to make it harder for under-age users to lie their way onto the platform: In other words, stronger fencing and more vigilance. This can only happen if the company stops looking at children as another potential market and starts seeing them for what they are: Still-developing minds that are curious, adventurous and very vulnerable.