Instagram ‘accidentally’ promoted harmful diet content
Campaigners raised concerns that the content could trigger Instagram users who suffer from eating disorders.
Social media app Instagram has apologised for promoting diet-related content following what it described as a mistake in its algorithm.
The app’s search bar automatically suggested terms such as ‘appetite suppressant’ and ‘fasting’. The social media platform has since removed the terms after concerns were raised by eating disorder campaigners.
Explaining the new search functionality, a spokesman for Instagram’s owner, Facebook, said that the search bar suggestions aimed to help users access content most relevant to their interests.
He added: “Those suggestions, as well as the search results themselves, are limited to general interests and weight loss should not have been one of them.”
Social media has been criticised for allowing content that could trigger vulnerable users.
Instagram influencer Lauren Black, who is recovering from anorexia, explained her experience of the app, saying: “I’m often promoted things like calorie counting images and diet methods.”
Direct links have been made between social media and negative body image, particularly for users who frequently compare their appearance to online images. Heavily edited photos and ‘fitspiration’ posts, for example, can damage users’ body image. Nutritionists have also warned that a recent trend, ‘Full Day of Eating’, could promote extremely low-calorie diets or fuel an obsession with ‘healthy’ eating, known as orthorexia nervosa.
Tom Quinn, Director of External Affairs at Beat, the UK’s eating disorder charity, has called for more action against eating disorder content online.
He said: “Social media platforms should do more to direct affected users to sources of support. While people can develop an eating disorder at any point in life, we know that teenagers are particularly affected by anorexia, and we urge companies and individuals who market to this age group to consider the implications of their marketing for vulnerable people.”
When Instagram promised to install technology to remove any content showing self-injury, it included eating disorders within this category. Facebook’s Community Standards state that any content encouraging self-injury, including memes, illustrations, or more graphic images, will be removed. The platform similarly directs users towards support if they search for self-injury-related content.