Instagram ‘pushes weight-loss messages to teenagers’

Researchers say site’s algorithms can flood users’ profiles with body image content

From September, companies that expect children to visit their websites or use their apps will need to present a child-friendly version of their service. Photograph: iStock

Instagram’s algorithms are pushing teenage girls who engage even briefly with fitness-related images towards a flood of weight-loss content, according to new research that aimed to recreate the experience of being a child on social networks.

Researchers adopting “mystery shopper” techniques set up a series of Instagram profiles mirroring real children and followed the same accounts as the volunteer teenagers. They then began liking a handful of posts to see how quickly the network’s algorithm pushed potentially damaging material into the site’s “explore” tab, which highlights content the social network thinks a user might like.

One account, set up in the name of a 17-year-old girl, liked a single post about dieting from a sportswear brand that appeared in her Instagram explore tab. She then followed an account that was suggested to her after it posted a photo of a “pre- and post-weight loss journey”.

These two actions were enough to radically change the material suggested to the fake teenage girl on Instagram. The researchers found her explore feed suddenly began to feature substantially more content relating to weight loss journeys and tips, exercise and body sculpting. The material often featured “noticeably slim, and in some cases seemingly edited/distorted body shapes”.


When the experiment – which involved browsing the site for just a few minutes a day – was recreated with a profile posing as a 15-year-old girl, a similar effect quickly took place.

Researchers also replicated the behaviour of a real 14-year-old boy, which led to his Instagram explore tab being flooded with pictures of models, many of whom appeared to have heavily edited body types.

Instagram knew all of the accounts were registered to teenagers and served child-focused adverts to the users alongside the material. After previous criticism, the site has moved to address anorexia-related content in its search functions, putting warning labels on content including pro-anorexia material.

The research was conducted in the UK by Revealing Reality and commissioned by the 5Rights Foundation, which campaigns for tighter online controls for children. Lady Beeban Kidron, who chairs the British charity, said the inherent design of the recommendation engines used by social networks such as Instagram could exacerbate social issues for teenagers. She said she was disturbed by the existence of “automated pathways” that lead children to such images.

Dame Rachel de Souza, the children’s commissioner for England, said: “We don’t allow children to access services and content that are inappropriate for them in the offline world. They shouldn’t be able to access them in the online world either.”

Facebook, which owns Instagram, said it was already taking more aggressive steps to keep teens safe on the social network, including preventing adults from sending direct messages to teens who don’t follow them.

However, it claimed the study’s methodology was flawed and that it had “drawn sweeping conclusions about the overall teen experience on Instagram from a handful of avatar accounts”. The company said much of the content encountered by the fake teenagers in the study was not recommended but actively searched for or followed, and that “many of these examples predate changes we’ve made to offer support to people who search for content related to self-harm and eating disorders”.

The research comes at an awkward time for the social media platforms. In just over six weeks, the companies will be forced to contend with the age-appropriate design code, a stringent new set of rules coming into force in the UK. The code, developed by the Information Commissioner’s Office, cleans up the tangled rulebook on how companies should treat children online, in an effort to spearhead the creation of a “child-safe internet”.

From September, companies that expect children to visit their websites or use their apps will need to present a child-friendly version of their service by default, and should not operate under the assumption that a user is an adult unless they explicitly declare otherwise.

Further restrictions will arrive with the online safety bill, currently in draft form, which sets out punishing fines of up to 10 per cent of global turnover for companies that fail to live up to promises made in their moderation guidelines and terms of service. – Guardian