Instagram ‘pushes weight-loss messages to teenagers’

Researchers find minimal interactions by teen users can trigger a deluge of thin-body and dieting images

Instagram's algorithms are pushing teenage girls who even briefly engage with fitness-related images towards a flood of weight-loss content, according to new research which aimed to recreate the experience of being a child on social networks.

Researchers adopting "mystery shopper" techniques set up a series of Instagram profiles mirroring real children and followed the same accounts as the volunteer teenagers. They then began liking a handful of posts to see how quickly the network's algorithm pushed potentially damaging material into the site's "explore" tab, which highlights material that the social network thinks a user might like.


One account that was set up in the name of a 17-year-old girl liked a single post from a sportswear brand about dieting that appeared in her Instagram explore tab. She then followed an account which was suggested to her after it posted a photo of a "pre- and post-weight loss journey".

These two actions were enough to radically change the material suggested to the fake teenage girl on Instagram. The researchers found her explore feed suddenly began to feature substantially more content relating to weight-loss journeys and tips, exercise and body sculpting. The material often featured "noticeably slim, and in some cases seemingly edited/distorted body shapes".

When the experiment, which involved browsing the site for just a few minutes a day, was recreated with a profile posing as a 15-year-old girl, a similar effect quickly took place.

Researchers also replicated the behaviour of a real 14-year-old boy, which led to his Instagram explore tab being flooded with pictures of models, many of which appeared to show heavily edited body types.

Instagram knew all of the accounts were registered to teenagers and served child-focused adverts to the users alongside the material. The site has recently resolved to fix issues around anorexia in its search functions after previous criticism, with the tech firm putting warning labels on content including pro-anorexia material.

The research was conducted by Revealing Reality and commissioned by the 5Rights Foundation, which campaigns for tighter online controls for children. Lady Beeban Kidron, who chairs the charity, said the inherent design of the recommendation engines used by social networks such as Instagram could exacerbate social issues for teenagers. She said she was disturbed by the existence of "automated pathways" that lead children to such images.

Dame Rachel de Souza, the children's commissioner for England, said: "We don't allow children to access services and content that are inappropriate for them in the offline world. They shouldn't be able to access them in the online world either."

A spokesperson for Facebook, which owns Instagram, said it was already taking more aggressive steps to keep teens safe on the social network, including preventing adults from sending direct messages to teens who don't follow them.

However, the company claimed the study's methodology was flawed and had "drawn sweeping conclusions about the overall teen experience on Instagram from a handful of avatar accounts". It said much of the content accessed by the fake teenagers in the study was not recommended but actively searched for or followed, and that "many of these examples predate changes we've made to offer support to people who search for content related to self-harm and eating disorders".

The research comes at an awkward time for the social media platforms. In just over six weeks the companies will be forced to contend with the age appropriate design code, a stringent new set of rules coming into force in the UK. The code, developed by the Information Commissioner's Office, cleans up the tangled rulebook on how companies should treat children online, in an effort to spearhead the creation of a "child-safe internet".

From September, companies that expect children to visit their websites or use their apps will need to present a child-friendly version of their service by default, and should not operate under the assumption that a user is an adult unless they explicitly declare otherwise.

Further restrictions will arrive with the online safety bill, currently in draft form, which sets out punishing fines of up to 10% of global turnover for companies which fail to live up to promises made in their moderation guidelines and terms of service.

Authors: Jim Waterson and Alex Hern

Photo: A teenage girl looking at her smartphone. Photograph: Alamy Stock Photo