To test a child’s experience of social media, the researchers created avatar accounts to simulate that experience without putting any real child at risk. Each account was based on a real child, and liked and searched for content in a way that reflected the behaviour of the child on which it was based.
The research found:
- Within hours the avatar accounts were targeted with direct messages from adult users, asking to connect and offering pornography.
- Companies are targeting children with age-specific advertising while also serving those same children suicide, self-harm, eating-disorder and sexual content.
- A child who clicks on a dieting tip is, by the end of the week, recommended bodies so unachievable that they distort any sense of what a body should look like.
- A child who gives their true age, however young, is offered content and experiences that in almost any other context would be illegal.
- Many children in this research blamed social media for negative and challenging experiences around body image and relationships that they had faced growing up.
These are areas we need to keep chipping away at, giving appropriate support and guidance to students. Regardless of the Online Safety Bill, the Age Appropriate Design Code or new ‘features’ built into apps, these issues are not going to go away any time soon.
You can see the full report HERE or a short summary HERE.