A research study into the use of social media platforms by minors found that upwards of 40% of children under 13 were using them, despite accounts supposedly being limited to teenagers and adults. The research covered TikTok, Facebook, Instagram, and Snapchat.
The study also found that a full third of minors had experienced an “online sexual interaction,” which included being asked for, or receiving, nude photos …
Adding to the concern, children are much more likely to simply block those sending inappropriate messages than to tell a parent or caregiver.
Child-protection nonprofit Thorn carried out the research among 2,000 children in the US.
Young people use many of the same widely popular platforms as adults, often in spite of age limitations put in place by the platform. They are drawn to opportunities to meet new people, generate content and build a following, and explore without fear of judgment.
While the internet offers boundless opportunities to connect and discover, it also creates new opportunities for risk and harm. Nearly half of participants (48%) said they had been made to feel uncomfortable, been bullied, or had a sexual interaction online.
While the most common experiences reported involved bullying or generally being made to feel uncomfortable (38%), 1 in 3 participants reported having had an online sexual interaction.
Response options coded as an “online sexual interaction” in analysis included: being asked for a nude image or video, being asked to go “on cam” with a nude or sexually explicit stream, being sent a nude photo or video, or being sent sexually explicit messages.
The most common online sexual interactions that participants reported involved receiving sexual messages (such as a “sext,” 21%), receiving a nude photo or video of the sender (18%), or being asked for a nude photo or video (18%).
The worst platforms for sexually abusive messages were Instagram and Snapchat, with 16% of minors on each platform reporting sexual interactions.
Minors who did find themselves on the receiving end of such messages were more than twice as likely to block the sender as to inform a parent or guardian. Sixty-six percent said they blocked the person, 46% reported it on the platform, and only 29% told a parent or caregiver (respondents could select more than one response).
Thorn recommends that children be given clearer warnings about the risks, and specifically be told that online sexual contact with a minor is illegal and should always be reported. Platforms should ensure that reporting tools link to sources of help and support, and blocking should raise the same red flags as reporting. Blocking tools also need to be improved to prevent re-contact after blocking, which is currently common.