Snapchat has become the most popular platform for online grooming, according to police data shared with the children’s charity the NSPCC. More than 7,000 offences of sexual communication with a child were recorded in the UK in the past 12 months, the highest figure since the offence was introduced.
Nearly half of the 1,824 cases in which police recorded the platform used for grooming involved Snapchat. The figure has prompted urgent calls for internet companies to make their platforms safe for children. “Society is still waiting for tech companies to make their platforms safe for children,” a spokesperson for the NSPCC said.
What Are the Concerns of Authorities and Parents?
Snapchat has said that it has a “zero tolerance” policy on the sexual exploitation of minors and that it has put extra safeguards in place for teenagers and their parents. A child protection lead, however, called the statistics “shocking” and argued that “the regulator should strengthen the rules that social media platforms must follow, and the responsibility of safeguarding children online must be placed with the companies that create spaces for them.”
Although the gender of grooming victims was not always recorded, in cases where it was known, four out of five victims were girls. Nicki, whose real name has been withheld for privacy reasons, was eight when she was approached by a groomer on a gaming app and urged to move the conversation to Snapchat.
How Is a Mother's Vigilance Crucial?
Her mother, Sarah, explained: “I don’t need to go into details, but anything you can imagine happening occurred in those conversations: videos, pictures, requests for certain material from Nicki.” After discovering the upsetting exchanges, Sarah set up a fake Snapchat account posing as her daughter, and called the police as soon as she received a message from the groomer.
“It’s my responsibility as a mum to ensure she is safe,” Sarah said. She now examines her daughter’s devices and messages once a week, despite her daughter’s protests, and warned that parents “cannot rely” on games and apps to keep their children safe.
Why Is Snapchat Popular Among Younger Users?
Despite being one of the smaller social media platforms in the UK, Snapchat is hugely popular with children and teenagers. That appeal makes it an attractive target for potential groomers. An NSPCC child safety online policy manager explained: “This is something that adults are likely to exploit when they’re looking to groom children.”
Snapchat also has design features that can put children at greater risk. “Senders are informed if the recipient has taken a screenshot of a message, and messages and images on Snapchat vanish after 24 hours, making incriminating behaviour harder to track,” she said.
How Are Children's Reports Going Unheeded?
Children frequently raise concerns about Snapchat with the NSPCC. According to the charity, “When they report [on Snapchat], this isn’t listened to, and they can see extreme and violent content on the app as well.”
A Snapchat spokesperson called the sexual exploitation of young people “horrific” and responded: “If we identify such activity, or it is reported to us, we remove the content, disable the account, take steps to prevent the offender from creating additional accounts, and report them to the authorities.”
What Are the Record Levels of Grooming Offences?
Since the offence of sexual communication with a child was introduced in 2017, the number of recorded grooming offences has risen steadily, reaching a record high of 7,062 this year. Snapchat was linked to 48% of the 1,824 cases in which a platform was identified.
Meanwhile, grooming offences recorded on WhatsApp rose marginally over the previous year, while incidents on Facebook and Instagram have fallen recently. All three platforms are owned by Meta. WhatsApp said it had “robust safety measures” in place to protect its users.
How Is the Government Responding, and What Regulatory Changes Are Expected?
The minister for safeguarding and violence against women and girls stressed that social media companies “have a responsibility to stop this vile abuse from happening on their platforms.” She added: “They will have to prevent this type of unlawful content from being shared on their sites, including on private and encrypted messaging services, or face significant fines under the Online Safety Act.”
Under the Online Safety Act, which legally requires tech platforms to protect children, major tech companies will have to publish risk assessments of illegal harms on their platforms from December.
“Our draft codes of practice include robust measures that will help prevent grooming by making it harder for perpetrators to contact children,” said Ofcom, the media regulator in charge of enforcing the new rules. “When the time comes, we’re ready to employ all our enforcement tools against any businesses that fail.”