Child sexual abuse material

The issue:
Understanding Child Sexual Abuse Material (CSAM)

What is Child Sexual Abuse Material?

Child Sexual Abuse Material (CSAM) refers to sexually explicit content involving a child. This can include photographs, videos, or computer-generated images that depict a minor in a sexually explicit manner.

How has CSAM distribution evolved?

In the past, child abuse images were distributed through direct exchanges among networks, illegal storefronts or via mail services. Today, advancements in technology mean that abuse imagery can be uploaded and shared worldwide within seconds.

The volume of CSAM online is increasing rapidly – discovering CSAM can take just three clicks.

The impact of CSAM

CSAM not only creates a permanent record of a child’s abuse for their abuser but also serves as material to fulfill the fantasies of collectors worldwide. Unfortunately, the exploitation doesn’t end there. Once distributed, these images can be weaponized to manipulate the child further, whether by obtaining more images, arranging a physical meeting, or extorting money. Additionally, these images can circulate widely, catering to those seeking specific fantasies. Predators might also use these images to groom other potential victims.

How perpetrators interact with CSAM

There are three main ways abusers interact with CSAM:

  1. Producing CSAM
    • Creating CSAM by capturing photos, videos, or audio recordings.
    • Producing textual or non-photographic visual material.
    • Manipulating existing material to generate new imagery.
  2. Searching for and viewing CSAM
    • Actively seeking CSAM on the internet.
    • Viewing or attempting to view this material.
  3. Sharing and storing CSAM
    • Sharing or storing CSAM, perpetuating the cycle of abuse.
    • Re-victimizing the abused by keeping the material in circulation.

Addressing the growing threat of CSAM requires concerted efforts from all sectors of society, including law enforcement, technology companies, and the public, to protect children and stop the spread of this harmful material.

Prevalence of CSAM

The number of reports of Child Sexual Abuse Material (CSAM) worldwide has been increasing significantly. In 2023, the US National Center for Missing & Exploited Children (NCMEC) received over 32 million reports related to online child sexual exploitation, including CSAM. Data from Childlight’s Into the Light Index suggests that one case of child sexual abuse online is reported every second.

This data reflects the growing prevalence and reporting of CSAM, driven by increased internet access, the widespread use of digital platforms, and enhanced efforts by tech companies and law enforcement to identify and report such content.

However, these figures likely represent only a fraction of the actual extent of CSAM due to underreporting and the hidden nature of these crimes. The full scale of child sexual abuse and exploitation online and the volume of resulting imagery is difficult to quantify, with estimates only scratching the surface of the material that has been discovered.

Who is affected?

The Internet Watch Foundation (IWF) reported that in 2021, 62% of CSAM assessed involved children aged 11-15 years, with a notable increase in self-generated content among teenagers. In January 2024, the IWF reported that children under 10 are increasingly targeted by groomers, with a growing number of younger children being coerced into performing sexually online. The National Center for Missing & Exploited Children (NCMEC) has observed similar trends, highlighting the vulnerabilities of teenagers in online spaces. Over half of the identified child victims in widely circulated CSAM are prepubescent, and many are too young to speak.

Impacts of CSAM

Children

Victims: physical and emotional harm

Survivors: ongoing trauma from abuse material online

Families

Parents and guardians: deep distress and helplessness

Siblings: emotional impact affecting family dynamics

Society  

Communities and schools: need awareness and support

Public Health systems: dealing with long-term mental health care

Law & legal systems

Law enforcement: increased reports, finding victims, catching perpetrators, stopping CSAM spread

Legal systems: handling complex cases, prosecuting offenders, protecting victims

Technology platforms

Tech companies: improving detection and removal of CSAM

Content moderators: facing psychological stress

An emerging threat: legal content of interest to predators

Our 2023 Global Threat Assessment noted ‘legal’ content of interest to predators as a new challenge in responding to CSAM. ‘Content of interest to predators’ (COITP) is imagery of children playing or exercising, or content innocently produced by children, that is consumed by predators for sexual gratification. Unlike child sexual abuse material, COITP is not illegal.

Offender groups are using this to attempt to evade current platform policies and protections. Offender community groups curate this content for consumption, providing collections which can be shared and accessed by wider offender groups via social media services or other fringe services.

Curation of this type of content is a strong signal for some form of intervention and may, in certain circumstances, indicate wider interest in child sexual abuse material.

In 2020, an AI ‘bot’ on Telegram generated 100,000 ‘deepfakes’ depicting real women and girls engaged in sexual acts, illustrating the scale of this issue.

Meta reported that over 90% of its reports to the National Center for Missing & Exploited Children (NCMEC) between October and November 2020 concerned shares or reshares of previously detected content.

In just one month of 2020, 8.8 million attempts to access CSAM were tracked by three of the Internet Watch Foundation’s member organizations.

The response:
Addressing the threat

Responding to CSAM requires a comprehensive approach involving various stakeholders.

Prevention and education

  • Educating children about online safety and the importance of privacy.
  • Maintaining open communication between parents and children.
  • Offering educational programs in schools and communities about CSAM dangers.

Laws and enforcement

  • Implementing strict laws against CSAM creation, distribution, and possession.
  • Collaborating globally to track and prosecute offenders.
  • Imposing severe penalties, including imprisonment and fines.

Support and resources

  • Providing victim support services, including counseling and legal aid.
  • Offering hotlines and online resources for victims and families.

Reporting mechanisms

  • Utilizing reporting features on platforms to quickly remove CSAM.
  • Key organizations like NCMEC and IWF play vital roles in processing reports.

Role of technology companies

  • Developing advanced tools like AI and machine learning to detect CSAM.
  • Collaborating with law enforcement to dismantle CSAM networks.
  • Implementing robust content moderation policies to prevent CSAM spread.

User responsibility

  • Encouraging users to report suspicious activities.
  • Educating users on privacy settings and account protection.

Global collaboration

  • Coordinating international efforts to create and enforce laws.
  • Collaborating between governments, NGOs, and tech companies.
  • Advocating for increased awareness and resources to combat CSAM globally.

What our members are doing

Extended Reality (XR)

The emergence of Extended Reality (XR) technologies

The Issue: Understanding Extended Reality (XR) technologies

What is Extended Reality (XR)?

Extended Reality (XR) is an umbrella term for immersive technologies that merge the physical and virtual worlds. This includes:    

  • Virtual Reality (VR), which usually involves the user wearing a headset that displays a simulation of the real world 
  • Augmented Reality (AR), which enhances the real-world environment by overlaying digital elements 
  • Mixed Reality (MR), which integrates elements of VR and AR, to create a blended experience where the virtual and physical interact seamlessly.  

Although XR technologies have existed for decades, there has been a surge in interest, investment and availability in recent years, especially in workplaces and consumer leisure sectors. The global market for XR is forecasted to surpass $1.1 trillion by 2030. 

Augmented reality content for mobile phones is already widely available to consumers in app stores like Google Play and the Apple App Store. Bespoke platforms are also being developed in industrial and workplace settings. Virtual reality content is available in various forms, including gaming and education platforms, and often involves use of a headset.

What are the risks associated with these technologies?

XR presents a world of new experiences and opportunities for both adults and children to learn, play and interact with the world. Unfortunately, as with any emerging technology there is a risk that offenders may use these new environments to exploit children. 

The Alliance’s 2023 intelligence briefing on XR highlighted the risks posed to child safety, including opportunities for offenders to access victims, distribute child sexual abuse material, simulate abuse of virtual children, and use haptics to mimic real-world sensations like movement and force. Predators can also exploit ‘off-platforming’, where they move communication from XR environments to other platforms to groom children.  

In May 2024, we facilitated a roundtable discussion with leading experts to discuss key risks, challenges and opportunities to make the metaverse and technologies safer. A summary paper, Beyond the Headset, outlines the key issues discussed and opportunities for action.

Virtual reality users are usually isolated by the use of headsets, which can further exacerbate risks to children.  

We know perpetrators will find any crack in platforms they can penetrate. There’s a huge amount of work to be done, but also a lot of knowledge we already have.

Participant in WeProtect Global Alliance roundtable discussion on XR

Who is likely to be impacted?

Children form a significant portion of the XR (Extended Reality) user base, with gaming becoming an integral part of their daily lives. A 2023 WeProtect Global Alliance survey of children aged 7-10 revealed that they feel safest on private messaging apps and gaming platforms, even though these are often areas where online abuse occurs. 

Currently, many children worldwide cannot access these emerging technologies due to high costs, limited high-speed internet access, and the necessary hardware requirements. However, XR technologies are expected to become more prevalent in the near future as they become more affordable and accessible. 

Extended Reality (XR) statistics

$1.1 trillion – the predicted global XR market by 2030

75% of people believe children are at significant risk of sexual abuse when using VR technology

Four in 10 parents (41%) say they don’t know much, or anything, about the metaverse. Over half of children (53%) say the same.

The response: addressing the threat

Key measures that can be taken by a range of actors to prevent harm to children as access to XR technologies grows include:

Education and awareness 

  • Equip young people and parents with information about the potential hazards and ways to avoid them, including through school education. 

Technology  

  • Prevent children from accessing inappropriate content, via user authentication, age restrictions, age assurance mechanisms and parental controls  
  • Provide advice and education for children and their caregivers about managing risks 
  • Allow users to block or otherwise restrict interactions with other users 
  • Ensure content moderation systems are in place and work across platforms. 
  • Adopt Safety by Design approaches with children in mind and address different types of risk to children. 

Legislation  

  • Ensure XR risks are covered in existing or new legislation and policy.

Data and research 

  • Investment in research and development is urgently required to support criminal investigation and prosecutions. 

Involvement of children and young people 

  • Include children and young people in product development and testing to support Safety by Design  

Resources

Beyond the headset: Charting a course for safer experiences for children in extended reality environments

Report summarising key themes and insights from a roundtable with members, focusing on potential harms in XR environments and current mitigations, including robust content moderation, age assurance, user authentication, Safety by Design and collective social responsibility. 

Virtual reality risks to children will only worsen without coordinated action

WeProtect Global Alliance Executive Director Iain Drennan reflects on how without coordinated action, virtual reality risks to children will only worsen.

Extended Reality technologies and child sexual exploitation and abuse

WeProtect Global Alliance intelligence briefing, developed by Professor Emma Barrett OBE from The University of Manchester, aims to provide an overview of the latest information and trends on eXtended Reality (XR) and its potential impact on child sexual exploitation and abuse online.

Global Threat Assessment 2023

Our Global Threat Assessment report aims to encourage evidence-based action, recommend solutions and measures based on the evidence, and highlight opportunities to prevent abuse before it happens.

What our members are doing

Below are just some of the ways our members are working to protect children and young people from potential Extended Reality threats.


NSPCC: Child safeguarding and immersive technologies

The research included interviews and focus groups with experts in emerging technologies, a literature review, primary data collection through visits to virtual reality platforms and additional desk-based research.

Based on the findings, and with support from Limina Immersive, the NSPCC has developed recommendations for government and industry on how to ensure children can remain safe in the metaverse.


UNICEF: The Metaverse, Extended Reality and Children

The report considers both positive and negative effects that virtual environments could have on children; the drivers of and predictions for the growth of the metaverse; and the regulatory and policy challenges posed by the metaverse. The report also recommends actions for government and private sector stakeholders to take in order to empower children and protect against or mitigate potential harms.


BRACKET Foundation: Gaming and the Metaverse

UNICRI dives into the virtual world together with partners from the Bracket Foundation and Value for Good in the new report Gaming and the Metaverse: The alarming rise of online child sexual exploitation and abuse within the new digital frontier.

Livestreaming

Understanding the issue

What is livestreaming?

Livestreaming child sexual abuse is a widespread and prolific form of abuse with high international demand. The perpetrator is typically in a different location from the victim-survivor and requests specific acts to be performed by the child or perpetrated against the child by another individual.

The role of technology in livestreaming

Initially, webcams were necessary for livestreaming. However, advancements in technology now allow livestreaming via any device with a network connection and a camera, such as mobile phones, computers, professional cameras, and even devices within the ‘Internet of Things’ like drones, glasses and watches. This evolution significantly increases the potential for the proliferation of livestreamed child sexual exploitation and abuse.

The advertisement of children for livestreamed abuse commonly occurs on the surface web. Photos of children are often uploaded to public social media pages in masked posts that use coded keywords, to reach a larger pool of buyers. However, the actual livestreaming of abuse tends to occur in secure environments where passwords or encryption prevent open access.

Prevalence of livestreaming

The scale of livestreamed child sexual abuse is challenging to determine due to several factors:

  • inconsistent criminalization of livestreaming child sexual abuse across different jurisdictions
  • difficulties in investigation and prosecution because once the livestream ends, evidence may be scarce unless recorded
  • most platforms do not monitor private livestreams.

While the Philippines remains a significant hotspot for livestreamed abuse, emerging evidence indicates growing instances in China, India, Indonesia, Thailand, and the United Kingdom.

Who are the offenders?

Offenders are often from developed countries, while victims are typically from developing countries. Facilitators of live child sexual exploitation and abuse are often women in the same country as the victim-survivor, and may include family members or adults close to the child, driven by financial gain or the need to cover basic living costs.

Victims and impact

Victims are primarily girls aged 13 and under, often from homes with financial insecurity. The abuse can involve severe acts, and the long-term effects, though under-researched, are known to cause severe cognitive, psychological and social issues.

Parents coercing their children into livestreaming abuse often underestimate the emotional harm caused by this form of abuse.

39% of respondents to a survey of dark web users reported they had viewed child sexual abuse livestreaming, indicating significant demand. (Suojellaan Lapsia)

Children aged 11-13 feature most in reports of ‘self-generated’ imagery, with girls in this age group representing 50% of all reports actioned in 2022 (Internet Watch Foundation)

Addressing the threat

Detection and prosecution

Detecting and prosecuting livestreamed child sexual abuse presents significant technological and legal challenges. The real-time nature of livestreaming, along with encryption and anonymous payments, complicates law enforcement efforts. Evidence is often limited, and regulating live content on platforms is difficult.

Investigations typically involve reviewing cached content and browsing history, but modern privacy features create additional barriers. To be effective, legislation must balance privacy and surveillance needs.

Legal challenges

Many international legal frameworks do not explicitly criminalize livestreaming child sexual abuse, and inconsistent legal definitions across borders complicate cross-border collaboration. Harmonized global legislation is essential to improve investigations and close these gaps.

Online grooming

Understanding online grooming

What is online grooming?

Online grooming refers to the tactics abusers use on the internet to sexually exploit children. This process can be swift or gradual, fundamentally involving building trust to manipulate and exploit, using fear and shame to keep the child silent. Recognizing and addressing this threat is crucial to safeguarding children.

The role of technology in grooming

While grooming has always existed, the rise of digital platforms has expanded abusers’ reach and opportunities. Predators follow children to their digital spaces, making online grooming a threat across various platforms. The internet has normalized communication with strangers, adding complexity to the threat.

Online grooming has evolved particularly insidiously within social gaming environments. Research from risk intelligence organisation Crisp (now Resolver) reveals that individuals seeking to abuse children in these environments are able to lock them into high-risk grooming conversations in as little as 19 seconds after the first message, with an average time of just 45 minutes.

Where does grooming happen?

Online grooming can occur almost anywhere children interact online. Many perpetrators identify targets on social media, in chat rooms, gaming environments, and other platforms that allow user-to-user communication. Predators may create fictional personas to build kinship or portray themselves as trustworthy adults, exploiting innocent interactions and pushing boundaries over time.

Perpetrators often move conversations to private messaging apps or end-to-end encrypted environments, a technique known as ‘off-platforming,’ to reduce the risk of detection.


Grooming and coercing children to produce ‘self-generated’ sexual material

Research suggests that prevalence rates for online grooming range between 9-19%. Most studies show higher rates of online grooming among girls, though the gender difference is less marked among children under 13.

Perpetrators are less likely to continue grooming if they believe the children are under parental guardianship, highlighting the importance of examining risks and vulnerabilities in children’s lives. A multi-sectoral response between tech companies, law enforcement, and governments is necessary to detect and prevent online grooming. While parental care is a protective factor, the responsibility for preventing child sexual abuse cannot be placed solely on parents.

Our joint study with Economist Impact, surveying 2,000 18-year-olds across four European countries, found that 54% of respondents who received sexually explicit material received at least some of it through a private video sharing service, and 46% through a private messaging service.

Reporting data from the National Society for the Prevention of Cruelty to Children (NSPCC) shows that online grooming crimes have risen by 80% in the past four years.

Only 37% of tech companies surveyed used tools to detect the online grooming of children, according to a 2021 survey we conducted with the Tech Coalition.

The response: addressing the threat

Technical challenges and solutions

Detecting online grooming presents technical challenges, but solutions exist. Effective AI tools for detecting grooming need access to chat content to train algorithms. These tools must also detect grooming in different languages and understand slang and codewords.

Preventive measures and Safety by Design

Solutions that detect online grooming before it happens are most effective. Safety by Design solutions, such as age estimation and age verification tools, are at the forefront of this preventive approach. However, deeper knowledge of the threat is needed to implement better prevention and detection measures.

Online grooming is a complex and pervasive issue requiring global cooperation, technological advancements, and legal reforms to combat effectively. Awareness, prevention, and stringent measures are crucial in addressing and mitigating this threat.

Self-generated sexual material

Understanding self-generated material

What is ‘self-generated’ sexual material?

Self-generated sexual material includes a wide range of images or videos created and shared by adolescents themselves. This can happen consensually between peers or under coercion, involving grooming, pressure, or manipulation.

While the term “self-generated” reflects current policy consensus, it is not universally understood or accepted. Some experts suggest using the term “image-based sexual exploitation and abuse of children” to avoid confusion with adult-produced abuse material. We place the phrase “self-generated” in quotation marks to avoid implying willingness on the part of the child or young person involved.

When does harm occur?

Self-generated sexual material isn’t inherently harmful. Adolescents might share such material as part of a normal developmental exchange. However, harm arises in situations such as:

  • coercion into producing sexual material
  • sharing material against someone’s wishes
  • misusing non-sexual material for sexual purposes.

Using quotation marks around “self-generated” emphasizes that children may not be willing participants.

Economic factors and exploitation

There is evidence that some young people produce sexual imagery to escape poverty. In research conducted in Ghana, children cited financial motivation as a primary reason for creating and selling sexual material. Although the overall rate of economically motivated sexual exploitation is low, economic hardship, exacerbated by factors like the COVID-19 pandemic, suggests this trend might persist.

Additionally, inadequate sexuality and healthy relationship education globally leaves many children seeking information from unreliable sources like social media or pornography, affecting their understanding of sexuality and relationships.

Changing norms and increasing detection

Changing societal norms partially explain the rise in detected self-generated sexual material. Diverse motives, from voluntary to coerced production, make this a complex issue. Most studies, including the Alliance’s research, indicate that voluntary motives are common, while fewer instances involve threats, grooming or financial gain.

Barriers to seeking help

Research we conducted with Praesidio Safeguarding in Ghana, Ireland, and Thailand revealed that fear of legal repercussions prevents many children from seeking help when dealing with self-generated sexual imagery. Often, children refer to this content as a ‘pic’ or ‘selfie’, without acknowledging its explicit nature.

Reasons for sharing sexualized images

Children and adolescents share sexualized images for various reasons, including:

  • experimentation and exploring their identity
  • as part of a romantic relationship
  • pressure from others
  • online grooming.

In some instances, younger children, out of curiosity, might innocently share images without understanding the implications. More serious cases involve exploitation by adults or peers, where children are groomed, deceived, or extorted into producing and sharing further content. Understanding these dynamics is crucial for addressing the issue effectively and supporting affected children and adolescents.

How common is the sharing of sexual images and messages among young people?

(Map: prevalence of sharing sexual images and messages among young people)

The response: addressing the threat

Educational programmes

Addressing self-generated child sexual abuse material requires a multifaceted approach involving education, legal frameworks and robust support systems.

Educational programmes should emphasise the importance of consent, online safety, and the potential risks of sharing explicit content. These programmes must also encourage open communication between parents and children, helping young people understand the consequences of their online actions.

Schools and communities can play a pivotal role by providing comprehensive sexuality and healthy relationship education that reflects children’s lived experiences and equips them with the knowledge to navigate digital spaces safely.

Legal frameworks

Legal frameworks need to be robust and clear, targeting those who coerce, manipulate, or distribute self-generated child sexual abuse material while safeguarding the victims. Law enforcement agencies should be trained to handle such cases sensitively, ensuring that children feel safe to seek help without fear of criminalisation.

Technological solutions

Technological solutions, including advanced content moderation, age verification systems, and anonymous reporting mechanisms, are crucial in identifying, preventing, and removing self-generated material from online platforms.

Collaboration and public awareness

Collaboration among governments, tech companies, civil society and international organisations is essential to develop and enforce policies that protect children. Public awareness campaigns can help shift societal attitudes, making it clear that any form of child exploitation is unacceptable. Additionally, accessible mental health services and support networks are vital for helping affected children recover and rebuild their lives.

Sextortion

The rise of sextortion and responses to a growing crime

The Issue: understanding sextortion

What is sextortion?

Financial sexual extortion – often referred to as ‘sextortion’ – occurs when sexually explicit images or videos are exchanged online and the victim is subsequently blackmailed with threats to share the content with friends, family, or the wider internet. This crime is usually carried out by organised criminal gangs, often based overseas, who are motivated by financial gain.

Perpetrators typically pose as young girls using fake profile images or deception to convince teenage boys that they are interacting with a female peer. Once they receive a sexual image, the abusers threaten to share it unless they are paid money. Victims may also be pressured into providing more images. Blackmailers often include contact details of the victim’s friends and family in their threats and demand money via online payment apps.

Sextortion cases can escalate rapidly, sometimes unfolding in minutes or hours. The emotional impact on a child can be devastating, placing vulnerable victims at risk of self-harm or suicide.

Prevalence of sextortion

There has been a global increase in sextortion reports. In 2023, NCMEC’s CyberTipline received 26,718 reports, a jump from 10,731 reports in 2022 and up from 139 reports in 2021.

The increase in reports is also of growing concern to law enforcement. In December 2022, the Federal Bureau of Investigation (FBI) in the US issued a public safety alert about an ‘explosion’ of financial sexual extortion schemes targeting children and teens, followed by the National Crime Agency (NCA) in the UK issuing an alert to education professionals in April 2024.

Financial sexual extortion is often perpetrated by organised crime groups, predominantly from some West African and Southeast Asian countries. These criminals are primarily motivated by quick financial gain and can move from initial contact to blackmail in under an hour. Many blackmail messages follow scripts, allowing perpetrators to target numerous children simultaneously on popular social media platforms.

Who does financial sexual extortion impact?

Teenage boys are the most at-risk group for this type of online sexual exploitation. In known cases, offshore criminal syndicates generally target children from more affluent countries.

Financial sexual extortion can cause serious harm to victims, affecting their mental health, trust, and relationships both in the short-term and long-term. The permanent nature of online images can multiply the impact, leading to self-harm and, in some tragic cases, suicide.

Sextortion statistics

7,200% increase in sextortion reports between 2021 and 2022

Source: 2023 Global Threat Assessment

Boys 13–17 years old are the group most at-risk for financial sexual extortion

Source: WeProtect Global Alliance briefing

Cyber criminals identified as operating from the Philippines, Nigeria, Côte d’Ivoire and Korea

Source: WeProtect Global Alliance briefing

$100 – $500 is what most victims pay perpetrators; many perpetrators demand payments over extended periods

Source: WeProtect Global Alliance briefing

The response: addressing the threat

To effectively address the escalating threat of sextortion on a global scale, collaborative efforts are essential. A concerted approach involving governments, law enforcement agencies, educational institutions, parents, and technology companies is crucial.

Key measures that must be taken include: 

Education and awareness

  • Equip young people to recognise, report, and protect against sextortion.
  • Help parents understand risks and communicate with their children.

International cooperation

  • Train law enforcement globally to investigate sextortion.
  • Share intelligence to track and apprehend perpetrators across borders.

Technology

  • Create tools that help detect and prevent sextortion.
  • Ensure platforms secure their spaces and address exploitation reports promptly.

Legislation

  • Push for consistent international laws targeting sextortion.
  • Support stricter penalties to deter offenders and secure justice for victims.

Data and research

  • More consistent reporting and data capture to understand the extent and prevalence of sextortion.
  • More research to understand perpetrators, victims, the impacts and cost of sextortion, as well as prevention approaches.

Involvement of children and young people

  • Consult children and young people to better understand how they engage online, uncover opportunities for prevention, and reduce the stigma and shame around reporting this type of exploitation.

Resources

A web of deceit: Financial sexual extortion of children and young people

New briefing paper on financial sexual extortion that provides an overview of this rapidly growing threat, who is impacted, perpetrators and the typical pathways used to target children and young people.

Two-thirds of Gen Z targeted for online “sextortion” – New Snap research

Jacqueline Beauchere, Global Head of Platform Safety, Snap Inc. and WeProtect Global Alliance Board Member, presents new research from Snap, exploring how Gen Z teens and young adults have been targeted in online sextortion schemes.

What our members are doing

Below are just some of the ways our members are working to protect children and young people from sextortion.


No Escape Room campaign

The National Center for Missing and Exploited Children (NCMEC) in the US launched No Escape Room, a campaign and interactive film which takes the viewer through a real-life sextortion scenario. Containing messages from real cases, it is an immersive educational tool that helps parents and young people understand what it is like to become a victim of financial sexual extortion.


For Your Eyes Only: eLearning for youth in the Philippines

Stairway Foundation has developed, through consultations with youth, an e-learning course called For Your Eyes Only. The course uses animated film and interactive lessons to teach children and young people about sexting, sextortion and the non-consensual sharing of ‘self-generated’ intimate images.


Family Center and Safety Snapshot series

Snapchat’s Family Center provides parents with tools and resources to help their teens use Snapchat safely. It includes parental controls, a checklist of safety tips to discuss with teens and access to expert resources. The Safety Snapshot series comprises short educational videos aimed at teens using the platform, equipping them with an understanding of key topics including sextortion and how to report it.


Research analysis on financial sextortion victim posts

The Canadian Centre for Child Protection (C3P) analysed more than 6,500 firsthand open-source accounts that sextortion victims shared publicly about their experience of being financially sexually extorted. The report provides valuable insights into the methods cyber criminals use to target young people online and, drawing on the victim narratives, effective strategies for responding to an extorter’s demands.


Experts share insights on how financial sexual extortion is being tackled across borders

In this webinar, speakers Jason Barry from Meta and Shelley Allwang from the National Center for Missing and Exploited Children explained why the rise of sextortion is a concern, and what can be, and is being, done across different countries and sectors to tackle this urgent threat.
