PLEASE! Pay Attention to Children, Phones, and the Internet

Warning: This text contains information that some readers may find disturbing.

When drafting policy proposals, it is essential to consider gender-responsive solutions and approaches that protect the most vulnerable among us. A human-rights-based approach to cybersecurity puts people at the center, building trust in the security of systems, including all their components (networks and devices), in ways that strengthen rather than endanger human safety. In this way, the system itself becomes a manifestation of human rights (freedom of thought, expression, and association). Positioning people as the primary subject of cybersecurity makes human security the main objective of processes and regulations.

According to the International Telecommunication Union (ITU), 70% of men and 65% of women worldwide use the internet, a gap of 244 million more male internet users than female users. What does this mean for our research? It indicates that digital divides create differentiated vulnerabilities to cyberattacks.

Technology-facilitated gender-based violence includes any act of gender-based violence that is committed, promoted, or exacerbated, in whole or in part, through the use of computer technology. Technology adds the potential for repetition, multiplying the harm each time the content is shared online. Women and girls experience violence at far higher rates than men and are often disproportionately targeted by hate speech, sexualized online abuse, and cybercrime. These forms of violence are the most modern part, though only a part, of the extremely complex mosaic of violence against women. In today’s technological and communication era, the most vulnerable among us are children, women, and the elderly.

For us at Women4Cyber North Macedonia, issues targeting the safety of women and girls online are crucial. In this text, we focus on the most vulnerable among us: children. When we recall our own childhood, adults’ advice to avoid talking to strangers was a constant refrain, given for our protection and safety. Yet recent research on peer dynamics in the digital age suggests that threats often come from the people we know and spend the most time with. The internet creates an artificial illusion that we know the people on the other side of the screen.

Two weeks ago, U.S. newspapers were flooded with the news of teenager Caleb Moore, who took his own life at home after a 35-minute conversation on Snapchat (according to forensic examination of his phone). Caleb, a teen from El Dorado, Kansas, was scrolling TikTok when he “met” someone claiming to be a “14-year-old girl.” According to his mother, Morgan, the two began “flirting” before moving the conversation to Snapchat. The “girl” allegedly sent compromising images of herself and prompted Caleb to do the same. He was then told to pay a large sum of money or the photos would be shared, a form of extortion known in cybercrime as “sextortion.” “They made him feel like his life was over because of this mistake,” said his mother, adding that she believes her son felt he had nowhere to turn. Discussion could continue in many directions, but we will stop here.

Undress AI is a digital tool that uses artificial intelligence to virtually remove clothing from a person in a photo without their knowledge or consent. The tool analyzes age, pose, body proportions, and other details to realistically generate sexualized content. Self-generated sexual content refers to content created and shared by the person themselves, which can then be modified.

Undress AI is among the latest trends in peer abuse—children create content using this and similar apps, sharing it for ridicule, attention, manipulation, or other deviant purposes. For some, it may seem harmless; for others, it can lead to suicide. Even if the manipulated image does not show the actual body of the victim, it implies it—enough for an average young person to enter a downward spiral they may not escape. Young people are often not fully aware of legal regulations and may struggle to differentiate harmful tools from those promoting harmless fun.

Snapchat messages disappear once read. Apps like Secret Calculator mimic a regular calculator icon but are accessed with a PIN and can store explicit photos and files. Many everyday apps have hidden messaging capabilities. Sugar dating apps connect older users (so-called sugar daddies/mommies) with younger users. Although legal for adults, these apps carry serious risks when minors are involved—a reality that occurs more often than expected. Such apps may appear as “easy ways to earn money,” but in reality, they can have long-term effects on physical, emotional, and mental health.

To support the above with facts, we refer to the Internet Watch Foundation (IWF) 2024 report on online child sexual abuse:

  • 424,047 reports of suspected child sexual abuse received (an 8% increase from 2023)

      ◦ 424,031 from internet URLs or child welfare services

      ◦ 16 from informal groups

  • 729,696 images classified as illegal

  • 267,788 reports (63%) resulted from proactive IWF monitoring

  • On average, a report was received every 74 seconds

  • 291,273 reports contained criminal photos, links, or promotional material (a 6% increase from 2023)

      ◦ 290,637 from URLs; 636 from child welfare services

  • 91% of reports involved self-generated images, 94% of which depicted girls

The annual report also shows that, on average, a report confirmed to contain criminal content is recorded every 108 seconds. The IWF, active since the early days of public internet access (29 years ago), aims for a world where no child is sexually abused or forced to face the possibility of their abuse being shared online. Survivors are often traumatized long after the physical abuse ends, a phenomenon known as re-victimization.
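As a quick sanity check on the frequency figures, both intervals follow from dividing the seconds in a year by the corresponding report counts (our assumption: a 365-day year and the totals cited above):

```python
# Sanity check: derive the "every 74 seconds" and "every 108 seconds"
# intervals from the IWF 2024 report counts (assumes a 365-day year).
SECONDS_PER_YEAR = 365 * 24 * 60 * 60  # 31,536,000 seconds

total_reports = 424_047     # all reports received in 2024
criminal_reports = 291_273  # reports confirmed to contain criminal material

print(round(SECONDS_PER_YEAR / total_reports))     # one report every ~74 s
print(round(SECONDS_PER_YEAR / criminal_reports))  # one confirmed report every ~108 s
```

Both figures in the report are consistent with its own totals, which is why "every 74 seconds" refers to all reports received and "every 108 seconds" to those confirmed criminal.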

The organization and its partners, working through police and judicial channels, locate and remove online records of child sexual abuse. Those who create, share, or sell illegal images are ruthless. The images removed by IWF analysts range from the abuse of infants to teenagers seeking connections online; often, victims do not realize they are trapped until it is too late. This mission is extremely challenging but noble, and thanks to the work of these people, the internet is a safer place for children.

Most importantly: regardless of age, children learn by example. Limit your own screen time, monitor what you view online, and try to understand your child’s perspective. Focus on understanding the behavior rather than jumping to conclusions.

As we enjoy summer and share vacation moments online, remember: once online, always online. Future distribution is out of your control. Think twice—or thrice—before posting anything of yourself or your family. At the end of the day, ask yourself: would your child, spouse, or friend want this online?

This text is a small effort to raise awareness in an ocean of potential dangers from unregulated internet use. Technology brings many benefits, but the purpose here is not paranoia—it is to promote responsible and careful digital use, building digitally resilient youth. No child deserves to face such horrors or be subjected to double victimization. Perpetrators use every possible way to harm their victims, and this reality is part of everyday life.

Until next time,

Tamara

M.Sc. Tamara Lazarevska Siljanoska, Lieutenant, member of Women4Cyber North Macedonia, and enthusiast on how modern technology intersects with law and human rights.