Unseen and Unchecked: It’s What We Refuse to See

Author: Zaki Verino, Founder, Deep Web Konek (DWK)

The calls to regulate or even ban platforms like Roblox did not come out of nowhere. They are the result of mounting fear, public pressure, and a growing realization that something has been wrong for a long time.

When I started looking into this issue, I did not begin with official statements or government briefings. I began where most people do today: the comment sections. Across different news outlets reporting on the possible restriction of Roblox, the reactions were immediate and divided. Some supported the move, seeing it as necessary to protect children. Others criticized it, arguing that banning a platform does not solve the real problem. At first glance, it looked like a typical online debate. But the more I read, the more I realized that everyone was reacting to what they could see, not to what was actually happening underneath.

https://iili.io/q4YAh2p.png
https://iili.io/q4YAwvI.png
(Facebook posts related to CSAM activities.)

The government’s action, to be clear, is commendable. Taking a firm stance on child safety at a national level is not something to dismiss. For years, this issue existed in the background, quietly growing while attention was focused elsewhere. Seeing action now matters. It shows urgency and intent. At the same time, there is a lingering disappointment that this response comes at a point when the problem has already evolved far beyond a single platform.

Because the uncomfortable truth is this: banning a platform like Roblox is only a temporary disruption. It may remove some bad actors in the short term. It may reduce certain risks for a while. But it does not dismantle the system that allows these threats to exist. Predators do not depend on one game or one app. They study behavior. They follow wherever children gather. If one platform disappears, another will take its place, and by the time it becomes popular, it will already be too late again.
This is where the frustration begins to build. We are dealing with a problem that has been present for nearly a decade, yet the response we are seeing now feels delayed. Compared to other countries, the Philippines is still catching up in the investigation, monitoring, and detection of online harms. The technology used by bad actors has evolved quickly, but the systems meant to stop them have not kept pace. What we are seeing today is not the beginning of the problem. It is the result of years of slow recognition.

I say this not just from observation, but from experience. Before Deep Web Konek was founded in 2023, I was already exposed to online communities that operated in spaces most people would not normally see. Like many others, I started out of curiosity. I wanted to understand what existed beyond the surface of the internet. At first, it felt like discovery: learning new things, exploring hidden parts of the web, interacting with different people. It was interesting and, at times, even exciting.

But that curiosity did not take long to lead somewhere darker. Between 2015 and 2019, I witnessed how certain communities normalized content that should never have been normalized. Violence was not just present. It was shared, discussed, and in some cases even glorified. Videos and materials that showed harm became a form of engagement. Over time, prolonged exposure did something more dangerous. It desensitized people. It created a space where some individuals did not just watch anymore. They began to imitate. A new breed of offenders emerged, shaped not by isolation but by repeated exposure.

This is not something that disappeared. It evolved. Recently, a Facebook post surfaced describing a platform that allegedly enables violent content. At first glance, it sounded alarming. But what stood out to me was not the platform itself. It was the realization that what was being described was only the surface.
A small entry point into a much larger, older network that has existed for years. The same patterns I observed before are still there, only now they are more accessible, more refined, and more difficult to trace.

https://iili.io/q4YAg44.png
(List of Facebook group chats used for CSAM- and violence-related activities.)

Even statements from authorities reflect this pattern. I recall a press conference by the Department of Information and Communications Technology where officials mentioned chatter within online groups sharing tutorials related to harmful activities. But that is not new. Those kinds of exchanges were already happening years ago in smaller, more hidden communities. The difference now is scale. Social media platforms like Facebook have made it easier for these networks to expand and recruit without needing deep technical knowledge. And yet, despite all of this, we continue to focus on singular platforms like Roblox as if removing them will solve the problem. IT WILL NOT.

Another layer that makes this issue even more difficult to confront is the role of parents themselves. It is uncomfortable to say, but necessary. In many documented cases involving online child sexual abuse material, the perpetrators are not strangers hiding behind anonymous profiles. They are parents or guardians, the very individuals who are supposed to protect children but who instead become part of the exploitation.

https://iili.io/q4YAUGf.png
(Facebook post of a user inviting people to join a group chat with video calls.)

This is where the narrative shifts from neglect to direct involvement. If parents are both failing to supervise and, in some cases, actively participating in exploitation, then the problem becomes deeply rooted. It is no longer just about external threats. It is embedded within households. This is why there is a growing need not just for enforcement, but for parental counseling and intervention.
Awareness campaigns alone are not enough if the people responsible for guiding children are themselves uninformed, unprepared, or involved.

At the same time, the environment children are exposed to continues to expand. I have come across posts suggesting that minors are being drawn into illegal activities, such as purchasing abortion-related drugs or even encountering listings tied to baby-selling schemes. These are not isolated incidents. They reflect a broader pattern in which illegal markets adapt to social media ecosystems. The accessibility is what makes it dangerous. Transactions that once required deeper access to hidden networks can now begin with a simple search or message. This is no longer just a digital issue. It is a societal one.

We cannot ignore the role of parenting in this equation. The phrase “kaka-cellphone mo yan” (roughly, “that’s what you get from being on your phone all the time”) used to be a warning. Today, it barely carries weight. Children hear it and move on. The problem is not the phrase itself, but what it represents: a gap between access and guidance. Devices have become part of everyday life, but the responsibility that comes with them has not been fully understood.

Children today grow up with unlimited access to information. With the rise of artificial intelligence, that access has become even faster and more efficient. But speed does not equal understanding. In fact, what I have observed is the opposite. Many young users struggle with critical thinking despite having information at their fingertips. They can find answers quickly, but they cannot always evaluate what is safe, real, or manipulated.

When that lack of critical thinking meets poor supervision, the result is predictable. Curiosity leads them deeper. Without guidance, they explore spaces they do not fully understand. Some of these spaces appear harmless at first. Chatrooms, for example, especially those framed as roleplaying communities, often attract younger users. On the surface, they are interactive and creative.
But in some cases, they are being used for something far more dangerous. These environments have been exploited for the production of self-generated, first-person child sexual abuse material. Instead of traditional distribution methods, minors are manipulated into creating content themselves under the guise of interaction or roleplay. It is subtle, it is deceptive, and it is difficult to detect without direct observation.

Despite stricter enforcement by companies like Meta Platforms, these activities continue. They persist because the people behind them adapt. They use coded language. They blend into normal conversations. They operate in spaces that appear ordinary to outsiders.

So when I return to the issue of banning Roblox, I cannot ignore everything else I have seen. The platform is not the root. It is just one of many entry points. The government’s actions deserve recognition. There is effort, there is intent, and there is a clear attempt to protect children. But there is also a need to confront a more difficult reality. The problem is larger, deeper, and more complex than a single policy can address.

Because if there is one thing this investigation has made clear, it is this: the threat is not limited to one platform. It is not limited to one group of people. It exists across systems, across communities, and sometimes even within families. And unless those layers are addressed together, we will continue responding to the same problem over and over again, each time thinking we are seeing it for the first time.
