A cybersecurity executive confronts the threats she can’t control

I’ve spent two decades protecting banks from nation-state hackers and ransomware cartels. None of it prepared me for my nine-year-old daughter walking into the kitchen holding her iPad.

She wasn’t scared or hiding anything. She simply wanted permission to join TikTok so she could make dancing videos and “get more likes than her friends.”

Her request was innocent. My reaction wasn’t.

Behind her excitement, I saw the algorithms designed to shape how children see themselves. I saw the pressure to perform, the pull of validation, the vulnerability that predators, trolls, and opportunistic strangers online exploit long before kids understand the risks.

And I realized something unsettling: I could explain lateral movement in a compromised network to a board of directors, but I couldn’t explain to a fourth grader why chasing approval from millions of strangers might be dangerous.

The Dangers Are Already Inside

The second moment came more quietly. My five-year-old son sat near the window listening to the neighborhood kids play Roblox. He wasn’t playing—just absorbing the conversations. Sarcasm. Sharp language. Jokes and comments no kindergartener should recognize, let alone repeat.

He mimicked one under his breath. My heart dropped.

These weren’t bad kids. They were just kids inside digital spaces with no real age gates, no reliable moderation, and no guardrails for what the youngest members inadvertently absorb.

Suddenly, the risks I spent my career fighting in corporate environments weren’t abstract anymore. They were sitting on my couch.

The statistics confirm what many parents sense but struggle to articulate. Nearly all American tweens and teens are online every day, and younger kids now log several hours of recreational screen time daily, on top of schoolwork. A 2021 investigation by The Wall Street Journal found that TikTok's algorithm can surface harmful content to minors in under three minutes. Roblox, with over 70 million daily users, many of them under 13, has faced mounting concerns about grooming-related incidents. And the National Center for Missing and Exploited Children handled 36.2 million reports of suspected child sexual exploitation in 2023 alone, a staggering increase from previous years.

Corporate breaches can cost millions. The erosion of a child’s confidence or safety can reverberate for a lifetime.

The Gap Between Protection and Parenting

In my professional life, I architect multilayered security programs across continents. In my personal life, I couldn’t find the language to explain to my daughter why “likes” can become addictive, why not everyone online is who they appear to be, why certain conversation patterns signal danger, or why her voice, her image, and her privacy matter in ways she won’t fully understand until it’s too late.

Kids don’t learn best through rules and restrictions alone. They learn through stories, characters, and scenarios that help them develop their own internal compass.

Adults get frameworks and incident response playbooks. Children need characters who face dilemmas, make choices, and model the kind of discernment that turns into instinct.

This realization changed how I approached the problem. If digital safety education continues to rely primarily on warnings and prohibitions, we’re fighting a losing battle. Research in educational psychology consistently shows that narrative-based learning significantly improves retention and application compared to rule-based instruction. Stories bypass defensiveness. They create language for uncomfortable situations before those situations arise.

We teach fire safety before a fire. We practice earthquake drills before the ground shakes. We must teach digital safety before the harm occurs.

Building an Internal Defense System

In cybersecurity, we use a concept called “defense in depth”—layered protection so if one barrier fails, others remain intact. Children need the same approach: parental guidance, technical safeguards, school-based digital citizenship, platform accountability, and emotional intelligence paired with intuition.

I wanted to give children what my industry gives executives: clarity, confidence, and a sense of agency. More importantly, I wanted to help them develop something the online world rarely provides—a pause button, a gut check, a trusted inner voice.

That’s why I wrote a children’s book. Not as a substitute for parental oversight or technical controls, but as one layer in a comprehensive approach to digital safety.

The book follows five kids, each representing different ways children navigate the digital world: the inventive problem-solver, the cautious double-checker, the impulsive risk-taker, the creative coder, and the loyal friend who simply wants to belong. Five digital archetypes. Five reminders that no child is “too smart” or “too careful” to be targeted.

Alongside them are three mentors—not authority figures who swoop in to rescue, but guides who model a child’s intuition. They represent the inner voice that whispers: “Does this feel right?” “Who benefits if I click this?” “Is this attention worth it?” “What would a trusted adult say?”

Technology isn’t the villain in this story. Disconnection from intuition is.

Through their adventure, the characters encounter scenarios many children face but few feel equipped to discuss: what manipulation feels like, how attention and validation can distort judgment, how to handle uncomfortable conversations in games, and when to pause, ask questions, and seek help.

Why Fiction Accomplishes What Warnings Cannot

The response from early readers confirmed what I’d hoped. Parents tell me their children are asking new questions: “Who can see my videos?” “Why does this app need my birthday?” “Is this a red flag?”

Teachers say the story opens conversations that previously felt preachy or overwhelming. Kids say they finally understand how to trust their instincts when something feels wrong online.

This is what effective prevention looks like for children online—not fear-based warnings, but tools for discernment woven into a narrative that respects children’s intelligence while acknowledging their vulnerability.

Stories create language for early warning signs. Children learn to articulate feelings like “It felt weird when that person asked for a picture” or “I didn’t like what they said in the game” or “I clicked because everyone else did, but it didn’t feel right.” That language becomes the foundation for disclosure, for seeking help, for making better choices the next time a situation escalates.

The Work That Remains

Writing a book was one response to a problem that demands many more. Digital safety requires sustained, coordinated effort across multiple domains: comprehensive parental involvement, robust technical safeguards, school-based digital citizenship curricula, genuine platform accountability, and yes, stories that help children strengthen their internal radar.

I’ve spent two decades protecting institutions from catastrophic loss. But protecting a child’s digital wellbeing—my own children included—may be the most critical security challenge I’ve encountered.

Our kids are growing up in a world we're still learning to navigate. They deserve stories that prepare them. Characters who empower them. Adults who refuse to outsource their children's safety entirely to algorithms and app settings.

This book represents the most urgent incident response plan I’ve ever built—written before the breach, for the audience that needs it most.


Lakshmi Srinivasan is a cybersecurity executive with over 20 years of experience protecting financial institutions from digital threats. She has led security programs at Truist, BNP Paribas, and Bank of Montreal, managing teams across North America and Europe. She is the author of The Cyber Squad Chronicles: The Crowned Glitch and writes about digital safety on her website.
