The First Shot Against Persuasive Technology

Growing up in Zimbabwe, I never had the luxury of being raised by algorithms. There was no endless scroll. No dopamine slot machine in my pocket. Attention was shaped by people, not platforms. Boredom existed. Silence existed. Presence existed. At the time, it felt ordinary. In retrospect, it feels like an advantage I didn't know I had.
I don't use X. I have never used Snapchat in my life. I opened Facebook once in the last four months—out of curiosity, not habit. Instagram was the hardest to leave. I fought it longer than I want to admit. It's almost comical that even deleting your account requires a YouTube tutorial. The deletion option is buried so deeply it feels intentional. A maze. A quiet jail you're not supposed to notice. They did not design an exit. They designed containment.
I went from an hour a day to an hour a week. My current goal is one hour a month. And no, this wasn't discipline. It was withdrawal. Anyone who pretends otherwise is lying to themselves.
The same pattern shows up in dating apps. I've never used them, but watching from the outside is enough. They sell abundance. Infinite options. The illusion that there is always someone better, one swipe away. It's seductive—and false. You don't meet more people. You devalue the ones in front of you. Choice becomes noise. Connection becomes disposable.
What's strange is that a small group of us—my friends in Ditto, in Social—are actively resisting this. Not loudly. Not performatively. Quietly. Intentionally. It feels like we are inheriting a world that has been booby-trapped. And instead of accepting it, we are trying to disarm it.
This isn't nostalgia. It's clarity. We are not anti-technology. We are anti-exploitation.
And I'm not writing this as an outsider.
I'm a software engineer. I know exactly how these systems are built. I understand the mechanics behind engagement loops, infinite scroll, variable rewards, notification engineering, A/B testing for compulsion, and the careful calibration of uncertainty. I know how the colors are chosen, how the timing is tuned, how the randomness is engineered. I know what it takes to get people hooked.
Once you see it, you can't unsee it.
That knowledge creates a problem. Because if you understand how the trap is built and still build it, you are no longer neutral. You are complicit.
So I started reading. Not casually. Seriously. About addiction. About adolescence. About cognition. About the long-term effects of persuasive design on a developing brain.
The teenage brain, between ages 10 and 19, is in a critical developmental window. The socio-affective system, which governs emotion and reward, is highly active. The prefrontal cortex, responsible for impulse control and long-term reasoning, is not fully developed until the mid-twenties. This mismatch makes teens uniquely vulnerable to engineered reward systems.
Modern apps exploit this. They use variable reward mechanisms—the same principle used in slot machines. You swipe without knowing whether you'll see a funny video, a like, or nothing at all. That uncertainty releases dopamine. The brain learns to check again. And again. And again.
Over time, focus erodes. Constant notification pinging fragments attention. Deep work becomes physically uncomfortable. Reading feels slow. Silence feels wrong.
Then comes the comparison trap.
Research consistently shows that how teens use social media matters more than how much they use it. Passive scrolling—watching other people live without participating—correlates with higher rates of depression than active interaction. The highlight reel becomes the baseline. Teens begin to believe their peers are happier, richer, more attractive, more fulfilled. Not because it's true, but because the algorithm is selective.
The result is not motivation. It is quiet inadequacy.
This effect is statistically more pronounced in teenage girls. Anxiety. Sadness. A sense of not measuring up. Not because they are failing, but because the system is lying.
Visual-first platforms make it worse. Filters, editing tools, and now real-time video manipulation create beauty standards that do not exist. Teens are taught, implicitly, that their worth is quantifiable. Likes become validation. Absence of likes becomes rejection. Body dysmorphia is no longer rare. Disordered eating is no longer shocking. Even brief exposure to "fitspiration" content has been shown to lower body esteem.
And unlike the past, there is no escape.
Bullying used to end when you went home. Now it follows you into your bedroom. Anonymity removes inhibition. Screens create distance. Words become sharper. Cruelty becomes easier. Once something is posted, it is permanent. The internet does not forget. And neither do teenagers.
Then there is what no one talks about enough: displacement.
Sleep is the first casualty. Blue light suppresses melatonin. Mental stimulation delays rest. Sleep-deprived teens are emotionally brittle. Resilience collapses. Small problems feel catastrophic.
In-person skills follow. Non-verbal cues fade. Tone is misread. Empathy weakens. Conflict resolution suffers. We are training a generation to communicate without presence.
This is not accidental. It is engineered.
And here is the uncomfortable part: I am in the same industry that designs these systems.
That is why I decided to build something different.
Not a productivity app. Not another habit tracker. Not another notification manager.
A social-consensus anti-addiction system.
The idea is simple, but intentionally uncomfortable.
If you want to unlock a blacklisted app during a focus session, you don't just tap a button. You must request approval from people you trust—your guardians. Two out of three must agree. And you must explain yourself. Out loud. In your own voice.
If consensus is reached, you get a temporary pass. Fifteen minutes. Then it locks again.
If you try to delete the app, it doesn't fail silently. It alerts your guardians. It shows them the streak you are about to destroy. It forces accountability into the open.
There is a shared garden. If you stay focused, it grows for everyone. If you break your commitment, it withers. Your choices affect other people. Just like in real life.
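The garden mechanic reduces to shared state that every member's behavior moves. A minimal sketch, assuming illustrative growth and decay rates (the +5/-10 asymmetry is my own choice here, not a stated spec):

```python
class SharedGarden:
    """A group resource: everyone's focus grows it, anyone's lapse shrinks it."""

    def __init__(self, members):
        self.members = set(members)
        self.health = 50  # 0 = withered, 100 = thriving

    def complete_focus_session(self, member: str) -> None:
        # One person's focus benefits the whole group.
        self.health = min(100, self.health + 5)

    def break_commitment(self, member: str) -> None:
        # A lapse costs the group more than a session gains,
        # so breaking a streak is never a wash.
        self.health = max(0, self.health - 10)
```

The deliberate asymmetry means the group cannot simply "earn back" a broken commitment one-for-one, which mirrors how trust works between people.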
This is not gentle. It is not polite. It is not optimized for user delight.
It is optimized for truth.
We do not need more frictionless experiences. We need more meaningful resistance. We do not need more convenience. We need more consequences.
Persuasive technology has had a decade to prove that it will regulate itself. It has failed.
So this is my first shot.
Not against social media. Against the assumption that software must exploit human weakness to be successful. Against the belief that engagement is the same as value. Against the quiet normalization of addiction as a business model.
I am choosing to build on the side of human discipline, not human impulse. That will cost me growth. It will cost me users. It will cost me money.
I am fine with that.
Because some lines, once crossed, cannot be uncrossed. And some technologies, once built, cannot be justified.
This is where I stand.