The Road to Digital Minimalism
Reclaiming My Attention in a World That Tracks My Eyes

I used to think social media was just part of modern life - a noisy background, sometimes useful, mostly "for fun."
Now I see it differently: like walking into a casino where the house not only rigs the games, but also watches my eyes, heart rate, and micro-movements to redesign the room in real time.
That realization didn't come from a motivational quote. It came from reading actual research on eye tracking and contextual AI, and from listening to people who walked away from the slot machines for long enough to see clearly.
I'm fascinated by digital minimalism because I don't just want to use technology - I want to understand what it's doing back to me. My goal is simple, but not easy: one full month with zero social media, documenting how my mind, mood, focus, and relationships change, and then asking honestly whether going back is even worth it.
To understand why this experiment matters, you have to look at what social media has become - and what the latest AI + eye-tracking research says about where it's going.
Scrolling Is the New Smoking
In their TEDxFargo talk, "Scrolling Is the New Smoking," The Minimalists describe something that felt uncomfortably familiar: the sense of being trapped in digital consumerism - constantly wanting "more" even when nothing satisfying ever really arrives.
They weren't casual users.
They reached half a billion people and gained 4 million followers in a single year. By all conventional metrics, social media was "working" for them. And yet they still walked away from it for a year.
They shared a few numbers that cut through the denial:
- The average person spends nearly two hours a day on social media.
- Those who spend more than two hours are almost three times more likely to experience depression.
- Social media use is associated with about a 40% higher risk of sleep problems.
- 46% of adolescents feel worse about their body image because of it.
- 60% of users say it negatively affects their self-esteem.
These aren't just side effects. They're signals.
What struck me is how they described the quality of the experience: stimulation without satisfaction. The constant sense of being plugged into something - news, trends, opinions - but not actually nourished by any of it.
They broke down three core problems with social media:
- Obligation - The pressure to keep up with every headline, hot take, and update, as if missing one makes you irrelevant.
- Overconsumption - The endless scroll, engineered to convert boredom into compulsive engagement.
- Discontent - The quiet shift from intentional searching ("I need this information") to aimless scrolling ("Show me something to feel something").
When they quit for a year, they didn't just "use their phones less." Their behavior rewired:
- They stopped instinctively reaching for their phones.
- Relationships deepened through actual calls and texts.
- They had more time and energy for creative work.
- They felt calmer and had more room for ideas to incubate.
They paid a price - their revenue dropped 21%, and their online reach went to zero. But they gained something harder to measure: agency.
And that's before we even get to the part that genuinely changes the stakes: eye tracking and AI.
From Scroll Data to Eye Data
Most people understand that platforms track clicks, likes, watch time, and scroll behavior. That's old news.
What's less visible is how far this has pushed into attention-level signals - and how close it is to literal gaze tracking, whether you call it that or not.
Consider what Instagram and TikTok already do with "behavioral signals":
- How long you hover before scrolling.
- When your scroll slows or stops.
- Whether your face is oriented toward the screen.
- How you react (expressions, micro-pauses) to different types of content.
- The rhythm and timing of your interactions.
Even without formal "eye tracking," those patterns approximate where your attention is and what holds your gaze longest. Add in high-resolution front cameras (already used for AR filters, face landmarks, effects), and the gap between "behavioral analytics" and "de facto gaze tracking" shrinks quickly.
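To make that concrete, the "micro-pause" signal can be derived from nothing more than timestamped scroll positions. Here is a minimal sketch of the idea; the event format, field names, and the 0.5-second pause threshold are my own assumptions for illustration, not any platform's actual telemetry:

```python
# Hypothetical sketch: estimating per-item "dwell time" from raw scroll
# samples -- the kind of behavioral signal that approximates gaze.
# Event format and threshold are assumptions, not a real platform API.

def dwell_times(scroll_events, stop_threshold=0.5):
    """Accumulate time the feed sat still on each item.

    scroll_events: list of (timestamp_sec, item_id) samples, in order.
    Returns {item_id: total seconds the scroll paused on that item}.
    """
    totals = {}
    for (t0, item), (t1, _next_item) in zip(scroll_events, scroll_events[1:]):
        gap = t1 - t0
        if gap >= stop_threshold:  # scroll paused here: count it as attention
            totals[item] = totals.get(item, 0.0) + gap
    return totals

events = [(0.0, "post_a"), (0.2, "post_b"), (1.5, "post_b"), (4.0, "post_c")]
print(dwell_times(events))  # post_a is scrolled past too fast to register
```

Nothing here requires a camera; a timestamped scroll log alone is enough to rank which items held you.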
But then I read something that pulled this into sharp focus.
What Meta's Eye-Gaze Paper Actually Shows
In 2025, researchers at Meta Reality Labs published a paper called "Eye Gaze as a Signal for Conveying User Attention in Contextual AI Systems."
Stripped of academic phrasing, the core message is straightforward:
Eye gaze is a powerful signal for understanding what a user cares about and what they are about to do - and AI models can exploit that.
They did two main things:
- Measured how accurate eye tracking needs to be to reliably know which real-world object you're looking at.
- Tested how much eye-gaze history helps a vision-language model (VLM) answer questions like: "What am I looking at?" and "What am I going to interact with?"
1. How accurate is "good enough"?
They used the Aria Digital Twins dataset - egocentric recordings of people doing everyday tasks in a fully furnished apartment: cooking, cleaning, decorating, working. Every object in the scene was digitally tracked in 3D, and the glasses captured eye tracking data.
They looked at:
- Near-field objects: within 1 meter.
- Mid-field objects: between 1 and 2 meters.
- Interacted objects: things people actually touched, pressed, picked up.
- Fixated objects: what people deliberately looked at.
What they found:
- For fixated objects within 2 meters, and for near-field objects, the object's angular size is usually larger than the tracking error. The system can reliably tell what you're looking at most of the time.
- For objects you interact with (grabbing, pressing, etc.), the visual footprint is even larger (10+ degrees). When you're about to do something with an object, eye tracking can almost always identify it correctly.
- Mid-field objects (1-2 meters) are more borderline; around half of them are too small to reliably identify at a 3-degree tracking error.
Translation: For anything close to you - especially things you touch - today's wearable eye tracking is already accurate enough to know what you're focused on. This isn't speculative. It's measured.
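The "good enough" question above reduces to simple geometry: an object is reliably identifiable when its angular size exceeds the tracker's angular error. A quick sketch of that comparison, using the roughly 3-degree error figure discussed above; the example object sizes are my own, not the paper's:

```python
import math

def angular_size_deg(object_width_m, distance_m):
    """Angular size (in degrees) of an object of a given width
    seen from a given distance -- standard geometry."""
    return math.degrees(2 * math.atan(object_width_m / (2 * distance_m)))

def identifiable(object_width_m, distance_m, tracker_error_deg=3.0):
    """An object can be reliably picked out when its angular size
    is larger than the eye tracker's angular error."""
    return angular_size_deg(object_width_m, distance_m) > tracker_error_deg

# A 20 cm object (roughly kettle-sized), as an illustrative example:
print(angular_size_deg(0.20, 0.5))  # near field (0.5 m): ~22.6 degrees
print(angular_size_deg(0.20, 2.0))  # mid field (2 m): ~5.7 degrees
print(identifiable(0.02, 2.0))      # a 2 cm object at 2 m falls below 3 degrees
```

This is why near-field and interacted objects are easy targets: up close, almost everything subtends far more than the tracking error.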
2. How much does gaze history help AI understand you?
Next, they fed gaze data into a Meta Llama 3.2 90B VLM to see how much better it could understand user attention and predict actions.
They used scanpaths - the sequence of previous fixations - as context.
Task E1: "What am I looking at?"
Results:
- With no gaze context, the model got the right object only 10.3% of the time.
- When they added scanpath history, accuracy climbed and peaked at about 24.8% with 6 prior fixations.
So just by telling the model what you looked at over the last few moments, they more than doubled its ability to figure out what you're currently focused on.
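The mechanics of "telling the model what you looked at" can be pictured as serializing recent fixations into text that rides along with the image in the prompt. A minimal sketch of that idea; the fixation format and prompt wording are my own assumptions, not the paper's actual implementation:

```python
# Hypothetical sketch of scanpath-as-context: turn the last N fixations
# into a text block a vision-language model prompt could include.
# Data format and wording are illustrative assumptions only.

def scanpath_context(fixations, n=6):
    """fixations: list of (object_label, duration_ms), oldest first.
    Returns a text block describing the last n fixations."""
    recent = fixations[-n:]
    lines = [f"{i + 1}. looked at '{obj}' for {ms} ms"
             for i, (obj, ms) in enumerate(recent)]
    return ("Recent gaze history (oldest first):\n" + "\n".join(lines) +
            "\nQuestion: what is the user looking at now?")

path = [("kettle", 300), ("mug", 450), ("tap", 200), ("mug", 600)]
print(scanpath_context(path, n=3))
```

The point is how little it takes: a handful of labeled fixations, formatted as plain text, is enough context to more than double the model's accuracy in the paper's first task.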
Task E2: "What am I going to interact with?"
They looked only at frames where the user would physically interact with an object within the next second.
Results:
- With gaze context, accuracy reached around 49.5% at its peak.
Almost 50/50 odds of correctly predicting what object you're about to touch - based just on where your eyes have been looking and the current scene image.
The authors describe this as proof that:
- Gaze isn't just about the present. It encodes intent, task context, and action plans.
- Scanpaths have temporal structure, much as language has patterns over sequences, and VLMs can leverage that structure.
They also acknowledge something most users will never read: "Eye movements reveal personal information and preferences, and any contextual AI system using eye tracking must be secure and privacy-preserving." Whether systems will actually be built that way is another question.
What This Means for the "Casino"
Put this together with how social platforms already work:
- They optimize for what holds your gaze longest.
- They tune the feed using scroll rhythm, micro-pauses, hesitation, and replays.
- High-resolution cameras and sensors can capture micro-expressions, face orientation, and pupil changes.
- Now we know: eye tracking + VLMs can infer what you're looking at right now, predict what you will interact with next, and use your gaze history like a contextual prompt.
The casino has moved from:
"Which thumbnail will you click?"
to:
"Given your recent scanpath, what object, topic, or content will capture you next - and how can we get there first?"
That's the world I'm choosing to test myself against when I say: "I want to go one month with zero social media."
This is not a productivity challenge. It's a sovereignty test.
The Road to Digital Minimalism
The Minimalists offered a useful metaphor: social media as a digital table with three legs:
- Content Consumption - Ideally: intentional, replenishing, not draining.
- Creativity - Using tools to make things, not just react.
- Connection - Using platforms to express care, love, and purpose - not comparison and status.
But most people aren't sitting at a stable three-legged table.
They're standing at a slot machine that pays out dopamine, insecurity, distraction, and occasional validation.
After their year offline, The Minimalists didn't swear off technology forever. They set rules:
- Casino Rule: Decide how much time you'll spend before you step into the app.
- Replace Scrolling with Searching: Never go online without a question or purpose.
- Make Your Phone Boring: Grayscale, no hyper-stimulating icons.
- Declutter Apps: Remove anything that exists purely to hijack time and attention.
- Entryway Rule: Leave the phone by the door so it doesn't follow you everywhere.
These aren't gimmicks. They're friction - deliberate resistance against systems designed to remove all friction between impulse and behavior.
My One-Month Experiment
Given everything above, my experiment is simple:
One month. Zero social media.
Not a "detox." A deliberate interruption of the attention market.
During that month, I plan to track:
- Sleep quality: When I go to bed, how well I sleep, and whether my mind feels quieter at night.
- Mood and anxiety: How often I feel restless, scattered, or "itchy" for stimulation.
- Focus blocks: How many uninterrupted deep work sessions I get per day.
- Cravings: When and how often I feel the urge to reach for social media - and what triggers it.
- Relationships: Whether I text, call, or see people in person more.
- Creativity: Whether ideas come more easily when I'm not constantly consuming.
I'll also apply some of the Minimalists' rules to everything else digital:
- Phone on grayscale.
- No addictive apps on the home screen.
- Phone parked away from me when I'm working, reading, praying, or resting.
- Online time driven by questions and projects, not boredom.
At the end of the month, I want to be able to answer honestly:
- Do I think more clearly?
- Do I feel more grounded in my own life, not everyone else's?
- Do I miss social media - or do I just miss the easy escape it gave me?
- Now that I know how far eye-tracking and AI have gone, do I actually want to keep feeding them my attention?
Conclusion: Walking Away from the Table
The research is clear:
- Our eyes are not just seeing; they are speaking a language that AI systems are learning to read.
- Our attention is not just drifting; it is being measured, modeled, and monetized with increasing precision.
- Our "free" platforms are not free; the cost is paid in fragments of focus, sleep, self-esteem, and agency.
Digital minimalism, for me, is no longer about being "less online" or "more productive."
It's about refusing to be an unthinking data source for systems that can already guess what I'm about to do next.
The road to digital minimalism starts with a decision:
Step away from the machine long enough to remember what it feels like when my nervous system, my gaze, and my mind belong to me - and not to an algorithm that sees my eyes as just another input stream.
One month is the first step.
The real question is what I'll see when I finally look back with my eyes fully my own.