Digital Distraction: Reclaim Attention Without Quitting Technology

Jonny Miller with Jay Vidyarthi · 2025-09-30 · Podcast Guide

About the guest

Jay Vidyarthi

Jay Vidyarthi is a writer, meditator, designer, and product leader working to restore attention as a depleted natural resource. He is the founder of Still Ape and the author of Reclaim Your Mind: Seven Strategies to Enjoy Tech Mindfully, with a body of work focused on mindfulness, human-centred design, and healthier relationships with technology.


Listen to the episode

Episode 78 · Jay Vidyarthi · 1:09:40

Reclaiming attention starts with honest contact, not another rule

Digital distraction is usually framed as a failure of discipline. Delete the app. Turn off the notifications. Buy the dumb phone. Be stronger.

Jay Vidyarthi brings a more useful frame to this conversation with Jonny Miller. He loves technology, designs technology, and still takes the attention economy seriously. That combination matters. The aim is not to become pure enough to escape screens forever. The aim is to notice how specific technologies interact with your body, needs, attention, relationships, and values — then design a more honest relationship with them.1

The practical question underneath the episode is simple: when a device, platform, game, feed, inbox, or AI tool pulls your attention, can you feel what is actually happening before you obey the pull?

That one moment of awareness is the first “spell” Jay names in his defence against the dark arts of distraction.2 Not a grand life overhaul. Not a purity vow. A pause long enough to recognise: something is trying to move my attention, and I get to participate in what happens next. For NSM readers, this is nervous-system work as much as productivity work: attention narrows, breath catches, urgency spikes, the hand moves toward the phone, and the body starts acting before the mind has chosen.

There is research context for taking this seriously without exaggerating it. Reviews on smartphones and cognition suggest that heavy mobile-tech habits are associated with differences in attention, memory, and delay of gratification, while also noting that the evidence base is still developing and many findings are correlational.3 Other experiments suggest that even a phone’s presence or its notifications can carry attentional costs in certain settings.4 Jay’s contribution is not “science proves all tech is bad.” It is more grounded: learn the difference between technology that nourishes you and technology that keeps recruiting your unexamined loops.

Stop treating “technology” as one thing

One of the cleanest moves in the episode is Jay’s refusal to talk about technology as a single category.5

For one person, Substack may feel reflective and wholesome. For another, it might trigger hypervigilance around being seen, received, liked, or respected. TikTok might be a compulsive escape for one nervous system and a genuinely bounded play break for another. Work email might be a useful tool at 10am and a worthiness trap at 10pm.

So the better question is not “Is this app good or bad?” It is:

  • What state am I in before I open it?
  • What part of me is reaching for it?
  • What promise does it seem to be making?
  • What happens in my body while I am using it?
  • How do I feel ten minutes after I stop?

This is where Jonny and Jay connect technology use to parts, unmet needs, and self-compassion. Jay gives examples of healthy needs — play, agency, being useful, being valued — getting routed through digital loops that only partially satisfy them.6 The need itself may be legitimate. The loop may still be costly.

That distinction reduces shame. If late-night email is trying to deliver a sense of worth, yelling at yourself for checking email rarely resolves the deeper pattern. A more precise inquiry might be: What am I hoping this inbox will tell me about my value, safety, or belonging?

Watch for the body-signature of false urgency

The attention economy often works by manufacturing urgency. Countdown timers, badges, unread counts, breaking-news banners, view metrics, hearts, streaks, read receipts, personalised ads, and auto-play all say some version of: now, now, now.

Jay’s second “spell” is clarity: seeing through the conceptual illusions of the interface.7

A heart does not necessarily mean someone deeply received your work. A view may mean someone watched for two seconds. A “match” may not be real acceptance. A bot that talks warmly may still lack the awareness of you that a person can offer. A countdown timer is not a neutral feature; it is a small coercive structure asking your nervous system to keep going.

The body often registers this before the story catches up:

  • the thumb hovering before you realise you have decided;
  • a tiny clench in the jaw when a notification arrives;
  • breath becoming shallow while refreshing analytics;
  • a collapse after scrolling past something that triggered comparison;
  • a spike of heat or pressure when an ad hooks insecurity;
  • the restless sense that the next swipe, email, or video might finally settle something.

None of these signals mean you are broken. They are data. If you can feel them while they are happening, the spell is already weaker.

Build boundaries that make the allowed thing more beautiful

Jay is not arguing for harsher self-control. His most practical reframe is that a good boundary should create a better ritual on the side of the line you actually want to live on.8

A brittle rule says: “I am not allowed to watch more than one episode.”

A ritualised boundary says: “I am going to choose one film or episode with care, put other distractions away, make a snack, enjoy it fully, and let it end cleanly.”

That difference matters. The first version depends on resistance. The second version increases presence, satisfaction, and completion. The boundary is still real, but it is no longer organised around deprivation.

Jay gives the example of intentionally watching one show or movie per day, not as punishment, but as a way of restoring the magic of the experience: choosing well, settling in, removing competing inputs, and enjoying it without guilt.9 He also describes creative constraints with music — treating his listening like an art gallery with a limited number of “resident” artists, so attention can deepen instead of scattering across infinite options.10

The nervous-system logic is clear. Infinite choice can keep the system scanning. A clean container lets the body receive what is here.

Practice

Run the before–during–after technology audit

Jay’s suggested experiment is simple: for one day, or ideally one week, track the felt arc of your technology use instead of only tracking minutes.11

  1. Choose a tracking window. One full day is enough to learn something. A week will reveal stronger patterns.
  2. Before each interaction, note the entry state. Ask: “What made me reach for this? What am I feeling in my body? What am I hoping this will give me?”
  3. During the interaction, watch the pull. Notice urgency, ease, pleasure, tightening, comparison, connection, curiosity, numbness, or relief-seeking.
  4. Afterwards, record the residue. Do you feel clearer, nourished, connected, flat, agitated, avoidant, inspired, or scattered?
  5. Look for one pattern, not a total life diagnosis. Which one app, inbox, feed, game, AI tool, or media habit most obviously wants a boundary, ritual, redesign, or conversation?
  6. Design one kinder constraint. Make the allowed side of the boundary more beautiful: better timing, better setting, fewer inputs, more intention, or a clearer ending.
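For readers who like a structured log, the audit above could be kept in a small script. This is only an illustrative sketch, not anything Jay prescribes; the entry fields and sample data are invented for the example:

```python
from dataclasses import dataclass
from collections import Counter


@dataclass
class AuditEntry:
    """One before-during-after observation of a technology interaction."""
    app: str          # which app, feed, inbox, game, or AI tool
    entry_state: str  # what you felt in your body before reaching for it
    during: str       # what you noticed while using it
    residue: str      # how you felt roughly ten minutes after stopping


def residue_patterns(entries):
    """Count how often each app leaves each kind of residue, so one clear
    pattern stands out instead of a total life diagnosis."""
    patterns = Counter()
    for entry in entries:
        patterns[(entry.app, entry.residue)] += 1
    return patterns


# A made-up day of entries, for illustration only.
log = [
    AuditEntry("email", "anxious", "urgency, refreshing", "agitated"),
    AuditEntry("email", "bored", "scanning for worth", "agitated"),
    AuditEntry("music", "tired", "ease, settling in", "nourished"),
]

patterns = residue_patterns(log)
# ("email", "agitated") appearing twice marks a candidate for one kinder constraint.
print(patterns.most_common(1))
```

The point of the tally is step 5: surfacing the single loop that most obviously wants a boundary, ritual, or redesign, based on felt evidence rather than guilt.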

The goal is not constant self-monitoring. The goal is to build enough awareness that your next design choice comes from lived evidence rather than guilt.

AI makes awareness more important, not obsolete

Jonny raises the concern that AI-generated and AI-personalised content will increasingly learn how to press very specific buttons in us.12 Jay agrees that AI intensifies the need for awareness because the internet was already a constellation of tools triggering desire, insecurity, and fear; AI accelerates the precision and intimacy of that process.13

The useful stance here is not panic. It is discernment.

When you interact with an AI system, ask two things at once:

  1. What is this actually? A model trained on human data, not a conscious friend. A useful tool in some contexts, but not a being that can truly know you.
  2. What is happening in me? Am I becoming clearer, more capable, and more connected to reality? Or am I outsourcing contact with my own uncertainty, body, relationships, and choices?

This matters especially when the tool feels emotionally responsive. Jay points out that AI can function like a parasocial relationship: the system may simulate attunement, but it is not aware of you in the reciprocal way another human can be.14

That does not make AI useless. It makes clarity essential. A spell that looks like companionship can still be a tool. A tool that feels magical still needs a boundary.

Teach self-reference before screen-time obedience

Near the end of the episode, Jay brings the conversation into parenting. The point generalises beyond children.

He argues that the deeper skill is not simply counting minutes. The deeper skill is learning to notice internally when a technology has become too much, then practising the act of stopping.15

For adults, that might sound like:

  • “My eyes are tired and I am no longer enjoying this.”
  • “I am refreshing because I feel anxious, not because I need new information.”
  • “I said I wanted connection, but this is making me feel more alone.”
  • “This game was fun for an hour. Now I am chasing control.”
  • “This AI thread started useful. Now I am using it to avoid a real conversation.”

External rules can help. Timers, blockers, app limits, and notification settings are all forms of design. But the long-term capacity is self-reference: the ability to consult your own body, values, and consequences before the platform’s momentum decides for you.

That is why this episode lands as nervous-system work. Digital agency is not just a thought. It is the trained ability to notice activation, craving, narrowing, and relief-seeking early enough to choose a cleaner next move.

Key takeaways

  • Jay’s approach to digital distraction is tech-positive and agency-focused: enjoy the best of technology while becoming clearer about what captures you.
  • “Technology” is too broad a category. Track specific apps, feeds, inboxes, games, media, and AI tools by their before–during–after effect on your nervous system.
  • Digital loops often hook healthy needs — play, agency, worth, belonging, productivity — through partial substitutes that may not truly satisfy.
  • Awareness is the first defence: notice the moment your attention, breath, posture, urgency, or emotional state begins to shift.
  • Strong boundaries work better when the allowed side becomes a ritual, not a deprivation strategy.
  • AI increases the need for clarity because emotionally responsive tools can feel relational without being reciprocally aware.
  • The practical experiment is to audit your technology interactions for one day or one week, then design one kinder constraint from real evidence.

Free assessment

Take the free nervous system assessment.

If digital urgency keeps pulling you out of choice, the assessment can help you map your current stress patterns and find a steadier next step for regulation.



References

  1. Jay Vidyarthi, Defence Against the Dark Arts of Distraction, 06:53–10:05.
  2. Jay Vidyarthi, Defence Against the Dark Arts of Distraction, 15:18–16:30 and 37:14–37:31.
  3. For research context, Wilmer, Sherman, and Chein review evidence linking smartphone habits with attention, memory, and delay of gratification, while repeatedly cautioning that the literature was still limited and often correlational. See “Smartphones and Cognition: A Review of Research Exploring the Links between Mobile Technology Habits and Cognitive Functioning,” Frontiers in Psychology (2017), https://pmc.ncbi.nlm.nih.gov/articles/PMC5403814/.
  4. Ward et al. found in two experiments that the mere presence of one’s own smartphone reduced available cognitive capacity in some tasks, especially among people reporting higher smartphone dependence: “Brain Drain,” Journal of the Association for Consumer Research (2017), https://doi.org/10.1086/691462. Stothart, Mitchum, and Yehnert found that receiving phone notifications alone disrupted performance on an attention-demanding task: “The Attentional Cost of Receiving a Cell Phone Notification,” Journal of Experimental Psychology: Human Perception and Performance (2015), https://doi.org/10.1037/xhp0000100. These findings support caution around device salience and alerts; they do not imply every technology interaction is harmful.
  5. Jay Vidyarthi, Defence Against the Dark Arts of Distraction, 20:23–23:24.
  6. Jonny Miller and Jay Vidyarthi, Defence Against the Dark Arts of Distraction, 23:24–26:20.
  7. Jay Vidyarthi, Defence Against the Dark Arts of Distraction, 33:38–35:42 and 37:14–38:10.
  8. Jay Vidyarthi, Defence Against the Dark Arts of Distraction, 43:16–46:19.
  9. Jay Vidyarthi, Defence Against the Dark Arts of Distraction, 43:53–45:06.
  10. Jay Vidyarthi, Defence Against the Dark Arts of Distraction, 52:49–54:13.
  11. Jay Vidyarthi, Defence Against the Dark Arts of Distraction, 1:06:13–1:07:11.
  12. Jonny Miller, Defence Against the Dark Arts of Distraction, 28:24–29:46.
  13. Jay Vidyarthi, Defence Against the Dark Arts of Distraction, 29:46–33:25.
  14. Jay Vidyarthi, Defence Against the Dark Arts of Distraction, 29:46–33:25.
  15. Jay Vidyarthi, Defence Against the Dark Arts of Distraction, 1:02:58–1:05:46.