Learning · February 15, 2026 · 10 min read

How Reading Changes the Brain (2026 Study)

A landmark 2026 study reveals that learning to read fundamentally changes how your brain processes spoken words — even without text present. Here is what this means for language learners and why live captions may be the most underrated brain training tool available today.


TL;DR — A February 2026 study from Baycrest and the University of São Paulo found that learning to read permanently rewires how the brain processes spoken language. This has direct implications for anyone using captions, subtitles, or transcription for language learning through comprehensible input. Reading while listening — the exact mechanism behind live captions — creates dual memory traces that can boost vocabulary retention by up to five times.


Key Takeaways

  • Reading rewires the brain for spoken language, not just written text — even in the absence of any visual words
  • Phonemic awareness (hearing individual sounds within words) is significantly stronger in literate adults
  • Children's vocabulary is declining due to screen time replacing reading, according to a 2023 Oxford University Press report
  • Reading while listening creates "double traces" in memory (Dual Coding Theory), boosting retention dramatically
  • Same-language captions are more effective for language acquisition than translated subtitles

The 2026 Study That Rewrites What We Know

A reader absorbed in a book while wearing headphones — the intersection of reading and listening that rewires neural pathways


In early February 2026, Dr. Jed Meltzer and colleagues at Baycrest's Rotman Research Institute, working with collaborators at the University of São Paulo, published a study that challenges a long-held assumption: that reading and listening are separate cognitive skills.

The study, titled "Literacy modulates engagement of the right inferior frontal gyrus in phonological processing of spoken language" and published in Cortex, compared two groups of older adults:

  • Group A: Functionally illiterate adults learning to read later in life
  • Group B: Educated adults who had been reading since childhood

The literate group showed fundamentally different brain activation patterns when processing spoken words — not written ones. Reading had reshaped the neural circuitry responsible for phonological processing — the ability to break spoken language into individual sounds.

Three Ways Reading Changes the Brain

Dr. Meltzer explained that reading "unlocks more sophisticated cognitive abilities that extend beyond reading itself":

  1. Enhanced phonemic awareness — Literate adults can distinguish individual sounds within words more precisely, which is critical for processing both native and foreign languages
  2. Stronger short-term verbal memory — The capacity to hold spoken information (such as a new word or sentence) while processing it improves measurably
  3. Greater cognitive flexibility — The brain builds transferable skills that support learning across domains

This study provides the clearest evidence yet that literacy is not just about decoding text. It transforms the brain's entire language processing architecture — with direct implications for how we learn new languages at any age.


Why Children's Vocabulary Is Shrinking — And What Neuroscience Says

The Baycrest findings arrive at a moment of growing concern. In February 2026, lexicographer Susie Dent warned in The Guardian that children's vocabulary is shrinking as screen time replaces reading.

A 2023 Oxford University Press study found that two in five pupils had fallen behind in vocabulary development. The Baycrest research now provides the neuroscience to understand why: less reading means weaker phonemic awareness, reduced verbal memory, and a brain less equipped to process spoken language efficiently.

The Cascade Effect of Reduced Reading

Stage | What Happens | Consequence
1. Less reading | Brain gets fewer opportunities to build phonological pathways | Weakened sound discrimination
2. Weakened phonemic awareness | Difficulty distinguishing similar sounds in speech | Slower language processing
3. Reduced verbal memory | Harder to hold complex sentences and instructions | Academic and social challenges
4. Vocabulary gap widens | New words lack the reinforcing "double trace" reading provides | Compounding learning deficit

Source: Extrapolated from Meltzer et al. (2026) and Oxford University Press Word Gap Report (2023)

Dent recommended several remedies: reading aloud, listening to audiobooks, playing word games, and — notably — learning another language. She argued that studying a foreign language "can significantly aid in developing a love and understanding of English" itself.

The question becomes: what is the most effective way to combine reading and listening for language development?


Reading While Listening: The Dual Coding Advantage

The answer lies in a strategy called Reading While Listening (RWL) — simultaneously reading text while hearing the same content spoken aloud. And the research behind it is compelling.

Why Two Channels Beat One

Psychologist Allan Paivio's Dual Coding Theory (1991) proposes that the mind processes information through two distinct systems:

  • Verbal system: spoken words, linguistic structure, phonological patterns
  • Visual system: written text, images, spatial information

When both systems engage simultaneously — which is exactly what happens during Reading While Listening — the brain creates "double traces" that are measurably stronger than single-mode input. This is why combining reading and listening produces better outcomes than either alone.

Research Evidence for Reading While Listening

Outcome | Reading Only | Reading While Listening | Source
New vocabulary retained | Baseline | Up to 5x more words | Fabuly Research (2024)
Listening comprehension | Not engaged | Significantly improved | University of North Dakota
Reading fluency | Standard pace | Enhanced speed and accuracy | Dyslexic Advantage
Cognitive load | Can be high for L2 learners | Reduced (audio scaffolds reading) | Amara Research
Learner motivation | Variable | Increased (content feels engaging) | Transynergy.org

The Connection to How Reading Changes the Brain

The Baycrest study showed that reading strengthens the neural pathways for processing spoken language. RWL takes this further: by simultaneously engaging both reading and listening, it accelerates the very neural rewiring that makes literate adults better at understanding speech.

For language learners, this is transformative. Research on watching films for language acquisition has demonstrated that combining audio with text — even through simple subtitles — produces measurably better outcomes than audio alone.


How Captions Turn Everyday Audio Into Brain Training

Dual coding in action — a learner watches foreign content with live captions, engaging both visual and auditory processing simultaneously


Every time you watch a foreign film with subtitles, follow a podcast with transcription, or join a video call with live captions, your brain is doing exactly what the research says works: engaging both processing systems to build stronger language connections.

This is where tools like FluentCap become relevant. It transcribes any audio playing on your computer and displays live captions — creating an automatic Reading While Listening experience for any audio source.
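FluentCap's internals are not public, so purely as an illustration: assuming transcribed text segments arrive one at a time from some speech-to-text engine, the on-screen half of a live caption display reduces to a rolling window that keeps only the most recent lines visible. A minimal Python sketch (the function name and demo segments are hypothetical):

```python
from collections import deque

def live_caption_frames(segments, max_lines=2):
    """Simulate a live caption area: as each transcribed segment
    arrives, keep only the most recent `max_lines` visible,
    the way captions scroll during playback."""
    window = deque(maxlen=max_lines)
    frames = []
    for text in segments:
        window.append(text)
        frames.append(list(window))  # what the viewer sees at this instant
    return frames

# Demo: three segments arriving in order, two caption lines visible.
frames = live_caption_frames(["Hello everyone", "and welcome", "to the show"])
print(frames[-1])  # ['and welcome', 'to the show']
```

The `deque(maxlen=...)` does the scrolling for free: appending a new segment silently drops the oldest line once the window is full.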

The Cognitive Mechanism

When captions appear alongside audio:

  1. Auditory cortex processes the spoken words — training phonological awareness
  2. Visual cortex processes the written captions — reinforcing word recognition
  3. Both systems create dual traces — strengthening memory formation
  4. Cross-modal connections build — linking sounds to spelling to meaning

Same-Language vs. Translated Captions

An important nuance: research consistently shows that intralingual captions (same language as the audio) are more effective for language acquisition than translated subtitles. Same-language captions reinforce the direct connection between sounds and written forms, which is precisely the mechanism the Baycrest study identified as central to how reading changes the brain.

Watching foreign films with real-time same-language subtitles creates the ideal conditions for this dual-coded learning.


5 Science-Backed Ways to Strengthen Language Through Reading

Based on the convergence of the Baycrest study, Dual Coding Theory, and RWL research, here are five evidence-backed strategies:

1. Watch Foreign Content With Same-Language Captions

Instead of translated subtitles, use captions in the original language. This forces your brain to connect spoken sounds with their written forms — the exact mechanism the Baycrest study identified as critical. Start with content you have seen before; familiarity with the plot reduces cognitive load.

2. Follow Podcasts and Audiobooks With Transcription

Turn passive listening into active brain training by adding a text layer. When you see words as you hear them, your brain creates dual-coded memory traces. Listening to foreign audiobooks with live subtitles is one of the purest applications of this approach.

3. Read Along With Music in Another Language

Music provides an additional encoding channel beyond text and speech. Look up lyrics while listening to songs in your target language — combining auditory, visual, and musical processing for enhanced retention.

4. Practice "Echo Reading" — Read, Then Listen, Then Both

A technique from literacy education: first read a passage silently. Then listen to it spoken. Then do both simultaneously. This three-stage approach builds confidence before engaging the full dual coding mechanism. Using delayed captions for active listening training applies a similar principle — the slight lag trains your brain to predict before confirming.
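The delayed-caption idea above can be sketched in a few lines of Python. This is a toy illustration, not FluentCap's actual implementation, and every name in it is made up: buffer each transcribed segment with the time its audio was heard, and release it to the display only after a fixed lag, giving the listener a window to parse the audio before the text confirms it.

```python
from collections import deque

class DelayedCaptions:
    """Buffer caption segments and release them after a fixed lag.

    The lag (in seconds) lets the listener attempt to parse the
    audio before the text appears and confirms what was said.
    """

    def __init__(self, delay_seconds: float):
        self.delay = delay_seconds
        self._buffer: deque = deque()

    def push(self, timestamp: float, text: str) -> None:
        # Store the segment with the time its audio was heard.
        self._buffer.append((timestamp, text))

    def pop_ready(self, now: float) -> list:
        # Release every segment whose audio played at least `delay` ago.
        ready = []
        while self._buffer and now - self._buffer[0][0] >= self.delay:
            ready.append(self._buffer.popleft()[1])
        return ready

# Example: two segments heard at t=0s and t=2s; display lags by 3s.
caps = DelayedCaptions(delay_seconds=3.0)
caps.push(0.0, "Bonjour tout le monde")
caps.push(2.0, "et bienvenue")
print(caps.pop_ready(now=2.5))  # []  (nothing ready yet)
print(caps.pop_ready(now=3.0))  # ['Bonjour tout le monde']
print(caps.pop_ready(now=5.0))  # ['et bienvenue']
```

In a real caption loop, `pop_ready` would be polled on every UI refresh with the current playback clock, so the text always trails the audio by the chosen lag.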

5. Join Conversations With Captions Visible

For language learners in real conversations — video calls, meetings, or virtual events — live captions provide a real-time reading scaffold. You hear a new word, see it spelled, and your brain immediately starts building cross-modal connections. This is especially valuable for multilingual family video calls where multiple languages intersect.


Frequently Asked Questions

Does reading really change how the brain processes spoken language?

Yes. A 2026 study from Baycrest and the University of São Paulo found that literacy reshapes how the brain responds to spoken words. Literate adults showed enhanced phonemic awareness and stronger activation in phonological processing regions — even during purely auditory tasks with no text present. The changes operate at the neural architecture level, not just the behavioral level.

What is Reading While Listening and why does it work?

Reading While Listening means engaging with both text and audio simultaneously. It works because it activates two cognitive systems — verbal and visual — as described by Paivio's Dual Coding Theory (1991). This creates stronger memory traces than either reading or listening alone. Research from Fabuly (2024) found that RWL can improve vocabulary acquisition by up to five times compared to reading alone.

Are same-language captions better than translated subtitles?

For language acquisition, yes. Same-language captions reinforce the direct connection between spoken sounds and their written forms, which strengthens the phonological processing pathways identified in the Baycrest study. Translated subtitles help with comprehension but create weaker neural connections for the target language. For practical guidance, see how to learn languages by watching films with research-backed methods.

How does screen time affect children's vocabulary?

The relationship depends on the type of screen engagement. Passive screen time displaces reading and reduces vocabulary development — a 2023 Oxford University Press study found two in five children falling behind. However, active screen engagement with same-language captions or interactive reading apps can actually strengthen reading brain processing by providing the dual-coded input that builds phonemic awareness.

Can adults improve their brain's language processing through reading?

Yes. The Baycrest study included older adults who were learning to read later in life, demonstrating that the brain remains capable of this neural reorganization well beyond childhood. Combining reading with spoken language input — through comprehensible input methods or Reading While Listening — can strengthen language processing at any age. The cognitive benefits of bilingualism are also well-documented in adults.


Scientific References

  1. Meltzer, J. et al. (2026). Literacy modulates engagement of the right inferior frontal gyrus in phonological processing of spoken language. Cortex. DOI: 10.1016/j.cortex.2026.01.005

  2. Paivio, A. (1991). Dual coding theory: Retrospect and current status. Canadian Journal of Psychology, 45(3), 255–287. DOI: 10.1037/h0084295

  3. Oxford University Press (2023). Why Closing the Word Gap Matters: Oxford Language Report.

  4. Dent, S. (2026, February 9). Children's vocabulary is shrinking as reading is replaced by screen time. The Guardian.

  5. Vanderplank, R. (2016). Captioned Media in Foreign Language Learning and Teaching. Palgrave Macmillan.



FluentCap is made possible by speech-to-text providers who offer generous free tiers — up to 750 hours of transcription at no cost. When credits run out, their rates are just $0.15–0.40 per hour. We encourage you to learn more about our providers and support their work.


— FluentCap Team

Built to bring good things to the world.

Ready to Try FluentCap?

Download for free and start transcribing in under 2 minutes.

Download Now →


Written by our team of language technology specialists with expertise in applied linguistics, speech recognition, and cross-cultural communication. We're dedicated to making audio accessible to everyone.