
Brain and Language

Volume 52, Issue 2, February 1996, Pages 342-364

Regular Article
Phonological Processing and the Role of Strategy in Silent Reading: Behavioral and Electrophysiological Evidence

https://doi.org/10.1006/brln.1996.0016

Abstract

To examine the contribution of phonological processing in silent reading, 51 native English speakers made decisions about targets presented either in word pairs or in sentences. The target words were homophonically (plain–plane), orthographically (plane–place), or semantically (plane–jet) related. N200 was enhanced only to homophonic targets, suggesting the use of phonological information in silent reading. Memory load did not affect the N200 amplitude. N400 was enhanced to all semantically incongruent words and was larger in the word pair condition. Reaction times were influenced by both experimental condition and target relationships; homophonic stimuli elicited the fastest RTs in the word pair condition and the slowest RTs in the sentence condition, suggesting the use of different strategies. Thus, ERP components and behavioral responses registered different aspects of language processing.


Cited by (54)

  • Measuring the influence of phonological neighborhood on visual word recognition with the N400: Evidence for semantic scaffolding

    2020, Brain and Language
    Citation Excerpt:

    However, the work of Deacon et al. (2004) suggests that the N400 itself likely comprises the cumulative effects of prelexical orthographic and phonological processes. Several ERP studies have employed homophones and/or pseudohomophones to investigate the role of phonological information in word recognition (Connolly, Phillips, & Forbes, 1995; Newman & Connolly, 2004; Niznikiewicz & Squires, 1996; Ziegler, Benraïss, & Besson, 1999). In the Niznikiewicz and Squires (1996) study, critical stimuli were semantically, orthographically, or homophonically related, or unrelated, to a prime stimulus or preceding sentence context.

  • The interplay of phonology and orthography in visual cognate word recognition: An ERP study

    2012, Neuroscience Letters
    Citation Excerpt:

    Specifically, modulations between 50 and 100 ms after stimulus onset were typically interpreted as indicating that the initial access code for word recognition is phonological in nature [1]. Later modulations (150–250 ms) were taken as an index of the activation of conflicting codes at the prelexical stage [14]. In addition, the activation of phonological information was also observed in the time window between 350 and 550 ms [14,12].

  • Modulation of brain regions involved in word recognition by homophonous stimuli: An fMRI study

    2011, Brain Research
    Citation Excerpt:

    A recent MEG study suggested, however, that under certain task requirements activation in the STGp and IFG may occur too late to be involved in phonological processes that are involved, for instance, in assessing a letter string's lexicality (Simos et al., 2009). Several EEG studies are particularly relevant to the current study, as they involved the use of homophonous stimuli to assess phonology's role in reading-related tasks (Braun et al., 2009; Newman and Connolly, 2004; Niznikiewicz and Squires, 1996; Ziegler et al., 1999). While the majority of these studies found evidence of homophonous effects, the timing of such effects varied considerably across studies, ranging from effects occurring as early as 150 ms post-stimulus (Braun et al., 2009) to those occurring at 400 ms and even later (Newman and Connolly, 2004; Ziegler et al., 1999).
