Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
2,954 result(s) for "Syntactic Processing"
It's All in the Family: Brain Asymmetry and Syntactic Processing of Word Class
2015
Although left-hemisphere (LH) specialization for language is often viewed as a key example of functional lateralization, there is increasing evidence that the right hemisphere (RH) can also extract meaning from words and sentences. However, the right hemisphere's ability to appreciate syntactic aspects of language remains poorly understood. In the current study, we used separable, functionally well-characterized electrophysiological indices of lexico-semantic and syntactic processes to demonstrate RH sensitivity to syntactic violations among right-handers with a strong manual preference. Critically, however, the nature of this RH sensitivity to structural information was modulated by a genetically determined factor—familial sinistrality. The right hemisphere in right-handers without left-handed family members processed syntactic violations via the words' accompanying lexico-semantic unexpectedness. In contrast, the right hemisphere in right-handers with left-handed family members could process syntactic information in a manner qualitatively similar to that of the left hemisphere.
Journal Article
Distinguishing Syntactic Operations in the Brain: Dependency and Phrase-Structure Parsing
by Lopopolo, Alessandro; van den Bosch, Antal; Petersson, Karl-Magnus
in anterior temporal pole, Brain, Brain mapping
2021
Finding the structure of a sentence—the way its words hold together to convey meaning—is a fundamental step in language comprehension. Several brain regions, including the left inferior frontal gyrus, the left posterior superior temporal gyrus, and the left anterior temporal pole, are thought to support this operation, but their exact roles are still debated. In this paper we investigate the hypothesis that different brain regions could be sensitive to different kinds of syntactic computations. We compare the fit of phrase-structure and dependency-structure descriptors to activity in brain areas measured with fMRI. Our results show a division between areas with regard to the type of structure computed, with the left anterior temporal pole and left inferior frontal gyrus favouring dependency structures and the left posterior superior temporal gyrus favouring phrase structures.
Journal Article
Event-related brain potentials and second language learning: syntactic processing in late L2 learners at different L2 proficiency levels
2010
There are several major questions in the literature on late second language (L2) learning and processing. Some of these questions include: Can late L2 learners process an L2 in a native-like way? What is the nature of the differences in L2 processing among L2 learners at different levels of L2 proficiency? In this article, we review studies that addressed these questions using event-related brain potentials (ERPs) in late learners and that focused on syntactic processing. ERPs provide an on-line, millisecond-by-millisecond record of the brain's electrical activity during cognitive processing. ERP measures can thus provide valuable information on the timing and degree of neural activation as language processing (here: syntactic processing in L2) unfolds over time. After discussing the use of ERPs for the study of L2 learning and processing, we review electrophysiological studies on syntactic and morphosyntactic processing in late L2 learners with different levels of L2 proficiency. The currently available evidence indicates that patterns of neural activity in the brain during syntactic and morphosyntactic processing can be modulated by various, possibly interrelated, factors, including the similarity or dissimilarity of syntactic structures in L2 and L1, the exact nature of the syntactic structure L2 learners seek to comprehend and the concomitant expectancies they can generate with regard to violations in this structure, and the L2 learners' level of L2 proficiency. Together these studies show that ERPs can successfully elucidate subtle differences in syntactic processing between L2 learners and native speakers, and among L2 learners at different levels of L2 proficiency, differences that are difficult to detect or might remain undetected with behavioural measures.
Journal Article
Rapid Expectation Adaptation during Syntactic Comprehension
2013
When we read or listen to language, we are faced with the challenge of inferring intended messages from noisy input. This challenge is exacerbated by considerable variability between and within speakers. Focusing on syntactic processing (parsing), we test the hypothesis that language comprehenders rapidly adapt to the syntactic statistics of novel linguistic environments (e.g., speakers or genres). Two self-paced reading experiments investigate changes in readers' syntactic expectations based on repeated exposure to sentences with temporary syntactic ambiguities (so-called "garden path sentences"). These sentences typically lead to a clear expectation violation signature when the temporary ambiguity is resolved to an a priori less expected structure (e.g., based on the statistics of the lexical context). We find that comprehenders rapidly adapt their syntactic expectations to converge towards the local statistics of novel environments. Specifically, repeated exposure to a priori unexpected structures can reduce, and even completely undo, their processing disadvantage (Experiment 1). The opposite is also observed: a priori expected structures become less expected (even eliciting garden paths) in environments where they are hardly ever observed (Experiment 2). Our findings suggest that, when changes in syntactic statistics are to be expected (e.g., when entering a novel environment), comprehenders can rapidly adapt their expectations, thereby overcoming the processing disadvantage that mistaken expectations would otherwise cause. Our findings take a step towards unifying insights from research in expectation-based models of language processing, syntactic priming, and statistical learning.
Journal Article
The ubiquity of frequency effects in first language acquisition
by KIDD, EVAN; ROWLAND, CAROLINE F.; THEAKSTON, ANNA L.
in Acquisition, Child Language, Children
2015
This review article presents evidence for the claim that frequency effects are pervasive in children's first language acquisition, and hence constitute a phenomenon that any successful account must explain. The article is organized around four key domains of research: children's acquisition of single words, inflectional morphology, simple syntactic constructions, and more advanced constructions. In presenting this evidence, we develop five theses. (i) There exist different types of frequency effect, from effects at the level of concrete lexical strings to effects at the level of abstract cues to thematic-role assignment, as well as effects of both token and type, and absolute and relative, frequency. High-frequency forms are (ii) early acquired and (iii) prevent errors in contexts where they are the target, but also (iv) cause errors in contexts in which a competing lower-frequency form is the target. (v) Frequency effects interact with other factors (e.g. serial position, utterance length), and the patterning of these interactions is generally informative with regard to the nature of the learning mechanism. We conclude by arguing that any successful account of language acquisition, from whatever theoretical standpoint, must be frequency sensitive to the extent that it can explain the effects documented in this review, and outline some types of account that do and do not meet this criterion.
Journal Article
Syntactic processing is distributed across the language system
2016
Language comprehension recruits an extended set of regions in the human brain. Is syntactic processing localized to a particular region or regions within this system, or is it distributed across the entire ensemble of brain regions that support high-level linguistic processing? Evidence from aphasic patients is more consistent with the latter possibility: damage to many different language regions and to white-matter tracts connecting them has been shown to lead to similar syntactic comprehension deficits. However, brain imaging investigations of syntactic processing continue to focus on particular regions within the language system, often parts of Broca's area and regions in the posterior temporal cortex. We hypothesized that, whereas the entire language system is in fact sensitive to syntactic complexity, the effects in some regions may be difficult to detect because of the overall lower response to language stimuli. Using an individual-subjects approach to localizing the language system, shown in prior work to be more sensitive than traditional group analyses, we indeed find responses to syntactic complexity throughout this system, consistent with the findings from the neuropsychological patient literature. We speculate that such distributed nature of syntactic processing could perhaps imply that syntax is inseparable from other aspects of language comprehension (e.g., lexico-semantic processing), in line with current linguistic and psycholinguistic theories and evidence. Neuroimaging investigations of syntactic processing thus need to expand their scope to include the entire system of high-level language processing regions in order to fully understand how syntax is instantiated in the human brain.
•Participants matched sentences differing in syntactic complexity to pictures.
•fMRI revealed widespread syntactic complexity effects across the language system.
•We found these effects using individual functional localization of language regions.
•Traditional group analyses were less sensitive, finding only a few localized effects.
•Across regions, effect size correlated with sensitivity to language more broadly.
Journal Article
Neural dynamics differentially encode phrases and sentences during spoken language comprehension
2022
Human language stands out in the natural world as a biological signal that uses a structured system to combine the meanings of small linguistic units (e.g., words) into larger constituents (e.g., phrases and sentences). However, the physical dynamics of speech (or sign) do not stand in a one-to-one relationship with the meanings listeners perceive. Instead, listeners infer meaning based on their knowledge of the language. The neural readouts of the perceptual and cognitive processes underlying these inferences are still poorly understood. In the present study, we used scalp electroencephalography (EEG) to compare the neural response to phrases (e.g., the red vase) and sentences (e.g., the vase is red), which were close in semantic meaning and had been synthesized to be physically indistinguishable. Differences in structure were well captured in the reorganization of neural phase responses in the delta (approximately <2 Hz) and theta (approximately 2 to 7 Hz) bands, and in power and power connectivity changes in the alpha band (approximately 7.5 to 13.5 Hz). Consistent with predictions from a computational model, sentences showed more power, more power connectivity, and more phase synchronization than phrases did. Theta–gamma phase–amplitude coupling occurred, but did not differ between the syntactic structures. Spectral–temporal response function (STRF) modeling revealed different encoding states for phrases and sentences, over and above the acoustically driven neural response. Our findings provide a comprehensive description of how the brain encodes and separates linguistic structures in the dynamics of neural responses. They imply that phase synchronization and strength of connectivity are readouts for the constituent structure of language. The results provide a novel basis for future neurophysiological research on linguistic structure representation in the brain, and, together with our simulations, support time-based binding as a mechanism of structure encoding in neural dynamics.
Journal Article
Differential Electrophysiological Signatures of Semantic and Syntactic Scene Processing
2013
In sentence processing, semantic and syntactic violations elicit differential brain responses observable in event-related potentials: An N400 signals semantic violations, whereas a P600 marks inconsistent syntactic structure. Does the brain register similar distinctions in scene perception? To address this question, we presented participants with semantic inconsistencies, in which an object was incongruent with a scene's meaning, and syntactic inconsistencies, in which an object violated structural rules. We found a clear dissociation between semantic and syntactic processing: Semantic inconsistencies produced negative deflections in the N300-N400 time window, whereas mild syntactic inconsistencies elicited a late positivity resembling the P600 found for syntactic inconsistencies in sentence processing. Extreme syntactic violations, such as a hovering beer bottle defying gravity, were associated with earlier perceptual processing difficulties reflected in the N300 response, but failed to produce a P600 effect. We therefore conclude that different neural populations are active during semantic and syntactic processing of scenes, and that syntactically impossible object placements are processed in a categorically different manner than are syntactically resolvable object misplacements.
Journal Article
The nature and frequency of relative clauses in the language children hear and the language children read: A developmental cross-corpus analysis of English complex grammar
by DAWSON, Nicola J.; BANERJI, Nilanjana; HSIAO, Yaling
in Animacy, Child-directed speech, Children
2023
As written language contains more complex syntax than spoken language, exposure to written language provides opportunities for children to experience language input different from everyday speech. We investigated the distribution and nature of relative clauses in three large developmental corpora: one of child-directed speech (targeted at pre-schoolers) and two of text written for children – namely, picture books targeted at pre-schoolers for shared reading and children’s own reading books. Relative clauses were more common in both types of book language. Within text, relative clause usage increased with intended age, and was more frequent in nonfiction than fiction. The types of relative clause structures in text co-occurred with specific lexical properties, such as noun animacy and pronoun use. Book language provides unique access to grammar not easily encountered in speech. This has implications for the distributional lexical-syntactic features and associated discourse functions that children experience and, from this, consequences for language development.
Journal Article
The cocktail-party problem revisited: early processing and selection of multi-talker speech
How do we recognize what one person is saying when others are speaking at the same time? This review summarizes widespread research in psychoacoustics, auditory scene analysis, and attention, all dealing with early processing and selection of speech, which has been stimulated by this question. Important effects occurring at the peripheral and brainstem levels are mutual masking of sounds and “unmasking” resulting from binaural listening. Psychoacoustic models have been developed that can predict these effects accurately, albeit using computational approaches rather than approximations of neural processing. Grouping—the segregation and streaming of sounds—represents a subsequent processing stage that interacts closely with attention. Sounds can be easily grouped—and subsequently selected—using primitive features such as spatial location and fundamental frequency. More complex processing is required when lexical, syntactic, or semantic information is used. Whereas it is now clear that such processing can take place preattentively, there also is evidence that the processing depth depends on the task-relevancy of the sound. This is consistent with the presence of a feedback loop in attentional control, triggering enhancement of to-be-selected input. Despite recent progress, there are still many unresolved issues: there is a need for integrative models that are neurophysiologically plausible, for research into grouping based on other than spatial or voice-related cues, for studies explicitly addressing endogenous and exogenous attention, for an explanation of the remarkable sluggishness of attention focused on dynamically changing sounds, and for research elucidating the distinction between binaural speech perception and sound localization.
Journal Article