A team of neuroscientists has found new support for MIT linguist Noam Chomsky's decades-old theory that we possess an "internal grammar" that allows us to comprehend even nonsensical phrases.
New York: "One of the foundational elements of Chomsky's work is that we have a grammar in our head, which underlies our processing of language," explains David Poeppel, the study's senior researcher and a professor in New York University's Department of Psychology. "Our neurophysiological findings support this theory: we make sense of strings of words because our brains combine words into constituents in a hierarchical manner--a process that reflects an 'internal grammar' mechanism."
The research builds on Chomsky's 1957 work, Syntactic Structures, which posited that we can recognise a phrase such as "Colourless green ideas sleep furiously" as both nonsensical and grammatically correct because we have an abstract knowledge base that allows us to make such distinctions even though the statistical relations between the words are non-existent.
Neuroscientists and psychologists predominantly reject this viewpoint, contending that our comprehension does not result from an internal grammar; rather, it is based on statistical relationships between words and on sound cues that signal structure. That is, we know from experience how sentences should be properly constructed--a reservoir of information we draw on when hearing words and phrases. Many linguists, in contrast, argue that hierarchical structure building is a central feature of language processing.
In an effort to illuminate this debate, the researchers explored whether and how linguistic units are represented in the brain during speech comprehension.
To do so, Poeppel, who is also director of the Max Planck Institute for Empirical Aesthetics in Frankfurt, and his colleagues, conducted a series of experiments using magnetoencephalography (MEG), which allows measurements of the tiny magnetic fields generated by brain activity, and electrocorticography (ECoG), a clinical technique used to measure brain activity in patients being monitored for neurosurgery.
The study's subjects listened to sentences in both English and Mandarin Chinese in which the hierarchical structure between words, phrases, and sentences was dissociated from intonational speech cues--the rise and fall of the voice--as well as statistical word cues. The sentences were presented in an isochronous fashion--identical timing between words--and participants listened to predictable sentences (e.g., "New York never sleeps," "Coffee keeps me awake"), grammatically correct but less predictable sentences (e.g., "Pink toys hurt girls"), word lists ("eggs jelly pink awake"), and various other manipulated sequences.
The design allowed the researchers to isolate how the brain concurrently tracks different levels of linguistic abstraction--sequences of words ("furiously green sleep colorless"), phrases ("sleep furiously," "green ideas"), or sentences ("Colorless green ideas sleep furiously")--while removing intonational speech cues and statistical word information, which many say are necessary in building sentences.
Their results showed that the subjects' brains concurrently tracked three components of the speech they heard--words, phrases, and sentences--reflecting a hierarchy in our neural processing of linguistic structures.
"Because we went to great lengths to design experimental conditions that control for statistical or sound cue contributions to processing, our findings show that we must use the grammar in our head," explains Poeppel. "Our brains lock onto every word before working to comprehend phrases and sentences. The dynamics reveal that we undergo a grammar-based construction in the processing of language."
This is a controversial conclusion from the perspective of current research, the researchers note, because the notion of abstract, hierarchical, grammar-based structure building is rather unpopular.