Improvisation for two: for children, language is music sung by their mother

For a 9-month-old baby, the rules of language are the music sung by the mother. So says a pilot study published April 12 in the journal Science Advances by a team of researchers from the Universities of Padua and Barcelona, reproduced on the website www.icsem.it. Until now, it was believed that children, even the most precocious, could understand the construction of sentences only after their first year of age. Instead, the rhythm of the mother tongue (its prosody) allows the little ones to grasp even two-syllable sentences.

For example, the authors of the research explain, we understand very well that in the sentence “she sleeps well”, “she” is the subject of “sleeps”; but we grasp the same link in the sentence “she, who does not drink coffee, sleeps well”, where “she” and “sleeps” are separated by several other words. But how does a young child’s brain find the connections between words that are separated from one another by other words in a sentence?

Relationships between distant words

According to the study, conducted by Ruth de Diego-Balaguer and Ferran Pons of the Institute of Neurosciences of the University of Barcelona in collaboration with Ana Martínez-Álvarez and Judit Gervain of the University of Padua and the CNRS in Paris, the brain of 9-month-old babies is already sensitive to the prosodic regularities that structure sentences, and it grasps hierarchical sequences by listening to the melody of the language. By observing the babies’ behavior and measuring their brain responses, the researchers noticed that when grammatically related words were distinguished by intonation, the children were better able to understand the relationships between distant words.
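
To give a concrete feel for this kind of experimental design, here is a toy sketch in Python of a non-adjacent (AxB) regularity, the pattern family typically used in infant artificial-grammar studies: the first syllable predicts the last one across a variable middle element, and a prosodic cue can make the related pair stand out. The syllables and the emphasis marking are invented for illustration; they are not the stimuli of the study.

```python
import random

# Toy illustration of a non-adjacent (AxB) regularity: the first syllable
# predicts the last one across a variable middle element. The syllables and
# the emphasis marking are invented; they are NOT the study's stimuli.

RULES = {"pe": "ga", "to": "mu"}          # A predicts B across a gap
MIDDLE = ["li", "ra", "do", "ke", "fo"]   # the variable x element

def make_item(prosody=False):
    a = random.choice(list(RULES))
    x = random.choice(MIDDLE)
    b = RULES[a]
    if prosody:
        # Uppercase stands in for the pitch/duration cues that make the
        # related A and B syllables stand out from the middle one.
        return f"{a.upper()}-{x}-{b.upper()}"
    return f"{a}-{x}-{b}"

for _ in range(3):
    print(make_item(prosody=True))   # e.g. PE-ra-GA, TO-li-MU
```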

Boolean gates

The research, published in Science Advances, feeds into the debate on the innateness of language that we discussed in connection with Noam Chomsky and Andrea Moro’s book The Secrets of Words (La nave di Teseo). Chomsky launched the idea that the general rules respected by all languages are innate and reside in the cognitive mechanisms of our brains, just as the logic gates “and” and “or”, together with a few other rules and with recursive structures, make up every computer program. This was the starting point for his “generative grammar”.
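
Chomsky’s computational analogy can be made concrete: a handful of rewrite rules, applied recursively, generate an unbounded set of sentences. The following minimal sketch is an invented illustration of that idea, not an example taken from the article or from Chomsky.

```python
import random

# Minimal sketch of a generative grammar: four rewrite rules, two of which
# are mutually recursive, generate an unbounded set of sentences.

GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"], ["the", "N", "who", "VP"]],   # recursion: NP may contain a VP
    "VP": [["sleeps"], ["sees", "NP"]],                # recursion: VP may contain an NP
    "N":  [["child"], ["mother"]],
}

def expand(symbol, depth=0):
    if symbol not in GRAMMAR:                  # terminal: an actual word
        return [symbol]
    rules = GRAMMAR[symbol]
    # Past a certain depth, fall back to the first (non-recursive) rule
    # so the expansion is guaranteed to terminate.
    rule = rules[0] if depth > 4 else random.choice(rules)
    return [word for part in rule for word in expand(part, depth + 1)]

print(" ".join(expand("S")))   # e.g. "the mother who sees the child sleeps"
```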

Experimental studies

As for Andrea Moro, through experimental studies he has shown that “even without instruction, the brain is able to recognize possible rules as opposed to impossible ones, that is, rules based on hierarchy as opposed to rules based on linear order”. Not only that: “The network that the brain activates when faced with impossible rules is not the same as the one used for possible rules.” Chomsky sums it up: Andrea Moro’s experiments identified “the neural correlates of the distinction between possible and impossible languages, focusing on a crucial property: the role of linear order and of the structures created by the mind.” Andrea Moro speaks of children’s “stem brain”: an open neural foundation that can develop into any of the thousands of existing languages.
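
Moro’s actual experimental materials are not reproduced in the article, but the contrast between a structure-dependent rule and a linear-order rule can be sketched in a few lines of code, using Chomsky’s classic question-formation example rather than Moro’s stimuli.

```python
# Classic structure-dependence contrast (a standard textbook example,
# not Moro's experimental materials): forming a yes/no question.

sentence = "the child who is sleeping is happy".split()

def possible_rule(words):
    # Hierarchy-based: front the MAIN-clause auxiliary. The first "is" is
    # buried inside the relative clause "[who is sleeping]", so the rule
    # must see structure; in this sentence the main "is" is the last one.
    i = len(words) - 1 - words[::-1].index("is")
    return [words[i]] + words[:i] + words[i + 1:]

def impossible_rule(words):
    # Linear-order-based: front the FIRST "is", counting left to right.
    # Children never try this rule, and no human language uses it.
    i = words.index("is")
    return [words[i]] + words[:i] + words[i + 1:]

print(" ".join(possible_rule(sentence)))    # is the child who is sleeping happy
print(" ".join(impossible_rule(sentence)))  # is the child who sleeping is happy
```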

From the brain to the vocal cords

It is clear that the brain and language are inextricably linked: just consider what subtle work the brain must do to control the vocal apparatus in pronouncing words and sentences, and what no less complex work the auditory apparatus must do to extract an interpretable vocal message. But the debate is still open on how these abilities evolved in Homo sapiens and how they are acquired by children.

The role of “short-term memory”

A simple but interesting proposal is the one developed by Morten H. Christiansen (professor of psychology at Cornell University) and Nick Chater (professor of behavioral science at Warwick Business School, UK) in the book The Language Game (Ponte alle Grazie, 364 pages, 22 euros). Their idea is that language is essentially a two-person improvisation, paced by the short-term memory limits of the human mind: a game of charades, extended to groups of speakers, which only later consolidates into normative structures (vocabulary, grammar, syntax). The analogy between natural language and computer language does not hold, because computers operate sequentially on very long strings of raw information. Natural language, on the other hand, processes complex blocks of meaning made up of short sequences that our short-term memory can manage.
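
Christiansen and Chater elsewhere call this eager compression of the input “chunk-and-pass” processing. The toy sketch below illustrates the idea under invented assumptions (the phrase inventory and the buffer size are made up): raw words are folded into larger units as soon as possible, because the raw stream itself decays too quickly to be held.

```python
# Toy sketch of "chunk-and-pass" processing. The phrase inventory and the
# buffer size are invented: the point is only that raw words are compressed
# into larger units as soon as possible, freeing the tiny buffer.

CHUNKS = {
    ("the", "child"): "NP(the child)",
    ("drinks", "milk"): "VP(drinks milk)",
}

BUFFER_LIMIT = 4  # stand-in for the short-term memory bottleneck

def chunk_and_pass(stream):
    buffer = []
    for word in stream:
        buffer.append(word)
        # Compress eagerly: replace a recognized sequence at the end of
        # the buffer with a single higher-level chunk.
        for pair, chunk in CHUNKS.items():
            if tuple(buffer[-len(pair):]) == pair:
                buffer[-len(pair):] = [chunk]
        assert len(buffer) <= BUFFER_LIMIT, "short-term memory exceeded"
    return buffer

print(chunk_and_pass(["the", "child", "drinks", "milk"]))
# -> ['NP(the child)', 'VP(drinks milk)']
```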

Complexity without a blueprint

The explicit reference is to the famous institute for the study of complex systems in Santa Fe, New Mexico, founded by physicist Murray Gell-Mann, where the authors spent a period of study. Languages accumulate and organize themselves spontaneously and randomly; they then become institutionalized, while continuing to transform. “Nobody designs language,” Christiansen and Chater write. “Its complexity and order emerge from the chaos of countless linguistic games of charades. In all of them, speakers are concerned exclusively with making themselves understood by a particular person on a particular occasion. Yet, generation after generation, patterns of incredible richness and sophistication emerge.”

A virtuous circle

The evolutionary advantage that Homo sapiens derived from the “gift” of language is enormous. Historian Yuval Harari emphasizes this well in his best-selling book Sapiens and in a recent article in The Economist. Without the ability to exchange complex messages (first gestural and oral, then also written), our civilization would not exist; we would have no literature, philosophy, science, or technology. Christiansen and Chater argue that this is the result of the spontaneous “virtuous circle” of mutual reinforcement that developed between reason and language.

The key question

Understanding a word, a sentence, or a period made up of several clauses is conditioned by the “short-term memory” that speakers can exercise in a linguistic exchange: the distance between subject, verb, and complement cannot exceed what our mental mechanisms allow, although cadence and pauses, accents and conjunctions make it possible to lengthen the chains of words and sentences. The cognitive mechanisms of language would thus reduce to this, a much simpler hypothesis than Chomsky’s generative grammar, in which language is roughly a “bodily organ”, comparable to a bird’s wing, that unfolds by executing an organic program. This does not alter the fact, as Christiansen and Chater point out, that some grammatical structures have proven more effective than others, with no need to invoke deep neural mechanisms: statistically, among the roughly seven thousand languages considered, the word order is subject-object-verb in 43 percent of cases, subject-verb-object in 40 percent, and object-subject-verb in only 0.3 percent.

Artificial intelligence

In the closing pages, a mention of the linguistic abilities of ChatGPT and similar artificial-intelligence algorithms is inevitable. Christiansen and Chater’s conclusion is that no AI can truly simulate natural language because, while drawing on a practically unlimited repertoire of information, it cannot distinguish meaning from nonsense. Algorithms that speak and write are no more playing charades than a chimpanzee is: “Charades is about being able to improvise, using knowledge flexibly and creatively, and each round of charades can build on the previous one through all kinds of metaphorical turns (…) Words do not have static meanings, but evoke loose webs of interconnected meanings.”

Terms vs. Words

Paraphrasing Roland Barthes, we can say that ChatGPT speaks and writes using “terms”, not words. Terms are defined strictly by the dictionary; words, by the infinite contexts and expressive registers in which they can be used. “ChatGPT doesn’t simulate a human brain; it doesn’t have a brain, that’s all,” Christiansen and Chater say. This is why, according to the reassuring conclusion of their book, artificial intelligence will never surpass natural intelligence. Natural and artificial intelligence play different games on different fields. There is no “singularity” in sight, because “language will save us”.

“Prosodic cues enhance infants’ sensitivity to nonadjacent regularities”, Science Advances, 2023. Authors: Ana Martínez-Álvarez, Judit Gervain, Elena Koulaguina, Ferran Pons, Ruth de Diego-Balaguer.
