Washington: Contrary to the popular belief that language is limited to speech, a recent study reveals that people also apply the rules of their spoken language to sign language. According to researchers at Northeastern University, language is not simply about hearing sounds or moving our mouths, reports ANI.
When our brain is “doing language,” it projects abstract structure. The modality (speech or sign) is secondary. “There is a misconception in the general public that sign language is not really a language,” said researcher Iris Berent. “Part of our mandate, through the support of the NSF, is to reveal the complex structure of sign language, and in so doing, disabuse the public of this notion.”
To come to this conclusion, Berent’s lab studied words (and signs) that shared the same general structure. She found that people reacted to this structure in the same way, irrespective of whether they were presented with speech or signs.
In the study, Berent examined words and signs with doubling, that is, forms showing full or partial repetition. She found that responses to these forms shift depending on their linguistic context. When a word is presented by itself (or as a name for just one object), people avoid doubling. For example, they rate slaflaf (with doubling) worse than slafmak (with no doubling). But when doubling signaled a systematic change in meaning, participants preferred it.
Next, Berent asked what happens when people see doubling in signs (signs with two identical syllables). The subjects were English speakers who had no knowledge of a sign language. To Berent’s surprise, these subjects responded to signs in the same way they responded to the words.
They disliked doubling for singular objects, but they systematically preferred it if (and only if) doubling signaled plurality. Hebrew speakers showed this preference when doubling signaled a diminutive, in line with the structure of their language. “It’s not about the stimulus, it’s really about the mind, and specifically about the language system,” said Berent.
“Sign language has a structure, and even if you examine it at the phonological level, where you would expect it to be completely different from spoken language, you can still find similarities. What’s even more remarkable is that our brain can extract some of this structure even when we have no knowledge of sign language. We can apply some of the rules of our spoken language phonology to signs,” said Berent.
Berent says these findings show that our brains are built to handle very different types of linguistic inputs. The results confirm what some scientists have long thought but what hasn't truly been grasped by the general public: language is language, no matter what form it takes.