The Neuroscience of Language

Language is a key part of what it means to be human. We're a deeply social species, so we use language to communicate and connect with our fellow human beings. But language is complex, and that complex communication requires some equally complex neural architecture to handle the job. I'm Alie Astrocyte, and this week on Neuro Transmissions, let's talk about lex…icon.

Language is generally considered universal among humans and unique to our species. Amazingly, babies learn their primary language just by being exposed to it; no special teaching is required. And deaf kids who grow up without formal sign language will often invent their own "home signs" to communicate with family and friends. Of course, language is an extremely complex behavioral system, so it's not easy to figure out exactly where in the brain it comes from. Early neuroscientists didn't have tools
like fMRI or EEG to look inside the brain the way we do today, which made the neuroscience of language hard to study; it's difficult to get a close look at what's going on in the brain during language production. The earliest research on the brain regions involved in language came from case studies of patients who had suffered brain lesions, injuries resulting in tissue death. This kind of research helped doctors discover two key brain regions: Broca's area and Wernicke's area.

Broca's area was discovered by the French physician Paul Broca. Damage to it produces Broca's aphasia, also called expressive aphasia: patients know exactly what they want to say, but they can't find the words or grammar to properly express it. Wernicke's area was discovered by the German neurologist Carl Wernicke, who first noticed the link between damage to the back of the superior temporal gyrus and a particular type of difficulty with speech. This aphasia makes it difficult for patients to properly understand language, but they have no trouble producing it; they speak in fluent, grammatical sentences that often don't make any sense. That's why Wernicke's aphasia is sometimes called fluent aphasia.

Using more modern techniques, researchers have begun to learn new things about other brain regions that play important roles in language. One test, called the Wada test, has helped demonstrate how language is localized to one side of the brain or the other. Don't try this at home: a barbiturate is injected into one carotid artery, which puts that half of the brain to sleep. Doctors can then run cognitive tests to determine what patients can and can't do while half of the brain isn't functioning. This kind of testing has shown that most people process language mainly in the dominant hemisphere, usually the left, which controls the right side of the body. When that hemisphere is put to sleep, patients have a hard time speaking and understanding language.

In an earlier video, we talked about how we hear using the auditory cortex, which, it should come as no surprise, is one of the earliest places where speech is processed. This is where the brain consciously registers the sound of words and begins to process them. Humans mostly use speech to communicate, so language is largely an auditory ability, and it turns out that many of the brain regions we use to hear sounds are also used to interpret speech. More recent neuroimaging studies have led researchers to propose that language information gets split into a dual-stream model, similar to the two processing streams we talked about in the visual system. In this model, information is sent along the dorsal pathway to brain regions on both sides of
the brain. This pathway is sort of the "speech" part of the system: it translates the information it gets from speech input and allows us to reproduce those same speech patterns. Much like the dorsal stream in the visual system is the "where" pathway that helps us navigate space, these dorsal streams help us coordinate our movements, including our mouth movements! The ventral stream, by contrast, passes information along within the dominant hemisphere and is more the "what" pathway. For language, this means the ventral stream allows us to identify the content of speech and understand what words mean. It's what lets you hear the word "ball", identify it, and visualize a ball in your mind, providing context for the words you're hearing. Researchers are using many different tools to keep breaking down the ways our brain hears, processes, and interprets
language. Some scientists are even using animal models to answer questions about different aspects of it. Different bird species communicate with unique songs and are studied as models for complex vocabulary and grammatical processing, while some primates, like the tiny marmoset, are used to study the neural basis of social communication. These models won't answer every question we have about language in the brain, but they can help us understand it better, so with time we'll know more about this complex and unique behavior!

Thanks for joining us for this episode of
Neuro Transmissions! If you liked it, please click the thumbs up
button. And if you haven’t already, hit subscribe to become a Brainiac! If you really love what we do, please consider donating to our Patreon account. We love creating these videos, and every little bit of support helps us keep doing what we’re doing. Until next time, I’m Alie Astrocyte. Over and out!
