During speech processing, human listeners can separately analyze lexical and intonational cues to arrive at a unified representation of communicative content. The evolution of this capacity is best investigated through comparative studies. Using functional magnetic resonance imaging, we explored whether and how dog brains segregate and integrate lexical and intonational information. We found a hemispheric bias for processing meaningful words, independently of intonation; an auditory brain region for distinguishing intonationally marked and unmarked words; and increased activity in primary reward regions only when both lexical and intonational information were consistent with praise. Neural mechanisms to separately analyze and integrate word meaning and intonation in dogs suggest that this capacity can evolve in the absence of language.

The first study to investigate how dog brains process speech shows that our best friends in the animal kingdom care about both what we say and how we say it. Dogs, like people, can separately process words and intonation, and praise activates dogs' reward center only when both words and intonation match, according to a study in Science. (http://science.sciencemag.org/content...)

[Correction note (6 April 2017): The authors noticed that the directions left and right were inadvertently switched in reporting the results from dogs' brains in this study. In fact, dogs showed a right-hemispheric bias for processing words, and a left-hemisphere brain region processed intonation. This has been corrected in the online version of the paper. Importantly, this direction change does not affect the study's conclusions. The authors apologize for this error and any confusion it may have caused.]