Integration of Touch and Sound in Auditory Cortex

Summary

To form a coherent percept of the environment, our brain combines information from different senses. Such multisensory integration occurs in higher association cortices, but it has been proposed that it also occurs in early sensory areas. Confirming the latter hypothesis, we unequivocally demonstrate supra-additive integration of touch and sound stimulation at the second stage of the auditory cortex. Using high-resolution fMRI of the macaque monkey, we quantified the integration of auditory broad-band noise and tactile stimulation of hand and foot in anaesthetized animals. Integration was found posterior to and along the lateral side of the primary auditory cortex, in the caudal auditory belt. Integration was stronger for temporally coincident stimuli and obeyed the principle of inverse effectiveness: greater enhancement for less effective stimuli. These findings demonstrate that multisensory integration occurs early, close to primary sensory areas, and (because it occurs in anaesthetized animals) suggest that this integration is mediated by preattentive bottom-up mechanisms.

Introduction

Our different senses provide complementary views of the environment. Merging information across senses yields a comprehensive "picture" of sensory objects and is necessary for reliable interaction with our environment. For example, a stimulus presented to one sense facilitates the detection of novel events in other senses by directing attention to the proper location, and a stimulus presented to several senses simultaneously expedites processing. Incongruities between senses, however, can lead to unexpected percepts. Prominent examples are the erroneously perceived spatial location of a speaker in the ventriloquist effect and the altered perception of touch by sound (the parchment-skin illusion). These and further examples demonstrate the importance of proper multisensory integration for successful performance in everyday tasks, yet our understanding of the underlying processes is still limited.

To understand multisensory phenomena, we need to determine where and how information from different senses is combined. Pioneering experiments revealed multisensory convergence in several subcortical nuclei and, at the cortical level, in areas of the parietal, temporal, and frontal lobes. These results fit well with the notion that multisensory convergence occurs in higher association cortices, from which multisensory signals are relayed to (subcortical) areas involved in planning and executing actions. On this hypothesis, multisensory integration occurs only after unisensory information has been thoroughly processed along its specific sensory hierarchy.

Recent results, however, contrast with this view and suggest that multisensory interactions can occur in early sensory areas. Several studies using functional magnetic resonance imaging (fMRI) described multisensory activations in or close to areas considered unisensory (see also several reviews). Electrophysiological recordings in monkeys likewise demonstrated responses to touch and visual stimulation in areas of the auditory cortex (Fu et al., 2003). Some of these multimodal activations in early sensory areas can be attributed to top-down signals from association areas. Others, however, have been argued to result from feed-forward processing (Murray et al., 2004). These results suggest that early multisensory interactions exist and might not depend on feedback from association cortices. Instead, integration might arise directly from feed-forward processing and could occur in supposedly unisensory cortices.

Here, we substantiate this hypothesis of early multisensory integration and unequivocally demonstrate integration of sound and touch signals in the auditory cortex of anaesthetized macaque monkeys. Using high-resolution fMRI, we uncover responses to tactile and auditory stimulation in the auditory belt and demonstrate supra-additive enhancement of responses to combinations of these stimuli. This enhancement meets the classical requirements for multisensory integration: it is stronger for temporally coincident stimuli and obeys the principle of inverse effectiveness. Demonstrating such integration in anaesthetized animals rules out attentive effects and suggests a pure feed-forward origin of this multisensory integration. By probing multisensory integration with high-resolution fMRI of the macaque auditory cortex, the present results bridge the gap between human-imaging studies demonstrating multisensory interactions in a variety of behavioral paradigms and neurophysiological as well as anatomical studies in monkeys demonstrating nonauditory input to auditory areas.

Results

Responses to Auditory, Tactile, and Combined Stimulation

MR images of the blood oxygenation level-dependent (BOLD) response were obtained for stimulation with auditory broad-band noise (Sound condition), tactile stimulation of the palm and foot (Tactile), and a combination of these stimuli (Tactile&Sound) (Figure 1A). These stimuli were presented in an alternating block design separated by baseline periods devoid of stimulation (Figure 1Ab). MR image slices were oriented either parallel to the lateral sulcus (Figure 1A) or coronally (Figure 1B). The auditory cortex covers the lower bank of the lateral sulcus, and thus a few slices parallel to this structure capture the relevant area. Figure 1 displays the responses to each stimulus condition together with an approximate anatomy-based delineation of the auditory cortex.
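For analysis purposes, a block paradigm like the one in Figure 1Ab is commonly modeled as boxcar regressors convolved with a haemodynamic response function. The following is a minimal sketch of that idea; the block timings, TR, and double-gamma HRF parameters are illustrative assumptions, not the values used in this study:

```python
import numpy as np
from scipy.stats import gamma

def hrf(t, peak=6.0, undershoot=16.0, ratio=1.0 / 6.0):
    """Double-gamma haemodynamic response function (SPM-style defaults)."""
    h = gamma.pdf(t, peak) - ratio * gamma.pdf(t, undershoot)
    return h / h.sum()

def block_regressor(onsets_s, duration_s, n_scans, tr=2.0):
    """Boxcar for one stimulus condition convolved with the HRF."""
    t = np.arange(n_scans) * tr
    box = np.zeros(n_scans)
    for onset in onsets_s:
        box[(t >= onset) & (t < onset + duration_s)] = 1.0
    kernel = hrf(np.arange(0.0, 32.0, tr))
    # Truncate the convolution back to the scan length.
    return np.convolve(box, kernel)[:n_scans]

# Hypothetical design: Sound, Tactile, and Tactile&Sound blocks of 20 s,
# presented in randomized order and separated by rest periods.
n_scans = 150
design = np.column_stack([
    block_regressor([20.0, 260.0], 20.0, n_scans),   # Sound
    block_regressor([100.0, 180.0], 20.0, n_scans),  # Tactile
    block_regressor([60.0, 220.0], 20.0, n_scans),   # Tactile&Sound
])
```

Such a design matrix (plus a constant term) is what the voxelwise activation maps in Figure 1 would be fit against.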


Figure 1 Data from Two Example Experiments with Tactile, Sound, and Combined Tactile&Sound Stimulation


(A) Experiment with slices oriented parallel to the lateral sulcus (animal M1). (a) Activation maps for the three conditions overlaid on anatomical images. Colored voxels represent significant activations (see colorbar). Prominent anatomical structures have been labeled to facilitate interpretation (white), and an outline of the auditory cortex based on anatomical landmarks is indicated (magenta). (b) Schematic of imaging paradigm. Stimuli were presented randomized in a block paradigm and separated by rest periods. (c) Activation maps for Tactile condition in a more dorsal slice showing activations in somatosensory cortex. (d) Positioning of image slices. Lateral and superior temporal sulci have been enhanced and delineated for better visibility, and yellow slices correspond to those displayed in (a). (e) Time course of those voxels responding in the Sound condition (voxels shown colored in the respective subpanel of [a]). The time course was first averaged across all voxels; solid and dashed lines indicate the mean and SD across repeats of the stimulus.

(B) Experiment with coronal slices (animal M2). Activation maps, anatomical landmarks, and the time courses follow the same conventions as in (A). Slices are consecutive as displayed, except where a gap between slices indicates that one slice has been omitted from display. In the Tactile condition, activation was observed in one slice in somatosensory cortex (arrow) but not in auditory cortex. Anatomical structures: Ec, External/Extreme capsule; Cis, Circular sulcus; Sts, Superior temporal sulcus; Ls, Lateral sulcus; Ips, Intraparietal sulcus; Stg, Superior temporal gyrus.

Sound stimulation led to activity throughout auditory cortex on the temporal plane. The activated voxels were distributed in both rostro-caudal and medio-lateral directions, as demonstrated in horizontal and coronal sections (see Sound condition in Figures 1Aa and 1B). This broad strip of activity is in agreement with neurophysiological findings in macaques that neurons in both primary and hierarchically higher auditory areas respond to broad-band stimulation.

Tactile stimulation of hand and foot led to activations in the somatosensory cortex (posterior bank of the central sulcus; see Tactile condition in Figures 1Ac and 1B). These limb activations were well localized, in agreement with the somatotopic parcellation of this area, and illustrate the effectiveness of the somatosensory stimulus. Surprisingly, the same stimulus also led to activations in portions of the auditory cortex, as demonstrated in Figure 1Aa. These somatosensory responses, however, were weaker than the sound-induced activity in the same experiment (they covered a volume of 94 μl compared to 172 μl in the Sound condition) and were not always evident; for example, no significant tactile response in auditory cortex was observed in the experiment of Figure 1B (although somatosensory cortex still responded in this experiment).

To probe the interaction of sound and touch, both stimuli were delivered simultaneously. As for sound-alone stimulation, robust responses were observed throughout auditory cortex. Combining these stimuli, however, led to more extensive activations. In the case of Figure 1A, the Tactile&Sound condition activated 351 μl compared to 172 μl for the Sound condition and 114 μl compared to 102 μl in the case of Figure 1B. Thus, adding tactile stimulation to an auditory stimulus led to an enhancement of the responses.

Analysis of ten experiments demonstrated that this observation was reliable (Figure 2). In every experiment, we observed responses to auditory stimulation, weaker or no responses to tactile-alone stimulation, and enhanced responses to the combined multisensory stimulus within auditory cortex. Quantitatively, this was confirmed by analysis of the activated cortical volume as well as of the activation strength (percent signal change). Across experiments, auditory, tactile, and combined stimulation activated 88, 15, and 201 μl (median values) of auditory cortex (Figure 2A). To test whether these numbers differ significantly, we used Friedman's nonparametric ANOVA, which confirmed a significant effect of stimulus (χ2 = 18.2, p < 0.001). Post-hoc sign-rank tests revealed significant differences between the conditions Sound and Tactile and between Tactile and Tactile&Sound (both n = 10, p < 0.01). Further, the combined stimulus activated a larger volume than the auditory stimulus alone in eight of ten experiments, resulting in a significant difference (Sound versus Tactile&Sound, p < 0.05). Because the volume designated active depends on the particular statistical threshold used, we confirmed that these results were independent of the threshold chosen (cf. Experimental Procedures); for a wide range of thresholds, the volume activated in the Tactile&Sound condition was significantly larger than that in the Sound condition. Analysis of the activation strength confirmed these findings: responses in the Tactile&Sound condition were stronger than in the Sound condition in all ten experiments (Figure 2B; median values 0.86%, 0.56%, and 1.27% for Sound, Tactile, and Tactile&Sound, respectively), resulting in a significant effect of stimulus (χ2 = 18.2, p < 0.001), and post-hoc tests revealed significant differences between all conditions (p < 0.01, all comparisons). An additional summary of the activations in individual animals is provided in Table S1 (available with this article online). Based on these findings, we conclude that adding a simultaneous tactile stimulus to an auditory stimulus significantly enhances the activations observed within auditory cortex.
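The group-level pipeline described above (Friedman's nonparametric ANOVA across the three matched conditions, followed by pairwise sign-rank tests) can be sketched with SciPy. The values below are simulated around the reported medians purely to illustrate the procedure; they are not the study's data:

```python
import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon

rng = np.random.default_rng(0)
# Simulated per-experiment activation strengths (% signal change) for
# n = 10 experiments, loosely centered on the reported medians
# (0.86, 0.56, 1.27); illustrative only.
sound = rng.normal(0.86, 0.15, size=10)
tactile = rng.normal(0.56, 0.15, size=10)
both = rng.normal(1.27, 0.15, size=10)

# Omnibus test across the three matched (repeated-measures) conditions.
chi2, p_omnibus = friedmanchisquare(sound, tactile, both)

# Pairwise post-hoc sign-rank tests, as used in the text.
posthoc = {
    "Sound vs Tactile": wilcoxon(sound, tactile).pvalue,
    "Sound vs Tactile&Sound": wilcoxon(sound, both).pvalue,
    "Tactile vs Tactile&Sound": wilcoxon(tactile, both).pvalue,
}
```

The same two-stage structure (omnibus test, then paired post-hoc comparisons) applies to both the volume and the signal-change analyses.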


Figure 2 Activations—Group Data


(A) Activated volume. Bars indicate the volume of auditory cortex activated in the three conditions in each of the ten experiments. Volume is indicated in μl. Boxes on the right indicate the median (horizontal bar inside the box) and lower and upper quartiles (edges of the box). Thin lines indicate the data range, and symbols denote statistically significant comparisons (sign-rank tests): asterisk, p < 0.05; double asterisk, p < 0.01. Experiment three is the same as shown in Figure 1A, and experiment one is the same as shown in Figure 1B.

(B) Activation strength. Same type of display as in (A), but here, the percent signal change is shown averaged across all active voxels.

This enhancement cannot be explained by a simple superposition of responses. In fact, in eight of ten experiments, the activated volume in the Tactile&Sound condition was larger than the sum of the volumes activated in the unimodal conditions. Hence, the extent of activation to the combined stimulus consisted of more than the aggregate of the activations in the unimodal conditions, suggesting that a nonlinear interaction between tactile and sound stimulation occurs within auditory cortical areas.

Supra-Additive Integration of Touch and Sound

In the following, we specifically demonstrate a supra-additive enhancement of auditory responses by simultaneous tactile stimulation within well-localized regions of auditory cortex. Classically, multisensory integration is assumed if the response to the multimodal stimulus differs significantly from the response to the most effective of the individual stimuli. This definition assumes multisensory enhancement if the following holds: Tactile&Sound > max(Tactile, Sound). Although such an implementation is perfectly valid for electrophysiological studies, fMRI suffers from the drawback of promoting statistically false-positive results because of the large number of voxels. Thus, several studies have argued that fMRI studies on multisensory processing need to use a more stringent analysis in which integration is assumed only if the response to the multisensory stimulus is larger than the sum of the responses to the sensory stimuli presented in isolation (Laurienti et al., 2005); mathematically, this requires Tactile&Sound > (Tactile + Sound). We implemented this stringent statistical test taking into account both response strength and its spatial extent (cf. Experimental Procedures).
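One way to implement the stringent criterion voxelwise is as a one-sided GLM contrast testing whether the Tactile&Sound beta exceeds the sum of the unimodal betas. The sketch below illustrates this idea under assumed conditions (the design-matrix column order, the contrast vector, and the plain least-squares fit are illustrative choices, not the study's exact implementation):

```python
import numpy as np
from scipy import stats

def supra_additive_test(y, X):
    """One-sided t-test of Tactile&Sound - (Tactile + Sound) > 0 for one voxel.

    y: BOLD time course of the voxel.
    X: design matrix with columns [Sound, Tactile, Tactile&Sound, constant].
    """
    # Ordinary least-squares fit of the voxel time course.
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    dof = len(y) - np.linalg.matrix_rank(X)
    sigma2 = resid @ resid / dof
    # Contrast: Tactile&Sound minus the sum of the unimodal responses.
    c = np.array([-1.0, -1.0, 1.0, 0.0])
    se = np.sqrt(sigma2 * (c @ np.linalg.pinv(X.T @ X) @ c))
    t_val = (c @ beta) / se
    return t_val, stats.t.sf(t_val, dof)  # one-sided p-value
```

A voxel would then count as integrating when this one-sided p-value survives the chosen significance threshold, with the cluster extent handled on top of the voxelwise statistic.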

Figure 3 displays the result of this test applied to the same experiments as presented in Figure 1. In both experiments, statistically significant multisensory integration was revealed in well-localized clusters, mostly at the caudal end of the auditory cortex. As demonstrated by the time courses, integrating voxels responded most strongly to the Tactile&Sound condition, more weakly to Sound alone, and showed no significant response to the Tactile condition. Across experiments, we consistently found such multisensory integration in each of ten experiments, covering a volume between 7 μl and 76 μl (n = 10, median 48 μl). This integrating volume corresponded to 42% ± 18% (mean ± SD) of the volume activated by Sound alone and constituted 24% ± 6% (mean ± SD) of that responding to the Tactile&Sound stimulus. This leads us to conclude that there exists a region of auditory cortex in which supra-additive integration of sound and touch occurs.


Figure 3 Integration of Auditory and Tactile Responses


Statistical maps display voxels with significant multisensory integration overlaid on anatomical images. Integration was assumed if the response to the combined stimulus was larger than the sum of the responses to the unisensory stimuli: Tactile&Sound > (Tactile + Sound). (A) and (B) refer to the same experiments as displayed in Figures 1A and 1B, and the definition of anatomical landmarks follows the same conventions as there. The time courses display the average activation profiles of all significant voxels in the respective experiment for the three different stimuli (mean and SD across repeats).

To better define the location of this multisensory integration anatomically and functionally, we used an additional auditory stimulus: tones of random frequencies. Tone stimuli generally excite neurons in the primary auditory cortex well but excite those in the belt areas only weakly. Figure 4 displays the loci of multisensory integration in four experiments together with the activation to tone stimuli. In all cases, tone stimulation led to activations roughly in the center of auditory cortex on the superior temporal plane. This is expected, as the primary auditory fields are located in the center when viewed in horizontal slices such as these (Figure 4E). The loci of multisensory integration, in contrast, occurred at more caudal locations and on the lateral side (Figures 4A and 4C) but sometimes also extended to the medial side (Figures 4B and 4D). Together, these results suggest that integration occurs outside the auditory core, hence not in primary auditory cortex, but in areas of the auditory belt and/or parabelt. Prominent candidate belt areas are CM and CL, which have been shown previously to receive input about tactile stimulation (Fu et al., 2003).


Figure 4 Responses to Tone Stimuli and Multisensory Integration


(A–D) Activation maps for tone stimulation (red color code) overlaid on an anatomical image together with voxels exhibiting significant multisensory integration (blue). Each panel belongs to a different experiment; (B) refers to the same experiment as in Figures 1 and 3A (animal M1), and (C) refers to the same experiment as in Figures 1 and 3B (animal M2); (A) and (D) are from animals M3 and M4, respectively. For more convenient display, the activation maps of different slices have been concatenated and are displayed on one anatomical image: in (A), (B), and (D), the three consecutive slices covering the auditory cortex were collapsed to one slice, with the maximal activation value across slices. In (C), the two most caudal and all other slices were collapsed separately.

(E) Schematic of the organization of auditory cortex when viewed on horizontal sections such as those in (A), (B), and (D). Typical anatomical landmarks are sketched (labels as in Figure 1), and the colored areas display the core, belt, and parabelt, together with prominent areas within these regions.

Principles of Temporal Coincidence and Inverse Effectiveness

The classical rules for multisensory integration demand that enhancement occur only for stimuli that are temporally coincident and propose that enhancement is strongest for those stimuli that individually are least effective. The first criterion is intuitive because perception can benefit only from integrating stimuli that actually belong to the same sensory object and, hence, should occur at the same point in time. The second requirement, known as the principle of inverse effectiveness, is motivated by the idea that integration is especially helpful when no sense alone can form a meaningful picture of the sensory scene, as is the case when the isolated stimuli are weak by themselves. We tested whether the auditory-tactile paradigm conforms to these classical requirements.

We performed experiments in three animals with pulsed stimuli (alternating 2 s on with 2 s off), which could be presented either simultaneously or in alternation. In the first case, auditory and tactile stimulation occurred temporally coincident (synchronous); in the second case, it did not (asynchronous). Importantly, on average, the stimulation was identical in both cases. Figure 5 displays the results from such an experiment. In the synchronous condition, we found robust activations to the combined stimulus along the lateral sulcus and spots of nonlinear integration at the caudo-lateral end, in agreement with the above results. In the asynchronous condition, we found similarly robust activations, however spatially more localized (69 μl compared to 105 μl), and smaller clusters of integration (32 μl compared to 51 μl). Similar results were found in two other animals. This suggests that temporally asynchronous stimuli lead to weaker responses and less multisensory integration. Analysis of individual voxels confirmed this (Figure 6, left). From each experiment, we selected those voxels exhibiting significant integration in both the synchronous and the asynchronous condition. For these voxels, we then compared the integration strength, defined as the difference between the activation in the Tactile&Sound condition and the sum of the activations in the unimodal conditions (Tactile&Sound − [Tactile + Sound]). A nonparametric replicated two-way ANOVA revealed a significant effect of condition (df = 1, H = 9.91, p < 0.01) and no effect of experiment (df = 2, H = 1.6, p = 0.19). A post-hoc sign test revealed a significant difference between conditions (n = 257, p < 0.05), with the majority of voxels showing stronger integration in the synchronous condition (148 out of 257). Hence, for voxels consistently demonstrating multisensory integration, this integration was significantly stronger when auditory and tactile stimuli were presented temporally coincident.
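The paired voxelwise comparison reported here, a sign test on integration strengths in the two timing conditions, can be sketched as follows. The voxel values are simulated for illustration only; they are not the actual data behind the 148-of-257 split:

```python
import numpy as np
from scipy.stats import binomtest

rng = np.random.default_rng(1)
# Simulated integration strength per voxel, Tactile&Sound - (Tactile + Sound),
# in percent signal change, for the 257 voxels significant in both conditions.
sync_strength = rng.normal(0.30, 0.15, size=257)
async_strength = rng.normal(0.24, 0.15, size=257)

# Sign test: under the null hypothesis, each voxel is equally likely to favor
# either condition, so the count of "synchronous wins" is Binomial(n, 0.5).
wins = int(np.sum(sync_strength > async_strength))
result = binomtest(wins, n=257, p=0.5)
```

The same construction applies to the louder/softer comparison used for the inverse-effectiveness test below, with the pairing done over the 318 voxels significant in both intensity conditions.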


Figure 5 Principles of Temporal Coincidence and Inverse Effectiveness—Example Data


Activation maps from an experiment testing the effect of temporal coincidence (synchronous versus asynchronous stimulation) and the principle of inverse effectiveness (louder versus softer auditory stimulation). The left panels display activation maps for the Tactile&Sound condition, and the right panels display the statistical maps for multisensory integration. The top middle panel indicates the slice positioning. Conventions and labeling of anatomical structures as in Figure 1.


Figure 6 Principles of Temporal Coincidence and Inverse Effectiveness—Group Data


Left: integration is stronger for synchronous compared to asynchronous stimuli. The scatter plot displays the strength of integration for individual voxels in the two conditions; the effect size was defined as the difference of activations in units of percent signal change (Tactile&Sound − [Tactile + Sound]). The color code separates voxels with stronger integration in the synchronous condition (black) from those with stronger integration in the asynchronous condition (gray). Right: integration is stronger for less effective stimuli. The scatter plot displays the strength of integration for individual voxels in experiments with louder and softer auditory stimulation; the latter providing the less effective stimulus. The color code separates voxels with stronger integration in the softer condition (black) from those with stronger integration in the louder condition (gray).

To test the principle of inverse effectiveness, we used the same tactile stimulus throughout but varied the effectiveness of the auditory stimulus by changing the sound intensity. In one condition (termed louder), the auditory stimulus was 10 dB louder than in the other condition (termed softer). Figure 5 displays data from one of these experiments. As expected for a less effective stimulus, the spatial extent of the activation was smaller than for the more effective stimulus (77 μl compared to 105 μl). The spatial extent of the integration was also decreased (22 μl compared to 51 μl). However, analysis of all three experiments demonstrated that the integration strength was higher in the softer condition (Figure 6, right). The ANOVA revealed a significant effect of condition (df = 1, H = 8.1, p < 0.01) and no effect of experiment (df = 2, H = 0.3, p = 0.71). The sign test revealed a significant difference between conditions (n = 318, p < 0.01), with the majority of voxels showing stronger integration in the softer condition (182 out of 318). Thus, although a less effective stimulus drives a smaller cortical volume to integration, the strength of this integration is greater, obeying the principle of inverse effectiveness. Together, these results demonstrate that the interaction of auditory and tactile stimulation within regions of the auditory cortex conforms to the principles of temporal coincidence and inverse effectiveness and thus meets the classical requirements for multisensory integration.

Auditory-Tactile Integration Is Specific to Auditory Cortex

The above analysis concentrated on activations in auditory cortex. However, the auditory-tactile paradigm led to activations in areas other than auditory cortex as well.

As noted above (Figure 1), we found robust but weak responses to the tactile stimulus in regions of the somatosensory cortex (6.7 μl activated volume and 0.52% signal change, median values across n = 10 experiments). This region also showed similar levels of activation to the combined Tactile&Sound stimulus (8.7 μl and 0.47%). However, these activations were not significantly different from those observed in the Tactile condition, and no significant supra-additive integration was observed in somatosensory cortex. A summary of these activations for individual animals is provided in Table S1. Hence, in the present study, integration of sound and touch stimulation was observed in auditory cortex but not in somatosensory areas.

In the vicinity of auditory cortex, several classical multimodal areas can be found. One prominent example is the claustrum, a subcortical structure with extensive connectivity to all sensory areas. Another example is area TPO in the polysensory superior temporal sulcus (Padberg et al., 2003), the human homolog of which has been analyzed in imaging studies on multisensory integration. However, we did not observe any reliable activation within these areas (cf. Figure S1 and Table S1). In fact, reliable and consistent (across subjects) activations were found only within somatosensory and auditory cortices, and responses within other areas occurred only sporadically in individual scans. This lack of activity in higher areas of the frontal, parietal, or temporal lobes is a result of the anesthesia and, for the present study, is advantageous: although no classical multisensory areas exhibited significant responses, we still found reliable supra-additive integration of sound and touch stimulation within areas of the auditory cortex.

Discussion

Our fMRI-BOLD measurements in anaesthetized macaque monkeys revealed that the processing of sound in the auditory cortex can be influenced by the simultaneous presentation of a tactile stimulus. Although the tactile stimulus itself caused only weak activations in auditory cortex, it enhanced the responses to the auditory stimulus. This enhancement manifested itself as an increased activated volume, enhanced responses, and, in specific regions of the auditory cortex, supra-additive integration. This multisensory integration was most prominent in the caudal belt areas. Further, this integration of sound and touch obeyed the criteria of temporal coincidence and inverse effectiveness and thus has the characteristics classically required for multisensory integration.

Multisensory Integration in Auditory Cortex

The present results are consistent with previous investigations addressing responses to tactile stimulation in primate auditory cortex. For example, human functional-imaging studies supported tactile activation of auditory cortices, as revealed by an overlap of fMRI-BOLD activations to auditory and somatosensory stimulation (Foxe et al., 2002), and recent EEG/MEG studies demonstrated audio-tactile interactions close to auditory areas (Murray et al., 2004). However, the low resolution of these human-imaging studies did not allow an exact localization of the observed multisensory interactions. On the neurophysiological level, Schroeder and colleagues reported somatosensory input to the macaque auditory cortex in a series of studies (Fu et al., 2003). They found responses to cutaneous and proprioceptive stimulation in local field potentials and multi-unit activity, with the strongest effect occurring in layer four, suggesting that the somatosensory input arrives as bottom-up input (Schroeder et al., 2003). Importantly, these somatosensory responses were not found in the primary auditory cortex but occurred only in caudal areas of the belt, most prominently in the caudo-medial area. The present study bridges the gap between these results from human imaging and monkey physiology because we combine the use of high-resolution functional imaging with knowledge of the anatomical and functional properties of this primate's auditory cortex. Our results demonstrate supra-additive integration of touch and sound within areas of the caudal belt and, at the same time, provide an improved localization of the multisensory enhancement observed in human imaging studies.

Complementing these investigations of audio-tactile integration, other studies have revealed audio-visual interactions in areas of the auditory cortex. Imaging and EEG studies with human subjects have emphasized the influence of visual signals on speech processing within auditory cortex, and electrophysiological recordings in macaque monkeys demonstrated audio-visual interactions with both simplistic and ecologically relevant stimuli. Further, the activity of single neurons in the auditory cortex can be modulated by the position of the eyes. Because this property is not found in the input to auditory cortex, it presumably arises from local processing within this area (Fu et al., 2004). Also, auditory cortex and its neighboring insular regions are innervated by fibers from the vestibular nuclei, introducing another nonauditory signal to this region that can be integrated with sound-related activity. Together, these results suggest that the auditory cortex is involved in multisensory processing in a number of ways, providing an interesting playground for future investigations and calling into question what can be regarded as truly auditory cortex.

Integration of Sound and Touch

To form a coherent percept of our environment, our brain needs to combine information from all available senses. Classical experiments, however, are usually conducted with audio-visual stimuli, and the most well-known multisensory illusions, such as the McGurk and ventriloquist effects, arise in these modalities. Yet when we act in the absence of vision, either in the dark or when manually handling an occluded object, we have to rely on the remaining senses. To crack a nut in dim light or to snap our fingers, for example, we (or monkeys) rely on feedback from touch and proprioception combined with auditory signals. Psychophysical studies have demonstrated that humans can identify different materials based on auditory signals alone and judge their roughness by combining auditory and touch information (

,

Lederman et al., 2002

). A compelling interaction between audition and touch was demonstrated by Jousmaki and Hari (

)—the so-called parchment-skin illusion. When rubbing our hands back and forth together, we can judge their smoothness or dryness. If, however, the sound produced by this rubbing is manipulated, we perceive our skin as having a different level of dryness. These results suggest that auditory and tactile information can be combined in much the same way as auditory and visual signals. The present findings suggest that the first step of such integration occurs in the auditory cortex and thus close to the beginning of auditory cortical processing.

In our experiments, we did not observe any integration of sound and touch within the somatosensory cortex. One reason might be that our stimulation protocol activated only a small portion of all somatosensory receptors, i.e., those on one hand and foot. Consequently, the spatial extent of somatosensory responses was limited, and multisensory enhancement occurring in such a localized area may be too weak to reach statistical significance. On the other hand, it might also be the case that the integration of sound and touch occurs only in auditory cortex, perhaps as a means of reducing redundant processing. Further experiments are needed to resolve these possibilities.

Multisensory Integration—Early or Late?

Classically, multisensory integration is supposed to occur relatively late, after the sensory information has been thoroughly processed along its specific sensory hierarchy (

). Several recent results, however, demonstrate that early sensory cortices are involved in the representation of specific multisensory phenomena. One example is provided by the illusory flash: when a single brief visual flash is accompanied by auditory beeps, the single flash is perceived as multiple flashes. Based on measurements of evoked potentials, it was suggested that this illusory percept has a neurophysiological correlate in primary visual cortex (

,

). A similar involvement of the visual cortex in multisensory processes has been demonstrated with fMRI (

). There, visual activations were increased by a simultaneously presented tactile stimulus, presumably owing to feedback projections from parietal association areas. Not all multisensory phenomena, however, can be explained by feedback from higher areas. In particular, several EEG studies observed traces of multisensory processes as early as the onset times of unisensory activations (

,

,

,

Murray et al., 2004

), making it unlikely that feedback signals are the main cause. Further, studies of multimodal attention showed that effects like the ventriloquist illusion occur preattentively and, thus, without input from higher cognitive association areas (

). These and related results suggest that multimodal integration can arise from feed-forward processing in early sensory areas (

). This suggestion is further corroborated by the discovery of direct anatomical projections between (primary) auditory and visual cortices (

,

).

The present study strongly supports the notion of early and feed-forward multisensory integration for two reasons. First, we demonstrate integration of auditory and tactile signals in the caudal belt area, which is only one stage above the primary auditory cortex and, thus, relatively early along the processing hierarchy (

,

,

). Second, we demonstrate this integration in the anaesthetized animal, in which both attentive effects and top-down influences from higher association areas are minimized (

). Indeed, we observed robust activations within the cortices of the stimulated modalities, i.e., auditory and somatosensory cortex, but we did not see reliable activations within any higher cortical areas in the parietal, frontal, or temporal lobes. This strongly suggests that the observed multisensory integration in the auditory cortex is a result of feed-forward processing and not due to feedback signals from higher areas.

How could feed-forward somatosensory input be relayed to auditory areas? A number of areas in and around the lateral sulcus have been shown to feature somatosensory organization and responses. Leinonen and colleagues (

) described somatosensory representations in a parabelt area covering the posterior portion of the superior temporal plane. Krubitzer et al. studied representations in regions SII and PV, which terminate close to, or might even extend into, the superior temporal plane (

Krubitzer et al., 1995

). Similar results were reported for somatosensory fields in areas 7b and insular cortex (

). Some of these somatosensory representations might well overlap with auditory cortical areas, a question that is hard to decide on the basis of the existing studies. If this is the case, however, somatosensory responses in auditory cortex might well be due to feed-forward input from a number of somatosensory areas, from the pulvinar complex, or directly from the thalamus (

,

,

).

Why multisensory integration occurs already during early sensory processing remains a matter of speculation. The caudal areas of the auditory belt are thought to be part of an auditory "where" processing stream, analogous to the dorsal stream in the visual system (

). Thus, one potential role of early auditory-tactile integration could be the localization and "binding" of objects. The binding problem arises because features extracted in different processing streams need to be assigned to a common object in order to form a coherent percept. It can be argued that early integration might facilitate binding by tagging the activity in one stream based on activity in a different stream, whereas late integration might pose difficulties for such a process (

). Early and preattentive interference of sound and touch in determining spatial location is also supported by a recent clinical case of multisensory alloesthesia—a disorder in which sensory stimuli are experienced at a different location than the one at which they were applied (

Ortigue et al., 2005

). Besides spatial cues, temporal information similarly determines whether two sensations can originate from the same object. Our finding that synchronous stimuli elicit stronger multisensory integration thus supports the hypothesis that early integration serves to bind multisensory objects into a consistent interpretation of our environment.

Experimental Procedures

This study presents data from fMRI experiments with five macaque monkeys (Macaca mulatta) weighing 5 to 8 kg. All procedures were approved by the local authorities (Regierungspräsidium) and were in full compliance with the guidelines of the European Community (EUVD 86/609/EEC) for the care and use of laboratory animals. An extensive description of the procedures can be found elsewhere (

).

Animal Preparation

The handling and anesthesia protocol used ensures stress-free treatment of the animal while, at the same time, preserving neural responses to sensory stimulation. After premedication with glycopyrrolate (i.m. 0.01 mg per kg) and ketamine (i.m. 15 mg per kg), an IV catheter was inserted into the saphenous vein. Animals were preoxygenated, and anesthesia was induced with fentanyl (3 μg per kg), thiopental (5 mg per kg), and succinylcholine chloride (3 mg per kg). The trachea was intubated and the lungs were ventilated. Anesthesia was maintained with remifentanil (0.5–2 μg per kg per min). Muscle relaxation was induced with mivacurium chloride (5 mg per kg per hr). Lactated Ringer's solution was given intravenously at a maximum rate of 10 ml per kg per hr. Physiological parameters (heart rate, blood pressure, body temperature, blood oxygenation, and expiratory CO2) were monitored and kept in the desired range. Headphones for sound presentation were secured over the ears and covered with foam (Tempur-Pedic, Kentucky) to attenuate outside sounds. The animal's hand, and sometimes the foot, was brought into contact with the tactile stimulator and secured as well.

Stimulus Presentation

Sound and touch stimuli were presented with custom-written software and controlled with a QNX real-time operating system (QNX Software Systems, Canada) to ensure correct timing. Sound stimuli were stored as WAV files, played from a PC, amplified with a Yamaha amplifier (AX-496), and delivered with MR-compatible headphones (MR Confon, Magdeburg, Germany) at an intensity of 100 dB SPL (90 dB SPL during the experiments testing the principle of inverse effectiveness). The sound presentation was calibrated with an MR-compatible condenser microphone (Brüel & Kjær 4188 and a 2238 Mediator sound-level meter) to ensure a linear transfer function. The headphone cups, together with the surrounding foam, were measured to attenuate the scanner noise (105 dB SPL) by approximately 30 dB. As a result, the sound stimuli were presented about 25 dB above the background noise level. Sound stimuli consisted of broad-band noise (250–22,100 Hz) either played continuously or played as 2 s pulses alternating with 2 s of silence (during the experiments testing the principle of temporal coincidence). In addition, tone stimuli were used to localize primary auditory areas. These stimuli consisted of a series of pseudorandom tones between 250 Hz and 22,100 Hz, with each tone lasting 50 ms and tones presented at a rate of 8 Hz. Touch stimuli were delivered with a custom-built stimulator consisting of a rotating brush that could be placed on the animal's palm or foot. The brush delivered tactile stimulation at 1.5–2 Hz on a grid of four by four locations, resulting in a perception of tickling or slight scratching when tested on humans. Tactile stimuli were either delivered continuously or presented as 2 s pulses alternating with 2 s of silence (during the experiments testing the principle of temporal coincidence). The engine driving the brush was located at the bottom of the animal chair, outside of the magnet.
The noise from this engine, measured close to the head of the animal, was only 4 dB above the ambient noise inside the (not operating) scanner (52 dB SPL). Given the headphones and foam covering the animal's ears, which reduce this noise by about 25 dB, and given the scanner noise and the auditory stimulus, it is extremely unlikely that any sound produced by the tactile stimulator caused activations in the auditory cortex. Sound + touch stimuli consisted of the simultaneous presentation of sound and tactile stimuli. During a scan block, these three types of stimuli were presented for 36 s each (40 s in some of the earlier experiments), interleaved and separated by baseline periods of no stimulation (of the same duration as the stimulus). Each stimulus was presented three times in each block, and for each experiment, several such blocks were acquired—typically at least 27 repeats of each stimulus condition. The tone stimuli were presented in an additional paradigm consisting of four repeats of this stimulus; this paradigm was repeated at least five times.
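The sound-level bookkeeping in this section can be sketched as simple decibel arithmetic (all values are taken from the text; treating attenuation as plain subtraction of dB values is an approximation):

```python
# Levels from the text, in dB SPL unless noted otherwise.
scanner_noise = 105       # EPI acquisition noise
cup_attenuation = 30      # headphone cups plus foam
stimulus_level = 100      # broad-band noise stimulus

background_at_ear = scanner_noise - cup_attenuation        # 75 dB SPL at the ear
stimulus_above_background = stimulus_level - background_at_ear

# Tactile-stimulator engine: 4 dB above the 52 dB SPL ambient level,
# attenuated by roughly 25 dB at the ear by headphones and foam.
engine_at_ear = (52 + 4) - 25                              # ~31 dB SPL

print(stimulus_above_background)             # 25 -> stimuli ~25 dB above background
print(background_at_ear - engine_at_ear)     # engine noise far below the scanner noise
```

This makes the argument explicit: the engine noise at the ear sits tens of dB below the already-present scanner background, so it cannot plausibly drive auditory activations.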

MRI Data Collection

Measurements were made on a vertical 4.7 T scanner equipped with a 40 cm diameter bore (Biospec 47/40v, Bruker Medical, Inc., Ettlingen, Germany) and a 50 mT/m actively shielded gradient coil (Bruker, B-GA 26) of 26 cm inner diameter. A primate chair and a special transport system were used for positioning the animal within the magnet. During the experiment, the animal's head was positioned with a custom-made plastic head holder (Tecapeek, Ensinger GmbH, Germany) previously implanted on the cranium of each animal. Signals were acquired with a 70 or 85 mm diameter surface coil placed over the auditory cortex of one hemisphere. Slices were oriented parallel to the lateral sulcus (see Figure 1), but coronal slices were used as well in one experiment. Functional data were acquired with a multishot (eight segments) gradient-recalled echo planar imaging sequence (GE-EPI) with typical parameters (TE: 16 ms; TR: 750 ms; flip angle: 40°; spectral width: 100 kHz; a grid of 128 × 128 voxels; 2 mm slice thickness; 14–17 slices). The field of view was adjusted for each animal and was between 7.2 × 7.2 cm and 9.6 × 9.6 cm, resulting in voxel sizes of 0.5–1.2 μl. Activation volumes are expressed in units of μl throughout this paper to allow a better comparison between experiments. Anatomical images (T1-weighted) were acquired with an eight-segment 3D-MDEFT (three-dimensional modified driven equilibrium with Fourier transform) pulse sequence with the following parameters: TE, 4 ms; TR, 22 ms; flip angle, 20°; spectral width, 75 kHz; 384 × 384 voxels; five averages. These anatomical images were acquired on the same field of view as the functional data but covered a larger extent in the z direction. Hence, despite different absolute resolutions, functional and anatomical images were acquired in register, alleviating the problem of post-hoc alignment. Navigator scans were used to correct resonance frequency fluctuations.
For each scan, an autoshim algorithm was used for optimizing the linear and higher-order shim coils in a selected volume based on the anatomical scan. Lastly, each scan started with 6 s of RF pulsing without acquisition to avoid effects of transient magnetization.
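The voxel volumes quoted above follow directly from the acquisition geometry; a small sketch of the arithmetic (values from the text; the helper name is ours, and the two computed extremes fall within the 0.5–1.2 μl range quoted):

```python
def voxel_volume_ul(fov_cm, matrix=128, slice_mm=2.0):
    """Voxel volume in microliters (1 mm^3 == 1 ul) for a square in-plane grid."""
    in_plane_mm = fov_cm * 10.0 / matrix   # in-plane voxel edge length in mm
    return in_plane_mm ** 2 * slice_mm

# Field-of-view extremes reported in the text:
print(voxel_volume_ul(7.2))   # ~0.63 ul
print(voxel_volume_ul(9.6))   # 1.125 ul
```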

MRI Data Analysis

The data were analyzed offline with custom-written programs in Matlab (Mathworks, Inc.). Multislice data (volumes) were converted into time points, linear drifts were removed, and the data were normalized to units of standard deviations relative to baseline. In a second analysis, the data were normalized to percent signal change relative to baseline. The data were averaged across individual scan blocks to quantify responses. Functional maps for individual stimuli were computed by crosscorrelation with a boxcar-shaped, zero-phase-shift waveform. The cycle of each block began and ended with the stimulus-off condition. Single-stimulus activation maps were thresholded at a p value of 0.05 (uncorrected), and spurious activations were removed by spatial clustering (15 voxels in a 5 × 5 × 5 neighborhood). These activation maps served for a gross comparison between conditions (Figure 1) and for selecting candidate voxels for potential multisensory integration. For voxels active in at least one of the three conditions (Sound, Tactile, or Tactile&Sound), the following test for multisensory integration was carried out. Multisensory integration was assumed if the response to the combined stimulus was stronger than the sum of the responses to the two unimodal stimuli: Tactile&Sound > (Sound + Tactile) (

,

). To test the significance of this difference, we used a permutation test taking into account both voxel value and spatial cluster extent (

,

,

). The advantage of such a test is that it naturally combines information about neighboring voxels before computation of the statistic of interest, thus incorporating cluster size into the significance test and reducing the need for post-hoc clustering. For each active voxel, we computed an integration index as the response to the combined stimulus minus the sum of the responses to the unimodal stimuli, summed over a spatial neighborhood of 3 × 3 × 3 voxels:

I(k) = Σ_{i ∈ N(k)} [ Tactile&Sound(i) − (Tactile(i) + Sound(i)) ]

where N(k) denotes the 3 × 3 × 3 neighborhood of voxel k.

The index for the true activations was compared to a distribution of indices obtained from 1000 randomizations of all voxels within the brain. For each randomization and voxel, the time course of the average response was shuffled in time, and the correlations and the integration index I were recomputed. For each voxel, the significance level of supra-additive integration was obtained by counting the number of randomized samples with an integration index larger than that of this voxel. Only voxels with a p value smaller than 0.01 were considered to exhibit multisensory integration. For statistical comparisons of activation volumes or strength, we used nonparametric methods, such as Friedman's method for randomized blocks and the Scheirer-Ray-Hare extension of the Kruskal-Wallis ANOVA (a replicated two-way ANOVA), as well as post-hoc sign-rank tests.
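The integration test described above can be sketched in pure NumPy. This is a simplified illustration, not the actual analysis: it permutes response amplitudes across voxels rather than shuffling each voxel's time course, and all function names are ours:

```python
import numpy as np

def neighborhood_sum(x):
    """Sum of x over a 3 x 3 x 3 neighborhood around every voxel
    (zero-padded at the volume borders)."""
    p = np.pad(x, 1)
    n0, n1, n2 = x.shape
    out = np.zeros(x.shape, dtype=float)
    for d0 in (0, 1, 2):
        for d1 in (0, 1, 2):
            for d2 in (0, 1, 2):
                out += p[d0:d0 + n0, d1:d1 + n1, d2:d2 + n2]
    return out

def integration_index(bimodal, tactile, sound):
    """I(k): supra-additive response, pooled over the local neighborhood."""
    return neighborhood_sum(bimodal - (tactile + sound))

def permutation_pvalues(bimodal, tactile, sound, n_perm=1000, seed=0):
    """Per-voxel p-value: fraction of permuted indices >= the observed one."""
    rng = np.random.default_rng(seed)
    observed = integration_index(bimodal, tactile, sound)
    exceed = np.zeros_like(observed)
    for _ in range(n_perm):
        shuffled = rng.permutation(bimodal.ravel()).reshape(bimodal.shape)
        exceed += integration_index(shuffled, tactile, sound) >= observed
    return exceed / n_perm
```

Voxels with p < 0.01 would then be considered to show supra-additive integration, mirroring the criterion in the text; pooling over the neighborhood before thresholding is what folds cluster extent into the statistic.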

In Figure 2, we compare the activated volumes for the different conditions across experiments. The size of the cortical volume determined as "active" depends on the particular threshold used (p < 0.05). To verify that the comparison across stimuli is not affected by this particular threshold, we tested a range of hypothetical thresholds (0.001, 0.01, 0.05, 0.1, 0.2). For each threshold chosen, we found the same qualitative behavior as displayed in Figure 2. In particular, the reported differences between conditions were significant for each of these thresholds.
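The robustness check over thresholds can be sketched as a simple loop (synthetic p-value maps; the list of thresholds is the one given in the text, everything else is illustrative):

```python
import numpy as np

def active_volume_ul(p_map, threshold, voxel_volume_ul=1.0):
    """Cortical volume (in ul) passing a given significance threshold."""
    return np.count_nonzero(p_map < threshold) * voxel_volume_ul

rng = np.random.default_rng(1)
# Synthetic per-voxel p-values for two conditions; the "bimodal" map is
# biased toward smaller p-values, so it should activate more volume.
p_unimodal = rng.uniform(0.0, 1.0, size=10_000)
p_bimodal = 0.5 * p_unimodal

for thr in (0.001, 0.01, 0.05, 0.1, 0.2):
    # The ordering between conditions should hold at every threshold.
    assert active_volume_ul(p_bimodal, thr) >= active_volume_ul(p_unimodal, thr)
```

The point of the check is exactly this invariance: the ranking of conditions by activated volume should not flip as the threshold is varied.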

Acknowledgments

This work was supported by the Max Planck Society, the Deutsche Forschungsgemeinschaft (to C.K., KA 2661/1-1), and the Alexander von Humboldt Foundation (to C.P.).

Supplemental Data

References

    • Akbarian S.
    • Grusser O.J.
    • Guldin W.O.

    Corticofugal connections between the cerebral cortex and brainstem vestibular nuclei in the macaque monkey.

    J. Comp. Neurol. 1994; 339 : 421-437
    • Beauchamp M.S.

    See me, hear me, touch me: multisensory integration in lateral occipital-temporal cortex.

    Curr. Opin. Neurobiol. 2005; 15 : 145-153
    • Beauchamp M.S.

    Statistical criteria in FMRI studies of multisensory integration.

    Neuroinformatics. 2005; 3 : 93-114
    • Benevento L.A.
    • Fallon J.
    • Davis B.J.
    • Rezak M.

    Auditory-visual interaction in single cells in the cortex of the superior temporal sulcus and the orbital frontal cortex of the macaque monkey.

    Exp. Neurol. 1977; 57 : 849-872
    • Bhattacharya J.
    • Shams L.
    • Shimojo S.

    Sound-induced illusory flash perception: role of gamma band responses.

    Neuroreport. 2002; 13 : 1727-1730
    • Bruce C.
    • Desimone R.
    • Gross C.G.

    Visual properties of neurons in a polysensory area in superior temporal sulcus of the macaque.

    J. Neurophysiol. 1981; 46 : 369-384
    • Bullmore E.T.
    • Suckling J.
    • Overmeyer S.
    • Rabe-Hesketh S.
    • Taylor E.
    • Brammer M.J.

    Global, voxel, and cluster tests, by theory and permutation, for a difference between two groups of structural MR images of the brain.

    IEEE Trans. Med. Imaging. 1999; 18 : 32-42
    • Callan D.E.
    • Jones J.A.
    • Munhall K.
    • Kroos C.
    • Callan A.M.
    • Vatikiotis-Bateson E.

    Multisensory integration sites identified by perception of spatial wavelet filtered visual speech gesture information.

    J. Cogn. Neurosci. 2004; 16 : 805-816
    • Calvert G.
    • Spence C.
    • Stein B.E.

    The handbook of multisensory processes.

    MIT Press, Cambridge, MA 2004
    • Calvert G.A.

    Crossmodal processing in the human brain: insights from functional neuroimaging studies.

    Cereb. Cortex. 2001; 11 : 1110-1123
    • Calvert G.A.
    • Bullmore E.T.
    • Brammer M.J.
    • Campbell R.
    • Williams S.C.
    • McGuire P.K.
    • Woodruff P.W.
    • Iversen S.D.
    • David A.S.

    Activation of auditory cortex during silent lipreading.

    Science. 1997; 276 : 593-596
    • Calvert G.A.
    • Hansen P.C.
    • Iversen S.D.
    • Brammer M.J.

    Detection of audio-visual integration sites in humans by application of electrophysiological criteria to the BOLD effect.

    Neuroimage. 2001; 14 : 427-438
    • Driver J.
    • Spence C.

    Crossmodal attention.

    Curr. Opin. Neurobiol. 1998; 8 : 245-253
    • Duhamel J.R.
    • Colby C.L.
    • Goldberg M.E.

    Ventral intraparietal area of the macaque: congruent visual and somatic response properties.

    J. Neurophysiol. 1998; 79 : 126-136
    • Falchier A.
    • Clavagnier S.
    • Barone P.
    • Kennedy H.

    Anatomical evidence of multimodal integration in primate striate cortex.

    J. Neurosci. 2002; 22 : 5749-5759
    • Felleman D.J.
    • Van Essen D.C.

    Distributed hierarchical processing in the primate cerebral cortex.

    Cereb. Cortex. 1991; 1 : 1-47
    • Foxe J.J.
    • Schroeder C.E.

    The case for feedforward multisensory convergence during early cortical processing.

    Neuroreport. 2005; 16 : 419-423
    • Foxe J.J.
    • Morocz I.A.
    • Murray M.M.
    • Higgins B.A.
    • Javitt D.C.
    • Schroeder C.E.

    Multisensory auditory-somatosensory interactions in early cortical processing revealed by high-density electrical mapping.

    Brain Res. Cogn. Brain Res. 2000; 10 : 77-83
    • Foxe J.J.
    • Wylie G.R.
    • Martinez A.
    • Schroeder C.E.
    • Javitt D.C.
    • Guilfoyle D.
    • Ritter W.
    • Murray M.M.

    Auditory-somatosensory multisensory processing in auditory association cortex: an fMRI study.

    J. Neurophysiol. 2002; 88 : 540-543
    • Fu K.M.
    • Johnston T.A.
    • Shah A.S.
    • Arnold L.
    • Smiley J.
    • Hackett T.A.
    • Garraghty P.E.
    • Schroeder C.E.

    Auditory cortical neurons respond to somatosensory stimulation.

    J. Neurosci. 2003; 23 : 7510-7515
    • Fu K.M.
    • Shah A.S.
    • O'Connell M.N.
    • McGinnis T.
    • Eckholdt H.
    • Lakatos P.
    • Smiley J.
    • Schroeder C.E.

    Timing and laminar profile of eye-position effects on auditory responses in primate auditory cortex.

    J. Neurophysiol. 2004; 92 : 3522-3531
    • Ghazanfar A.A.
    • Maier J.X.
    • Hoffman K.L.
    • Logothetis N.K.

    Multisensory integration of dynamic faces and voices in rhesus monkey auditory cortex.

    J. Neurosci. 2005; 25 : 5004-5012
    • Giard M.H.
    • Peronnet F.

    Auditory-visual integration during multimodal object recognition in humans: a behavioral and electrophysiological study.

    J. Cogn. Neurosci. 1999; 11 : 473-490
    • Gobbele R.
    • Schurmann M.
    • Forss N.
    • Juottonen K.
    • Buchner H.
    • Hari R.

    Activation of the human posterior parietal and temporoparietal cortices during audiotactile interaction.

    Neuroimage. 2003; 20 : 503-511
    • Gordon B.

    Receptive fields in deep layers of cat superior colliculus.

    J. Neurophysiol. 1973; 36 : 157-178
    • Guldin W.O.
    • Grusser O.J.

    Is there a vestibular cortex?.

    Trends Neurosci. 1998; 21 : 254-259
    • Hackett T.A.
    • Stepniewska I.
    • Kaas J.H.

    Subdivisions of auditory cortex and ipsilateral cortical connections of the parabelt auditory cortex in macaque monkeys.

    J. Comp. Neurol. 1998; 394 : 475-495
    • Hackett T.A.
    • Stepniewska I.
    • Kaas J.H.

    Thalamocortical connections of the parabelt auditory cortex in macaque monkeys.

    J. Comp. Neurol. 1998; 400 : 271-286
    • Hackett T.A.
    • Preuss T.M.
    • Kaas J.H.

    Architectonic identification of the core region in auditory cortex of macaques, chimpanzees, and humans.

    J. Comp. Neurol. 2001; 441 : 197-222
    • Hayasaka S.
    • Nichols T.E.

    Combining voxel intensity and cluster extent with permutation test framework.

    Neuroimage. 2004; 23 : 54-63
    • Heinke W.
    • Schwarzbauer C.

    Subanesthetic isoflurane affects task-induced brain activation in a highly specific manner: a functional magnetic resonance imaging study.

    Anesthesiology. 2001; 94 : 973-981
    • Herrero M.T.
    • Barcia C.
    • Navarro J.M.

    Functional anatomy of thalamus and basal ganglia.

    Childs Nerv. Syst. 2002; 18 : 386-404
    • Hershenson M.

    Reaction time as a measure of intersensory facilitation.

    J. Exp. Psychol. 1962; 63 : 289-293
    • Hyvarinen J.
    • Shelepin Y.

    Distribution of visual and somatic functions in the parietal associative area 7 of the monkey.

    Brain Res. 1979; 169 : 561-564
    • Jousmaki V.
    • Hari R.

    Parchment-skin illusion: sound-biased touch.

    Curr. Biol. 1998; 8 : R190
    • Kaas J.H.
    • Hackett T.A.

    Subdivisions of auditory cortex and processing streams in primates.

    Proc. Natl. Acad. Sci. USA. 2000; 97 : 11793-11799
    • Kaas J.H.
    • Hackett T.A.
    • Tramo M.J.

    Auditory processing in primate cerebral cortex.

    Curr. Opin. Neurobiol. 1999; 9 : 164-170
    • Kosaki H.
    • Hashikawa T.
    • He J.
    • Jones E.G.

    Tonotopic organization of auditory cortical fields delineated by parvalbumin immunoreactivity in macaque monkeys.

    J. Comp. Neurol. 1997; 386 : 304-316
    • Krubitzer L.
    • Clarey J.
    • Tweedale R.
    • Elston G.
    • Calford M.

    A redefinition of somatosensory areas in the lateral sulcus of macaque monkeys.

    J. Neurosci. 1995; 15 : 3821-3839
    • Laurienti P.J.
    • Perrault T.J.
    • Stanford T.R.
    • Wallace M.T.
    • Stein B.E.

    On the use of superadditivity as a metric for characterizing multisensory integration in functional neuroimaging studies.

Exp. Brain Res. 2005;
    • Lederman S.J.

    Auditory texture perception.

    Perception. 1979; 8 : 93-103
    • Lederman S.J.
    • Klatzky R.L.
    • Hamilton C.
    • Morgan T.

    Integrating multimodal information about surface texture via a probe: relative contributions of haptic and touch-produced sound sources.

    Proceedings of the 10th Annual Symposium on Haptic Interfaces for Teleoperators and Virtual Environment Systems. 2002; : 97-104
    • Leinonen L.

    Functional properties of neurones in the parietal retroinsular cortex in awake monkey.

    Acta Physiol. Scand. 1980; 108 : 381-384
    • Levanen S.
    • Jousmaki V.
    • Hari R.

    Vibration-induced auditory-cortex activation in a congenitally deaf adult.

    Curr. Biol. 1998; 8 : 869-872
    • Logothetis N.K.
    • Guggenberger H.
    • Peled S.
    • Pauls J.

    Functional imaging of the monkey brain.

    Nat. Neurosci. 1999; 2 : 555-562
    • Lutkenhoner B.
    • Lammertmann C.
    • Simoes C.
    • Hari R.

    Magnetoencephalographic correlates of audiotactile interaction.

    Neuroimage. 2002; 15 : 509-522
    • Macaluso E.
    • Driver J.

    Multisensory spatial interactions: a window onto functional integration in the human brain.

    Trends Neurosci. 2005; 28 : 264-271
    • Macaluso E.
    • Frith C.D.
    • Driver J.

    Modulation of human visual cortex by crossmodal spatial attention.

    Science. 2000; 289 : 1206-1208
    • Meredith M.A.
    • Stein B.E.

    Spatial determinants of multisensory integration in cat superior colliculus neurons.

    J. Neurophysiol. 1996; 75 : 1843-1857
    • Molholm S.
    • Ritter W.
    • Murray M.M.
    • Javitt D.C.
    • Schroeder C.E.
    • Foxe J.J.

    Multisensory auditory-visual interactions during early sensory processing in humans: a high-density electrical mapping study.

    Brain Res. Cogn. Brain Res. 2002; 14 : 115-128
    • Murray M.M.
    • Molholm S.
    • Michel C.M.
    • Heslenfeld D.J.
    • Ritter W.
    • Javitt D.C.
    • Schroeder C.E.
    • Foxe J.J.

    Grabbing your ear: rapid auditory-somatosensory multisensory interactions in low-level sensory cortices are not constrained by stimulus alignment.

    Cereb. Cortex. 2004; 15 : 963-974
    • Nichols T.E.
    • Holmes A.P.

    Nonparametric permutation tests for functional neuroimaging: a primer with examples.

    Hum. Brain Mapp. 2002; 15 : 1-25
    • Olson C.R.
    • Graybiel A.M.

    Sensory maps in the claustrum of the cat.

    Nature. 1980; 288 : 479-481
    • Ortigue S.
    • Jabaudon D.
    • Landis T.
    • Michel C.M.
    • Maravita A.
    • Blanke O.

    Preattentive interference between touch and audition: a case study on multisensory alloesthesia.

    Neuroreport. 2005; 16 : 865-868
    • Padberg J.
    • Seltzer B.
    • Cusick C.G.

    Architectonics and cortical connections of the upper bank of the superior temporal sulcus in the rhesus monkey: an analysis in the tangential plane.

    J. Comp. Neurol. 2003; 467 : 418-434
    • Pekkola J.
    • Ojanen V.
    • Autti T.
    • Jaaskelainen I.P.
    • Mottonen R.
    • Tarkiainen A.
    • Sams M.

    Primary auditory cortex activation by visual speech: an fMRI study at 3 T.

    Neuroreport. 2005; 16 : 125-128
    • Rauschecker J.P.
    • Tian B.

    Mechanisms and streams for processing of "what" and "where" in auditory cortex.

    Proc. Natl. Acad. Sci. USA. 2000; 97 : 11800-11806
    • Rauschecker J.P.
    • Tian B.

    Processing of band-passed noise in the lateral auditory belt cortex of the rhesus monkey.

    J. Neurophysiol. 2004; 91 : 2578-2589
    • Rauschecker J.P.
    • Tian B.
    • Hauser M.

    Processing of complex sounds in the macaque nonprimary auditory cortex.

    Science. 1995; 268 : 111-114
    • Rauschecker J.P.
    • Tian B.
    • Pons T.
    • Mishkin M.

    Serial and parallel processing in rhesus monkey auditory cortex.

    J. Comp. Neurol. 1997; 382 : 89-103
    • Recanzone G.H.
    • Guard D.C.
    • Phan M.L.

    Frequency and intensity response properties of single neurons in the auditory cortex of the behaving macaque monkey.

    J. Neurophysiol. 2000; 83 : 2315-2331
    • Rizzolatti G.
    • Scandolara C.
    • Gentilucci M.
    • Camarda R.

    Response properties and behavioral modulation of "mouth" neurons of the postarcuate cortex (area 6) in macaque monkeys.

    Brain Res. 1981; 225 : 421-424
    • Robinson C.J.
    • Burton H.

    Organization of somatosensory receptive fields in cortical areas 7b, retroinsula, postauditory and granular insula of M. fascicularis.

    J. Comp. Neurol. 1980; 192 : 69-92
    • Rockland K.S.
    • Ojima H.

    Multisensory convergence in calcarine visual areas in macaque monkey.

    Int. J. Psychophysiol. 2003; 50 : 19-26
    • Schroeder C.E.
    • Foxe J.

    Multisensory contributions to low-level, 'unisensory' processing.

    Curr. Opin. Neurobiol. 2005; 15 : 454-458
    • Schroeder C.E.
    • Foxe J.J.

    The timing and laminar profile of converging inputs to multisensory areas of the macaque neocortex.

    Brain Res. Cogn. Brain Res. 2002; 14 : 187-198
    • Schroeder C.E.
    • Lindsley R.W.
    • Specht C.
    • Marcovici A.
    • Smiley J.F.
    • Javitt D.C.

    Somatosensory input to auditory association cortex in the macaque monkey.

    J. Neurophysiol. 2001; 85 : 1322-1327
    • Schroeder C.E.
    • Smiley J.
    • Fu K.G.
    • McGinnis T.
    • O'Connell M.N.
    • Hackett T.A.

    Anatomical mechanisms and functional implications of multisensory convergence in early cortical processing.

    Int. J. Psychophysiol. 2003; 50 : 5-17
    • Shams L.
    • Kamitani Y.
    • Thompson S.
    • Shimojo S.

    Sound alters visual evoked potentials in humans.

    Neuroreport. 2001; 12 : 3849-3852
    • Sherk H.

    The claustrum and the cerebral cortex.

    in: Jones E.G. Peters A. Cerebral Cortex, Volume 5. Plenum, New York 1986
    • Spence C.
    • Driver J.

    Attracting attention to the illusory location of a sound: reflexive crossmodal orienting and ventriloquism.

    Neuroreport. 2000; 11 : 2057-2061
    • Stein B.E.
    • Meredith M.A.

Merging of the Senses.

    MIT Press, Cambridge, MA 1993
    • Stein B.E.
    • Meredith M.A.
    • Wallace M.T.

    The visually responsive neuron and beyond: multisensory integration in cat and monkey.

    Prog. Brain Res. 1993; 95 : 79-90
    • van Atteveldt N.
    • Formisano E.
    • Goebel R.
    • Blomert L.

    Integration of letters and speech sounds in the human brain.

    Neuron. 2004; 43 : 271-282
    • van Wassenhove V.
    • Grant K.W.
    • Poeppel D.

    Visual speech speeds up the neural processing of auditory speech.

    Proc. Natl. Acad. Sci. USA. 2005; 102 : 1181-1186
    • Werner-Reiss U.
    • Kelly K.A.
    • Trause A.S.
    • Underhill A.M.
    • Groh J.M.

    Eye position affects activity in primary auditory cortex of primates.

    Curr. Biol. 2003; 13 : 554-562

Source: https://www.cell.com/fulltext/S0896-6273%2805%2900785-3
