<P> Studies have shown that a dynamic neural mechanism exists for matching the auditory and visual inputs arising from an event that stimulates multiple senses. One observed example is how the brain compensates for target distance: when you are speaking with someone or watching something happen, the auditory and visual signals are not processed concurrently, yet they are perceived as simultaneous. This kind of multisensory integration can also lead to slight misperceptions in the visual-auditory system, in the form of the ventriloquist effect. For example, a person on television appears to have his voice coming from his mouth rather than from the television's speakers. This occurs because of a pre-existing spatial representation within the brain, which expects voices to come from another person's mouth. As a result, the perceived location of the sound is shifted toward the visual stimulus, and the auditory input is spatially misattributed and misaligned. </P> <P> Hand-eye coordination is one example of sensory integration. In this case, we require a tight integration of what we visually perceive about an object and what we tactilely perceive about that same object. If these two senses were not combined within the brain, one would be less able to manipulate the object. Hand-eye coordination places tactile sensation in the context of the visual system. The visual system is relatively static, in that it does not move around much, whereas the hands and other body parts used in tactile sensing can move freely. This movement of the hands must be included in the mapping of both the tactile and visual sensations; otherwise one would not be able to tell where one's hands were moving, or what one was touching and looking at. An example of this can be seen in infants. An infant picks up objects and puts them in his mouth, or touches them to his feet or face.
All of these actions culminate in the formation of spatial maps in the brain and the realization that "that thing that's moving this object is actually a part of me." Seeing the same object that they are feeling is a major step in the mapping that infants require in order to realize that they can move their arms and interact with objects. This is the earliest and most explicit way of experiencing sensory integration. </P> <P> In the future, research on sensory integration will be used to better understand how different sensory modalities are incorporated within the brain to help us perform even the simplest of tasks. For example, we do not yet understand how neural circuits transform sensory cues into changes in motor activity. Further research on the sensorimotor system can help explain how these movements are controlled. This understanding can potentially be used to build better prosthetics, and eventually to help patients who have lost the use of a limb. Likewise, learning more about how different sensory inputs combine could have profound effects on new engineering approaches in robotics. A robot's sensory devices may take in inputs of different modalities, but if we understood multisensory integration better, we might be able to program robots to combine these data into a useful output that better serves our purposes. </P>
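The ventriloquist effect described above, and sensor fusion in robotics, are often modeled as reliability-weighted cue combination: the perceived location is a weighted average of the visual and auditory estimates, with each cue weighted by its inverse variance. The sketch below illustrates that model; the Gaussian assumptions and all numerical values are illustrative, not taken from the text.

```python
# Reliability-weighted (maximum-likelihood) cue combination:
# each sensory estimate is modeled as a Gaussian with its own variance,
# and the fused estimate weights each cue by its inverse variance.

def combine_cues(mu_v, var_v, mu_a, var_a):
    """Fuse a visual and an auditory location estimate (degrees of azimuth)."""
    w_v = (1 / var_v) / (1 / var_v + 1 / var_a)  # visual weight
    w_a = 1 - w_v                                # auditory weight
    mu = w_v * mu_v + w_a * mu_a                 # fused location estimate
    var = 1 / (1 / var_v + 1 / var_a)            # fused variance (always reduced)
    return mu, var

# Vision localizes the mouth precisely (low variance); audition is coarse.
mouth_deg, speaker_deg = 0.0, 20.0
loc, var = combine_cues(mouth_deg, 1.0, speaker_deg, 25.0)
# The fused percept is pulled strongly toward the visual location
# (loc is about 0.77 degrees, far from the 20-degree loudspeaker).
```

Because vision is usually the more reliable spatial cue, it dominates the weighted average, which is why the voice seems to come from the mouth on screen rather than from the loudspeaker.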
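The hand-eye mapping discussed above can be thought of as a coordinate-frame problem: a hand position sensed in a body-centered frame must be re-expressed in an eye-centered frame, and that mapping changes whenever gaze shifts. A minimal 2-D sketch of such a frame transform follows; the function name, the specific frames, and all values are hypothetical simplifications for illustration.

```python
import math

# Illustrative 2-D remapping between reference frames: translate to the
# eye's position, then rotate by the gaze direction.

def body_to_eye_frame(hand_xy, eye_xy, gaze_angle_rad):
    """Express a body-frame hand position in an eye-centered frame."""
    dx = hand_xy[0] - eye_xy[0]
    dy = hand_xy[1] - eye_xy[1]
    c, s = math.cos(-gaze_angle_rad), math.sin(-gaze_angle_rad)
    return (c * dx - s * dy, s * dx + c * dy)

# The same physical hand position yields different eye-frame coordinates
# for different gaze directions, so the visual-tactile map must be
# updated whenever gaze shifts.
hand = (0.3, 0.4)   # metres, body frame (hypothetical values)
eye = (0.0, 0.5)
print(body_to_eye_frame(hand, eye, 0.0))          # gaze straight ahead -> (0.3, -0.1)
print(body_to_eye_frame(hand, eye, math.pi / 6))  # gaze rotated 30 degrees
```

The point of the sketch is only that the transform depends on gaze: holding the hand still while the eyes move changes where the hand falls in eye-centered coordinates, which is the remapping problem the developing brain must solve.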
