






Behavioral Neuroscience – Lecture 4 – Perception II
In this lecture we will go through the auditory system and the mechanosensory/somatosensory systems.
This lecture has three main objectives. By the end of this lecture, students are expected to have learned to:
Lecture literature
Purves, D., Cabeza, R., Huettel, S. A., LaBar, K. S., Platt, M. L., and Woldorff, M. G. (2013). Principles of Cognitive Neuroscience. 2nd ed. (Chapter 4). Sunderland, MA: Sinauer Associates.
Human perception is a mental construction that depends on previous experience. That is, the subjective experience of a stimulus is generated by the interaction of information originating from various sensory modalities with previous experience. In the videos available at the links below, we can see the McGurk effect (https://www.youtube.com/watch?v=G-lN8vWm3m0) and the rubber-hand illusion (https://www.youtube.com/watch?v=sxwn1w7MJvk). These two phenomena are widely studied by cognitive neuroscience and exemplify the effect of multisensoriality (competition between different sensory modalities), the predominance of the visual system over the remaining perceptual systems, and neuroplasticity - the brain's ability to accommodate its functioning to perceptual experience.
Sound
Sound is a brief change in local air pressure, produced by the movement of air molecules. This movement may be sudden (such as when a branch breaks) or maintained over time by a reverberating/resonant body that emits a sound persisting in time. Resonance, or reverberation, is the property that causes objects to vibrate. The same idea applies to the hydrogen atoms in MRI: as they return to their equilibrium position, the atoms emit radio waves. Unlike water waves, which are transverse (up and down), sound waves are longitudinal, that is, their vibration is parallel to their direction of travel, corresponding to a back-and-forth movement. In a graphical representation of a sound wave with time on the x-axis, the graph shows how much each air molecule moves in a cycle of back-and-forth motion around its equilibrium point (https://www.youtube.com/watch?v=-_xZZt99MzY). Depending on the frequency of the vibration's components, longer stimuli can be perceived as a tone (if the vibration is periodic) or as noise (if the vibration is aperiodic). Regardless of whether it is perceived as tone or noise, the compression and rarefaction of air molecules always generates a sound wave. Only a few sounds are generated by simple sinusoidal waves with a periodic frequency. Most sounds in nature are closer to the noise end of the sound spectrum; that is, they are aperiodic. But the human ear is particularly sensitive to periodic stimuli, and these become important components of the auditory environment. As such, the human brain perceives complex auditory stimuli in terms of harmonic series, as if each sound could be decomposed into single periodic waves.
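To make the idea of a harmonic series concrete, the short Python sketch below (my own illustration, not part of the lecture; it assumes NumPy and an arbitrary 220 Hz fundamental) builds a complex periodic tone from a fundamental plus two harmonics and recovers exactly those components with a discrete Fourier transform, one way of formalizing the decomposition of a sound into single periodic waves.

```python
import numpy as np

# Illustrative sketch: a complex periodic tone decomposed into its harmonic
# series with a discrete Fourier transform (fs and f0 are arbitrary choices).
fs = 8000                                  # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)            # one second of samples
f0 = 220.0                                 # fundamental frequency, Hz

# Complex tone = fundamental + 2nd and 3rd harmonics with decreasing amplitude.
signal = (1.00 * np.sin(2 * np.pi * f0 * t)
          + 0.50 * np.sin(2 * np.pi * 2 * f0 * t)
          + 0.25 * np.sin(2 * np.pi * 3 * f0 * t))

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(t), d=1.0 / fs)

# The three largest spectral peaks fall exactly at f0, 2*f0 and 3*f0.
peaks = freqs[np.argsort(spectrum)[-3:]]
print(sorted(float(f) for f in peaks))     # [220.0, 440.0, 660.0]
```

An aperiodic noise signal fed through the same transform would instead show energy spread across all frequencies, with no discrete harmonic peaks.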
The cochlea
The cochlea is the audition organ par excellence, responsible for sound coding. The information coming from the vestibular system is transmitted by the vestibular nerve, and the information coming from the cochlea is transmitted by the acoustic or auditory nerve. The cochlear tube is divided into three sections, which are filled with cochlear (water-like) fluid: the scala vestibuli, the scala tympani, and the cochlear duct. The scala vestibuli (attached to the oval window) and the scala tympani (attached to the round window) join at the apex/termination of the cochlea. The cochlear duct lies between the two and is separated from each scala by a membrane: from the scala vestibuli by Reissner's membrane and from the scala tympani by the basilar membrane. The basilar membrane supports the organ of Corti, where the hair cells (the sound receptors) are located. The hair cells transform sound waves into electrical impulses, transducing acoustic information. There are inner and outer hair cells, covered by the tectorial membrane: three rows of outer hair cells (about 12,000 cells, which receive 5% of the auditory input and have no coding function) and one row of inner hair cells (about 3,500 cells, which receive 95% of the auditory input). When the basilar membrane moves in response to an auditory stimulus, the tectorial membrane also moves, stimulating the hair cells. The movement of the hair cells results in membrane depolarization, eliciting action potentials in the axons of the auditory nerve, whose cell bodies lie in the spiral ganglion. These action potentials inform the primary auditory cortex about the frequency, amplitude and phase of the stimulus. Along the cochlea, the basilar membrane is sensitive to different frequencies, being more sensitive to high frequencies in the base region and to low frequencies in the termination region or apex - a tonotopic organization. The tonotopic organization of the cochlea appears to exist throughout the auditory system, particularly in the primary auditory cortex.
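The tonotopic map of the basilar membrane is often approximated by the Greenwood function (not covered in the lecture, but consistent with the base-to-apex gradient described above). The sketch below uses the commonly quoted human constants; treat the exact numbers as an approximation.

```python
def greenwood_frequency(x):
    """Approximate best frequency (Hz) at relative cochlear position x,
    where x = 0 is the apex and x = 1 is the base (Greenwood, 1990).
    A, a and k are the commonly quoted human constants."""
    A, a, k = 165.4, 2.1, 0.88
    return A * (10 ** (a * x) - k)

# Low frequencies map to the apex, high frequencies to the base.
for x in (0.0, 0.5, 1.0):
    print(f"x = {x:.1f} -> ~{greenwood_frequency(x):,.0f} Hz")
# x = 0.0 -> ~20 Hz;  x = 0.5 -> ~1,710 Hz;  x = 1.0 -> ~20,677 Hz
```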
Primary auditory pathways
Sound travels along the auditory pathway from the outer ear to the CNS, through several "stations". Each station is responsible for a different level of acoustic information processing. There are six main stations: the cochlear nuclei, the superior olivary complex, the lateral lemniscus, the inferior colliculus, the medial geniculate body of the thalamus, and the primary auditory cortex (A1). The first target of the neurons that constitute the auditory nerve and carry the information from the basilar membrane is the cochlear nucleus, located in the rostral region of the medulla of the brainstem. The cochlear nuclei are the first information-integrating centers and receive only ipsilateral information, that is, information coming from the ear on the same side. Their function is mainly to reduce noise and to encode the message in terms of duration, intensity and frequency.
Sound perception
In all sensory modalities, consciously perceived stimuli are defined by subjective characteristics/qualities. Just as luminance, brightness, color, shape, depth and movement are important defining characteristics of visual stimuli, loudness/intensity, pitch and timbre allow the description of the perceived acoustic signal.
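As a rough illustration (my own, not from the lecture), the sketch below maps each perceptual quality onto a physical property of the waveform: amplitude drives loudness, fundamental frequency drives pitch, and the mix of harmonics shapes the timbre. The specific frequencies and weights are arbitrary.

```python
import numpy as np

fs = 8000                                  # sampling rate, Hz (arbitrary)
t = np.arange(0, 0.5, 1.0 / fs)            # half a second of samples

def tone(f0, amplitude, harmonic_weights):
    """Sum of harmonics of f0; the weights shape the timbre."""
    return amplitude * sum(w * np.sin(2 * np.pi * (i + 1) * f0 * t)
                           for i, w in enumerate(harmonic_weights))

quiet    = tone(220, 0.1, [1.0])             # soft pure tone
loud     = tone(220, 0.9, [1.0])             # same pitch and timbre, louder
higher   = tone(440, 0.9, [1.0])             # same loudness and timbre, higher pitch
brighter = tone(220, 0.9, [1.0, 0.6, 0.4])   # same pitch, different timbre
```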
The brain has the ability to "fill in" missing information, giving consistency to the message. For example, even in experimental situations where the fundamental frequency is absent, the brain is able to infer the missing fundamental from the available harmonics. Likewise, although most of the fundamental frequencies of the human voice lie below 300 Hz, the auditory system is able to fill in the missing information, giving rise to the perception of a continuous sound (for example, during a telephone call, which transmits a frequency band that excludes these low fundamentals). This is also behind the clinical success of cochlear implants.
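A toy model of this missing-fundamental effect (my own sketch, not from the lecture) treats the inferred pitch as the greatest common divisor of the harmonic frequencies that do reach the ear, e.g. over a telephone line that passes roughly 300 to 3400 Hz.

```python
from functools import reduce
from math import gcd

def inferred_fundamental(harmonics_hz):
    """Toy model: the pitch the auditory system 'fills in' is the greatest
    common divisor of the harmonics that are physically present, even when
    that frequency itself is absent from the signal."""
    return reduce(gcd, harmonics_hz)

# A 200 Hz voice fundamental is cut by the telephone band, yet its harmonics
# still imply a 200 Hz pitch.
print(inferred_fundamental([600, 800, 1000, 1200]))   # -> 200
```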
Neurons that process the fundamental frequency together with pure tones are not the same as those that respond only to pure tones. The tone center (the region that processes pure sounds and the fundamental frequency even when it is absent, filling in the gaps when the fundamental frequency is missing) is shared by the primary and secondary auditory cortices. In contrast, the neurons that respond specifically to pure tones belong to the primary auditory cortex. The tone center is clearly a more complex, higher-order processing center.
As with vision, auditory perception is complex and is not limited to the processing of pure individual stimuli and their characteristics of loudness, pitch and timbre. What our brain allows us to perceive at every moment is always what is behaviorally and ontologically most relevant, associated with the stimuli that matter most for survival - food, the group, danger, and so on. Rather than discrete acoustic qualities, what we in fact perceive are acoustic scenes.
Acoustic Signal Localization
The human brain has the extraordinary ability to localize stimuli in space with approximately 1 degree of error - the equivalent of about 10 microseconds of timing difference between the ears. This is particularly true in the horizontal plane (i.e., in azimuth) and depends on the distance traveled by the sound waves. Sound waves travel at approximately 340 m per second, so the greater the distance from the sound source to an ear, the longer the sound takes to reach it. For example, if the stimulus is located closer to the left ear, the information arrives at the left ear slightly earlier, allowing the brain to compute the left as the stimulus location - interaural time differences (particularly important for low frequencies). For higher frequencies (> 3000 Hz), a related principle applies: loudness decreases with distance, so the ear nearer to the source perceives a greater loudness than the ear on the other side - interaural intensity differences. The same does not apply to low-frequency stimuli, because low-frequency waves are less affected by the head as an obstacle; only at high frequencies does the head act as an "acoustic shadow". Localization accuracy is high in the longitudinal plane, but lower in the coronal and axial planes.
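The order of magnitude quoted above can be reproduced with a simple path-difference model (a sketch of my own; it assumes an interaural distance of about 0.2 m and ignores the curved path around the head): the interaural time difference is d·sin(θ)/c.

```python
import math

SPEED_OF_SOUND = 340.0    # m/s, as in the lecture
EAR_SEPARATION = 0.20     # m, assumed interaural distance

def interaural_time_difference(azimuth_deg):
    """Simplified path-difference model: delta_t = d * sin(theta) / c."""
    return EAR_SEPARATION * math.sin(math.radians(azimuth_deg)) / SPEED_OF_SOUND

# A source 1 degree off the midline arrives ~10 microseconds earlier at the
# nearer ear - the resolution mentioned above; a source at 90 degrees
# (directly to one side) gives the maximum difference of ~0.6 ms.
print(f"{interaural_time_difference(1) * 1e6:.1f} us")    # ~10.3 us
print(f"{interaural_time_difference(90) * 1e6:.0f} us")   # ~588 us
```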
Mechanosensory / Somatosensory Systems
Beyond sight and hearing, there are other perceptual systems, perhaps even more important from the point of view of species development and survival. Mechanosensory systems are particularly important because they inform the brain about the state of the body and about its direct physical contact with the environment.
As in vision, the receptive field of each receptor type is also different. Some receptors benefit from high resolution and others from low resolution: the smaller the receptive field, the greater its resolution (the sensitivity of the receptor to nearby tactile stimuli). For example, the receptive fields of mechanosensory neurons responding to fingertip receptors are approximately 1 to 2 mm wide, those responding to receptors in the palm of the hand are 5 to 10 mm, and those responding to touch on the forearm measure a few centimeters.
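A toy illustration of this relationship (my own sketch; the forearm value is an assumption standing in for the lecture's "a few centimeters"): two touches are resolved as separate points only when they are farther apart than the local receptive-field size, so the fingertip can distinguish details the palm and forearm cannot.

```python
# Approximate receptive-field diameters, in mm, taken from the lecture
# (the forearm value is an assumed 4 cm).
RECEPTIVE_FIELD_MM = {"fingertip": 2, "palm": 10, "forearm": 40}

def feels_as_two_points(region, separation_mm):
    """Toy rule: two stimuli are distinguished only if they are farther
    apart than one receptive field in that region."""
    return separation_mm > RECEPTIVE_FIELD_MM[region]

for region in RECEPTIVE_FIELD_MM:
    print(region, feels_as_two_points(region, 5))
# fingertip True, palm False, forearm False
```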
Central pathways of the cutaneous/subcutaneous and proprioceptive systems
The processing of stimuli originating in the cutaneous/subcutaneous system and in the proprioceptive system follows the same central pathways. Somatosensory processing includes ascending pathways that start at the peripheral receptors and enter the spinal cord or, via the cranial nerves, the brainstem. The information is then processed through a series of subcortical nuclei before arriving at the ventral posterior complex of the thalamus. Finally, the axons of the thalamic neurons project to the cortical neurons of the primary somatosensory cortex, or S1. The primary somatosensory cortex is located posterior to the central sulcus, in the parietal lobe, immediately behind the primary motor cortex. It includes Brodmann areas 1, 2, 3a and 3b.
Somatosensory information, as in other sensory modalities, is distributed from the primary somatosensory cortex to adjacent cortical regions for further processing and integration. That is, the information processed in its basic constituents is subjected to more complex, associative processing in the secondary somatosensory cortex, or S2, as well as in other areas of the parietal cortex.
These secondary areas, in turn, project to limbic structures, namely the amygdala and the hippocampus, responsible for the processing of emotional responses and memory. Similarly, the motor cortex, located in the frontal lobe immediately anterior to S1, receives projections from S2 and provides feedback to most of the secondary somatosensory regions in fronto-striato-parietal loops of continuous information flow, allowing online behavioral adjustments. In this way, it is possible to coordinate behavioral responses with the somatosensory input, such as the coordination of attention and touch.
In the primary somatosensory cortex we find a cortical mapping similar to what happens in the primary auditory and visual cortices - a somatotopic organization. The cortical representation of the regions of the body is, however, distorted, and does not correspond to the real proportions of each body part. This disproportionality has to do with the sensitivity and density of receptors, and with the sensory processing routes, of each region: the greater the importance and density of receptors, the greater the region's cortical representation.
Neuroplasticity
Neuroplasticity is one of the main characteristics of our brain and refers to its adaptive capacity, revealed in the acquisition of new skills such as mnemonic ability, learning new languages, or the improvement of athletic skills resulting from training. It can also be seen as a result of psychotherapy or as a consequence of a drug protocol. The somatosensory system has shown evidence of great plasticity in both animals and humans. For example, the pain associated with a phantom limb reveals the brain's struggle to adapt to the absence of the lost limb. Conversely, the reduction of phantom limb pain after therapy results from processes of cerebral adaptation - neuroplasticity potentiated by training and promoted in a therapeutic context. This cerebral plasticity is mainly due to the stimulation of the motor and somatosensory regions and the promotion of their plasticity in the corresponding neuronal regions. It is known, however, that neuronal plasticity is