Digital Commons Network

Open Access. Powered by Scholars. Published by Universities.®

Articles 1 - 5 of 5

Full-Text Articles in Entire DC Network

Individual Auditory Categorization Abilities Are Shaped By Intrinsic And Experience-Driven Neural Factors, Kelsey Mankel Jan 2021

Electronic Theses and Dissertations

To make sense of the auditory world, listeners must organize diverse, continuously varying sounds into meaningful perceptual categories. The auditory categorization process is believed to be a foundational skill for language development and speech perception. Despite decades of behavioral research, neuroscientific evidence is only beginning to uncover where, when, and how auditory categories arise in the brain. Although it has been proposed that categorical perception is shaped by both innate (nature) and experience-driven (nurture) factors, it is unclear how these features manifest neurally at the individual level. In the first study of this dissertation, we recorded multi-channel electroencephalography (EEG) in …


Assessing The Relationship Between Talker Normalization And Spectral Contrast Effects In Speech Perception., Ashley Atri Assgari May 2018

Electronic Theses and Dissertations

Speech perception is influenced by context. This influence can help to alleviate issues that arise from the extreme acoustic variability of speech. Two examples of contextual influences are talker normalization and spectral contrast effects (SCEs). Talker normalization occurs when listeners hear different talkers, which makes speech perception slower and less accurate. SCEs occur when spectral characteristics change from context sentences to target vowels, and speech perception is biased by that change. It has been demonstrated that SCEs are restrained when contexts are spoken by different talkers (Assgari & Stilp, 2015). However, what about hearing different talkers restrains these effects …
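
As a rough illustration of how such effects are typically quantified (a minimal sketch with made-up response proportions, not data from Assgari & Stilp, 2015 or this dissertation), one can fit logistic psychometric functions to vowel categorization responses after each context condition and take the SCE magnitude as the shift in the 50% category boundary:

    import numpy as np
    from scipy.optimize import curve_fit

    # Sketch of a common SCE analysis: fit a logistic psychometric function
    # to categorization responses following two differently filtered contexts,
    # then measure the shift in the 50% boundary. All numbers are ILLUSTRATIVE.

    def logistic(x, x0, k):
        """Proportion of one vowel response along the continuum."""
        return 1.0 / (1.0 + np.exp(-k * (x - x0)))

    steps = np.arange(1, 11)  # hypothetical 10-step vowel continuum

    # Hypothetical response proportions after each context filter condition:
    p_low_f1_context = np.array([.02, .05, .10, .22, .45, .70, .86, .93, .97, .99])
    p_high_f1_context = np.array([.01, .02, .05, .10, .22, .45, .70, .86, .94, .98])

    popt_low, _ = curve_fit(logistic, steps, p_low_f1_context, p0=[5.0, 1.0])
    popt_high, _ = curve_fit(logistic, steps, p_high_f1_context, p0=[5.0, 1.0])

    # SCE magnitude = boundary shift (in continuum steps) between conditions.
    boundary_shift = popt_high[0] - popt_low[0]
    print(f"SCE magnitude: {boundary_shift:.2f} continuum steps")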


Predicting Speech Recognition Using The Speech Intelligibility Index (SII) For Cochlear Implant Users And Listeners With Normal Hearing, Sung Min Lee Nov 2017

Electronic Theses and Dissertations

Although the AzBio test is well validated, has effective standardization data available, and is highly recommended for cochlear implant (CI) evaluation, no attempt has been made to derive a Frequency Importance Function (FIF) for its stimuli. In the first phase of this dissertation, we derived FIFs for the AzBio sentence lists using listeners with normal hearing. Traditional procedures described in studies by Studebaker and Sherbecoe (1991) were applied for this purpose. Fifteen participants with normal hearing listened to a large number of AzBio sentences that were high- and low-pass filtered and presented in speech-spectrum-shaped noise at various signal-to-noise ratios. Frequency weights …
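
For orientation, the SII that such frequency weights feed into is a weighted sum of band audibilities: each band's importance (the FIF) multiplied by how audible that band is, summed across bands. A minimal sketch of that computation, using illustrative placeholder weights rather than the AzBio FIF derived in this work:

    import numpy as np

    # Minimal sketch of the basic SII computation (ANSI S3.5 form):
    # SII = sum over bands of (band importance * band audibility).
    # The weights below are ILLUSTRATIVE placeholders, not the AzBio
    # frequency-importance function derived in this dissertation.

    center_freqs_hz = np.array([250, 500, 1000, 2000, 4000, 8000])

    # Hypothetical band-importance weights (sum to 1.0).
    importance = np.array([0.08, 0.14, 0.22, 0.27, 0.20, 0.09])

    # Band audibility: fraction of the 30-dB speech dynamic range above the
    # effective noise floor in each band, clipped to [0, 1].
    snr_db = np.array([-3.0, 2.0, 8.0, 12.0, 15.0, 5.0])  # example band SNRs
    audibility = np.clip((snr_db + 15.0) / 30.0, 0.0, 1.0)

    sii = float(np.sum(importance * audibility))
    print(f"SII = {sii:.2f}")  # 0 = nothing audible, 1 = fully audible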


Neural Coding Of Natural And Synthetic Speech., Allison Brown May 2017

Electronic Theses and Dissertations

The present study examined whether natural and synthetic speech are differentially encoded in the auditory cortex. Auditory event-related potential (ERP) waveforms were elicited by natural and synthetic fricative-vowel stimuli (/sɑ/ and /ʃɑ/) in a passive listening paradigm in adult listeners with normal hearing. ERP response components were compared across conditions. The results indicated that peak latencies to natural speech were significantly earlier than those to synthetic speech. Natural speech also produced significant electrode hemisphere site effects, whereas synthetic speech activated left, midline, and right electrode hemisphere sites equally. Overall, the results suggest that cortical processing of natural and synthetic speech …
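
As an illustration of the kind of peak-latency measurement described here (a minimal sketch assuming epoched single-channel data in a NumPy array; the sampling rate, analysis window, and N1 label are assumptions, not the study's exact analysis):

    import numpy as np

    # Sketch of a peak-latency comparison: average epochs per condition, then
    # find the latency of the most negative deflection in an N1-like window.
    fs = 1000.0                      # sampling rate in Hz (assumed)
    t = np.arange(-100, 500) / fs    # epoch time axis: -100 ms to +499 ms

    def peak_latency_ms(epochs, t, window=(0.080, 0.150)):
        """Average epochs (trials x samples), return peak latency in ms."""
        erp = epochs.mean(axis=0)                  # trial-averaged waveform
        mask = (t >= window[0]) & (t <= window[1])
        idx = np.argmin(erp[mask])                 # most negative sample
        return t[mask][idx] * 1000.0

    # Hypothetical epoch arrays standing in for the two stimulus conditions:
    rng = np.random.default_rng(0)
    natural_epochs = rng.normal(size=(200, t.size))
    synthetic_epochs = rng.normal(size=(200, t.size))

    print("natural N1 latency (ms):", peak_latency_ms(natural_epochs, t))
    print("synthetic N1 latency (ms):", peak_latency_ms(synthetic_epochs, t))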


Speech Perception And The McGurk Effect: A Cross-Cultural Study Using Event-Related Potentials., Jia Wu May 2009

Electronic Theses and Dissertations

Previous research has indicated the important role of visual information in the speech perception process. These studies have elucidated the areas of the brain involved in the processing of audiovisual stimuli. The McGurk effect, an audiovisual illusion, has been demonstrated to be a useful tool in the study of audiovisual integration. Brain imaging research suggests that the McGurk effect is modulated by brain structures in the Superior Temporal Gyrus, Supplemental Temporal Cortex, and Brodmann area 41. Electrophysiological studies suggest that the McGurk effect generates a different brainwave than the standard audiovisual congruent condition in frontal, central, and parietal regions among …