The aim of the present investigation was to evaluate the effect of visual feedback on rated voice quality severity and on the reliability of voice quality judgments by inexperienced listeners. For this purpose, two training programs were created, each lasting two hours. In total, 37 undergraduate speech–language therapy students participated in the study and were divided into a visual plus auditory-perceptual feedback group (V + AF), an auditory-perceptual feedback group (AF), and a control group with no feedback (NF). All listeners completed two rating sessions in which they judged overall severity (grade, G), roughness (R), and breathiness (B). Each judged voice sample was a concatenation of continuous speech and sustained phonation. No significant differences in rater reliability between the three groups were found from pretest to posttest for any GRB parameter (all p > 0.05). A training effect was seen in the significant improvement of rater reliability for roughness within the NF and AF groups (all p < 0.05) and for breathiness within the V + AF group (p < 0.01). The rated severity of roughness changed significantly after training in the AF and V + AF groups (p < 0.01), and the rated severity of breathiness changed significantly after training in the V + AF group (p < 0.01). Thus, V + AF and AF training may only minimally influence the reliability of voice quality judgments, but it significantly influenced the rated severity of the GRB parameters. Therefore, the use of both visual and auditory anchors during rating, as well as longer training sessions, may be required before firm conclusions can be drawn.
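The abstract does not state which reliability statistic was used. As a minimal sketch of how rater reliability for a single GRB parameter could be quantified before and after training, the following computes a two-way random-effects, absolute-agreement intraclass correlation, ICC(2,1) (Shrout & Fleiss); the data layout and all names here are hypothetical, not taken from the study.

import numpy as np

def icc_2_1(ratings):
    """Two-way random-effects, absolute-agreement, single-rater ICC(2,1).

    ratings: (n_samples, k_raters) array of severity scores for one
    GRB parameter in one rating session.
    """
    n, k = ratings.shape
    grand_mean = ratings.mean()
    row_means = ratings.mean(axis=1)   # per voice sample
    col_means = ratings.mean(axis=0)   # per listener

    # Partition total variability into samples, raters, and residual error.
    ss_rows = k * ((row_means - grand_mean) ** 2).sum()
    ss_cols = n * ((col_means - grand_mean) ** 2).sum()
    ss_err = ((ratings - grand_mean) ** 2).sum() - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))

    # Shrout & Fleiss ICC(2,1)
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Hypothetical example: 10 voice samples rated by 5 listeners on a 0-3
# scale, once before and once after training.
rng = np.random.default_rng(0)
pre = rng.integers(0, 4, size=(10, 5)).astype(float)
post = rng.integers(0, 4, size=(10, 5)).astype(float)
print(f"ICC(2,1) pretest:  {icc_2_1(pre):.2f}")
print(f"ICC(2,1) posttest: {icc_2_1(post):.2f}")

Comparing such pre- and posttest coefficients within each feedback group would mirror the reliability comparisons reported above.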
Different inputs from a multisensory object or event are often integrated into a coherent and unitary percept, despite differences in sensory formats, neural pathways, and processing times of the involved modalities. Presumably, multisensory integration occurs if the cross-modal inputs are presented within a certain window of temporal integration in which the inputs are perceived as simultaneous. Here, we examine the role of ongoing neuronal alpha (i.e., ~10 Hz) oscillations in multimodal synchrony perception. While EEG was recorded, participants performed a simultaneity judgement task with visual stimuli preceding auditory ones. At stimulus onset asynchronies (SOAs) of 160–200 ms, the proportion of simultaneity judgements was around 50%. For trials with these SOAs, occipital alpha power was smaller preceding correct judgements, and the individual alpha frequency was correlated with the size of the temporal window of integration. In addition, simultaneity judgements were modulated as a function of oscillatory phase at 12.5 Hz, but the latter effect was only marginally significant. These results support the notion that oscillatory neuronal activity in the alpha frequency range, which has been taken to shape perceptual cycles, is instrumental in multisensory perception.
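The size of the temporal window of integration is commonly estimated by fitting a psychometric function to the proportion of "simultaneous" responses across SOAs. As an illustrative sketch (not necessarily the authors' exact procedure), the following fits a Gaussian to hypothetical visual-leading data and reads off the SOA at which the fitted curve crosses 50%, the region reported above as 160–200 ms; all data values are made up.

import numpy as np
from scipy.optimize import curve_fit

def gaussian(soa, amplitude, mu, sigma):
    """Proportion of 'simultaneous' responses as a function of SOA (ms)."""
    return amplitude * np.exp(-((soa - mu) ** 2) / (2 * sigma ** 2))

# Hypothetical data: visual-leading SOAs (ms) and the proportion of
# "simultaneous" judgements observed at each SOA.
soas = np.array([0.0, 40.0, 80.0, 120.0, 160.0, 200.0, 240.0, 280.0])
p_sim = np.array([0.95, 0.92, 0.85, 0.70, 0.52, 0.48, 0.20, 0.08])

(amplitude, mu, sigma), _ = curve_fit(gaussian, soas, p_sim,
                                      p0=[1.0, 0.0, 100.0])

# SOA at which the fitted curve drops to 50% "simultaneous" responses;
# this crossing point is one way to delimit the window of integration.
half_point = mu + sigma * np.sqrt(2.0 * np.log(2.0 * amplitude))
print(f"sigma = {sigma:.1f} ms, 50% crossing at {half_point:.0f} ms")

A wider fitted sigma corresponds to a larger window of integration, the quantity the abstract reports to covary with individual alpha frequency.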