Phonetik und Sprachverarbeitung
Publications

Here you will find a list of publications by researchers employed at or affiliated with the Institut für Phonetik und Sprachverarbeitung. You can search the list and sort it by year or by publication type.

The complete list can be downloaded in BibTeX format:
Download publication list (BibTeX)

Since its founding in 1972, the IPS published 39 issues of the "Forschungsberichte des Instituts für Phonetik und sprachliche Kommunikation der Universität München (FIPKM)". The series was discontinued in 2002. Some of the issues from 1996 to 2002 are available online; other issues are available in printed form on request.
Further information


Search


Regular expression, case-insensitive, matched against all BibTeX fields (author, title, etc.)


One or more years or year ranges, e.g.
1993
1995-1998
08-
-99,02-06,14-
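The year filter above accepts comma-separated lists of single years and open or closed ranges. As a minimal sketch of how such a specification could be matched against a publication year, assuming two-digit years are windowed to 1950–2049 (the actual behaviour of the site's search backend is not documented here):

```python
def expand_year(s: str) -> int:
    """Expand a year token to four digits.
    Assumption: two-digit years 00-49 map to 2000s, 50-99 to 1900s."""
    y = int(s)
    if y < 100:
        y += 2000 if y < 50 else 1900
    return y


def year_matches(spec: str, year: int) -> bool:
    """Check a publication year against a spec such as
    '1993', '1995-1998', '08-', or '-99,02-06,14-'.
    A missing bound in a range means the range is open on that side."""
    for part in spec.split(","):
        if "-" in part:
            lo, _, hi = part.partition("-")
            lo_ok = not lo or year >= expand_year(lo)
            hi_ok = not hi or year <= expand_year(hi)
            if lo_ok and hi_ok:
                return True
        elif year == expand_year(part):
            return True
    return False
```

For example, `year_matches("-99,02-06,14-", 2004)` is true because 2004 falls inside the `02-06` range, while `year_matches("08-", 2006)` is false because the range is open only toward later years.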





Reference

Schmid, G., Ziegler, W. (2006). Audio-Visual Matching of Speech and Non-Speech Oral Gestures in Patients with Aphasia and Apraxia of Speech. Neuropsychologia, 44(4), 546-555.

BibTeX

@article{ekn_bibtex_00147,
  title = {Audio-Visual Matching of Speech and Non-Speech Oral Gestures in Patients with Aphasia and Apraxia of Speech},
  shorttitle = {Audio-Visual Matching of Speech and Non-Speech Oral Gestures in Patients with Aphasia and Apraxia of Speech},
  author = {Schmid, G. and Ziegler, W.},
  year = {2006},
  journal = {Neuropsychologia},
  volume = {44},
  number = {4},
  eprint = {16129459},
  eprinttype = {pubmed},
  pages = {546--555},
  abstract = {BACKGROUND: Audio-visual speech perception mechanisms provide evidence for a supra-modal nature of phonological representations, and a link of these mechanisms to motor representations of speech has been postulated. This leads to the question if aphasic patients and patients with apraxia of speech are able to exploit the visual signal in speech perception and if implicit knowledge of audio-visual relationships is preserved in these patients. Moreover, it is unknown if the audio-visual processing of mouth movements has a specific organisation in the speech as compared to the non-speech domain. METHODS: A discrimination task with speech and non-speech stimuli was applied in four presentation modes: auditory, visual, bimodal and cross-modal. We investigated 14 healthy persons and 14 patients with aphasia and/or apraxia of speech. RESULTS: Patients made substantially more errors than normal subjects on both the speech and the non-speech stimuli, in all presentation modalities. Normal controls made only few errors on the speech stimuli, regardless of the presentation mode, but had a high between-subject variability in the cross-modal matching of non-speech stimuli. The patients' cross-modal processing of non-speech stimuli was mainly predicted by lower face apraxia scores, while their audio-visual matching of syllables was predicted by word repetition abilities and the presence of apraxia of speech. CONCLUSIONS: (1) Impaired speech perception in aphasia is located at a supra-modal representational level. (2) Audio-visual processing is different for speech and non-speech oral gestures. (3) Audio-visual matching abilities in patients with left-hemisphere lesions depend on their speech and non-speech motor abilities}
}
