
The calendar is also available in CSV and in ICS/iCal format.



Informatik-Oberseminar: Handling Multimodality and Scarce Resources in Sign Language Machine Translation

04.08.2016, 14:00 (Informatik-Zentrum, Ahornstraße 55, building E3, room 9222)

Speaker: Dipl.-Inform. Christoph Schmidt

Title: Handling Multimodality and Scarce Resources in Sign Language Machine Translation

Abstract:

In the field of statistical machine translation, the translation of sign languages poses an interesting and challenging problem. Since signed languages differ in grammar, vocabulary and expression from spoken languages even within the same country, the signs have to be translated into a spoken language text. In sign languages, meaning is conveyed simultaneously not only via the two hands, but also by facial expressions, body posture, head movement, and eye gaze. Because of this complex and multimodal nature of sign languages, there is no common writing system, and the scientific question of an annotation scheme suitable for machine translation remains open. Another difficulty when applying statistical methods to sign language translation is the lack of a sufficient amount of training data. This data scarcity often leads to poor translation results. Moreover, the multimodal nature of sign language is not handled by current translation systems, which usually process sequences of words.

In this thesis, we approach the above three problems: finding a suitable annotation scheme, dealing with small amounts of annotated data, and handling multimodality in the machine translation process. To find an annotation scheme suitable for machine translation, we analyse the importance of additional information such as mouthings, locations in the signing space or simultaneous signing of two different signs with both hands and devise an improved way of including it in the process of translation. To deal with data scarcity and the resulting poor word alignments, we improve the automatic alignment by applying a morphosyntactic as well as a semantic analysis to find corresponding signs and phrases.

To handle the multimodality of sign languages in statistical machine translation, we present two approaches. Firstly, we automatically adapt the granularity of the annotation by distinguishing signs with the same hand movements but different mouthings based on an automatic extraction of lip movements. Secondly, we use the mouthing directly in the decoding process, using both the information signed by the hands and the mouthing as an input to the decoder. By approaching the three issues of a suitable annotation, of data sparseness and of multimodality, we arrive at a translation system which can handle the multimodal sign language input and which is well beyond the performance of a standard translation system that only translates the manual component of a sign language utterance.
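
The first of the two approaches above, adapting the annotation granularity with mouthing information, can be illustrated with a small sketch. The following Python snippet is a hypothetical illustration, not the system described in the thesis: it splits glosses that share the same manual form but occur with different mouthings into separate factored tokens of the form GLOSS|mouthing. All names and the toy data are invented for this example.

from collections import defaultdict

def refine_glosses(sentences):
    """sentences: list of sentences, each a list of (gloss, mouthing) pairs.
    Returns the sentences as factored tokens plus the observed mouthing
    variants per gloss."""
    variants = defaultdict(set)
    for sent in sentences:
        for gloss, mouthing in sent:
            variants[gloss].add(mouthing)

    refined = []
    for sent in sentences:
        tokens = []
        for gloss, mouthing in sent:
            if len(variants[gloss]) > 1 and mouthing:
                # same manual sign, different mouthings: keep them apart
                tokens.append(f"{gloss}|{mouthing}")
            else:
                tokens.append(gloss)
        refined.append(tokens)
    return refined, variants

if __name__ == "__main__":
    corpus = [
        [("HOUSE", "haus"), ("GO", "gehen")],
        [("HOUSE", "zuhause"), ("GO", "gehen")],
    ]
    refined, _ = refine_glosses(corpus)
    print(refined)  # [['HOUSE|haus', 'GO'], ['HOUSE|zuhause', 'GO']]

In an actual system, the mouthing factor would come from automatic lip-movement extraction and the factored tokens would then feed the translation model, but those steps are beyond this sketch.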

The lecturers of the Department of Computer Science cordially invite you to attend.

08.06.2016, mbr

Aachen 2025 - Experiencing Digital Change

23.09.2016 - 25.09.2016

The digitalisation of everyday life is advancing. New technologies that influence and change our lives are emerging by the hour. How do these technologies affect my life? What will Aachen be like in ten years, in 2025?

Time to engage with the opportunities, and also with the challenges, that arise from this in an exciting, entertaining and captivating way.

Aachen 2025 is supported by Prof. Stefan Kowalewski and Prof. Manfred Nagl.

Further information: www.aachen2025.de

14.12.2015, mbr

PromotionsCafé: Perfect Appearance, Small Talk, Gender-Specific Communication, Successful Self-Marketing

02.11.2016, 16:00-17:30

Our guest, Melanie Götze, a systemic consultant and freelancer in Düsseldorf, will share proven tips and clever tricks.

All doctoral candidates are cordially invited to this event!

Further information

15.06.2016, mbr