Adapter layers have recently proven to be flexible and lightweight mechanisms for adapting multi-lingual translation models. In this internship we plan to explore their use for speech-to-text translation, as a way of leveraging mono-lingual data to translate from and to new languages in an unsupervised way.
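To give a flavour of the idea (this is an illustrative sketch, not code from the project): a typical bottleneck adapter is a small residual module inserted into an otherwise frozen pretrained model, consisting of a down-projection, a nonlinearity, and an up-projection. All names and dimensions below are hypothetical.

```python
import numpy as np

def make_adapter(d_model, d_bottleneck, rng):
    # Illustrative bottleneck adapter: a small trainable module that can be
    # inserted into a frozen pretrained model. Weights are randomly
    # initialized here purely for demonstration.
    w_down = rng.standard_normal((d_model, d_bottleneck)) * 0.02
    w_up = rng.standard_normal((d_bottleneck, d_model)) * 0.02

    def adapter(x):
        # Down-project, apply ReLU, up-project, then add the residual
        # connection so the adapter starts close to the identity.
        h = np.maximum(x @ w_down, 0.0)
        return x + h @ w_up

    return adapter

rng = np.random.default_rng(0)
adapter = make_adapter(d_model=512, d_bottleneck=64, rng=rng)
x = rng.standard_normal((10, 512))  # e.g. 10 token representations
y = adapter(x)
print(y.shape)
```

Because only the small adapter weights are trained, one adapter per language (or language pair) can be added cheaply while the shared backbone stays fixed.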
Naver Labs Europe
– PhD or research master's student in NLP, speech, or machine learning, with an interest in language technologies
– Familiarity with modern machine learning as applied to NLP, evidenced by publications in the domain
– Familiarity with deep learning frameworks and Python