Overview

About the role

The researcher's primary responsibility will be to develop and train
architectures for conversational agents that can interpret coreference
and reference and that can be embedded in virtual worlds such as
Minecraft.

About you

We seek outstanding researchers with a genuine interest in
conversational agents, human-in-the-loop learning, and coreference/
reference, ideally all of the above. The ideal candidates will have an
extensive background in deep learning methods applied to NLP in at
least one of the areas above. We would prefer candidates who already
hold a PhD in conversational agents, machine learning, or natural
language processing; such candidates will be considered for a
postdoctoral position. Candidates who have essentially completed their
PhD work and are only waiting for the defence/viva will also be
considered, initially for a Research Assistant position, which can be
converted into a postdoctoral position once the candidate receives
their PhD.

The post

The post is based at the Mile End Campus in London. It is a full-time,
fixed-term appointment for up to 32 months, available from June 2022 or
as soon as feasible thereafter. The starting salary will be in the
range of £32,087-£33,824 per annum for a Research Assistant and
£34,733-£38,655 per annum for a Postdoctoral Assistant, inclusive of
London Allowance.

Research Environment

The researcher will work with Massimo Poesio, Julian Hough, Diego
Perez-Llebana, Matt Purver, and Chris Madge from the CogSci and Game AI
groups at Queen Mary University of London, and with Richard Bartle, Jon
Chamberlain, and Juntao Yu from the NLP group at the University of Essex.

The Cognitive Science Research Group at Queen Mary University of London
(CogSci) is a leader in the areas of conversational agents and dialogue,
coreference and reference, and the use of games-with-a-purpose (GWAPs)
to collect data for NLP. With 9 faculty members, 10 postdocs and 27 PhD
students, CogSci's Computational Linguistics Lab
(http://compling.eecs.qmul.ac.uk/) is one of the largest NLP labs in the
UK.

The Game AI Research Group (GAIG) at Queen Mary University of London
(https://gaigresearch.github.io/) consists of 8 faculty members, 2
postdocs, and 25 PhD students. It is the largest Game AI group in the
UK and a leader in research on developing intelligent artificial agents
able to act as non-player characters in games. GAIG also runs the
Intelligent Games and Games Intelligence (IGGI; EP/S022325/1) Doctoral
Training Centre.

The Natural Language and Information Processing group (NLIP) at the
University of Essex (https://essexnlip.uk/) is one of the oldest NLP
groups in the UK and has recently become a pioneer in the area of
crowdsourcing, in particular with games-with-a-purpose such as Phrase
Detectives (http://www.phrasedetectives.com), the result of a
collaboration between NLIP and the equally long-established Essex Games
and AI group, which developed MUD, the oldest virtual world in existence.

The wider context

London is a vibrant city and one of the most active centres of AI/NLP
research in the world. It is home to the Alan Turing Institute, to
several universities active in AI and NLP, and to numerous AI companies
both large and small.

Company:

Queen Mary University of London

Qualifications:

See "About you" above.

Educational level:

PhD
