Postdoctoral Research Assistant/Associate in
Computing the Face Syntax of Social Communication
Dr. Rachael Jack is delighted to announce the opening of a 3-year ERC-funded 
postdoctoral researcher position on the project Computing the Face Syntax of 
Social Communication at the Institute of Neuroscience & Psychology and School 
of Psychology at the University of Glasgow, Scotland, UK.

The Project. This ambitious project aims to mathematically model the human face 
as an algebraic generator of dynamic social signals and build a psychologically 
and culturally valid generative model of social face signalling that is 
transferable to social robots. The project will use a multidisciplinary
approach that combines social and cultural psychology with dynamic 3D 
structural face computer graphics, vision science psychophysical methods, and 
mathematical psychology. Given that the project involves interdisciplinary 
knowledge and skills, the ideal candidate will have experience in both 
computational work (e.g., programming) and social psychology, for example via a 
joint degree or related research experience/interests.

Research Environment. The successful applicant will experience a unique and 
intellectually stimulating research environment within the Institute of 
Neuroscience & Psychology, undertake a specific programme of specialist 
research skill development, and contribute to progressing an internationally 
competitive and strategic research agenda. The applicant will have access to 
(1) a unique, state-of-the-art 4D structural face imaging technology and 
dynamic face movement generator; (2) specialist in-house training on advanced 
quantitative methods and statistical analyses (e.g., 4D image processing, model 
fitting); (3) postdoctoral communities; (4) a dedicated full-time Research 
Technologist specializing in 3D and 4D computer graphics; (5) a dedicated 
full-time computing support team who provide data storage (>5 Petabytes), 
high-security data management systems, high-performance equipment and software; 
(6) a secure online Subject Pool (7,000+ members, 106 nationalities); (7) 
international collaborators; and (8) a full suite of brain imaging facilities 
including 7T fMRI, MEG, EEG, and TMS.

The Team
Principal Investigator:                     Dr. Rachael E. Jack
http://www.gla.ac.uk/schools/psychology/staff/rachaeljack/

The successful applicant will join an internationally renowned, 
high-performance interdisciplinary research team and receive regular, close 
mentorship from PI Jack and collegial interaction with other lab members via 
lab meetings. The successful applicant will develop and apply state-of-the-art 
specialist skills and knowledge of social face perception and face signalling 
including 3D & 4D face capture and generation, advanced MATLAB programming, lab 
testing booth preparation, high-volume data collection, mathematical 
modelling of 3D dynamic face signals, analysis of high-dimensional data, scientific 
writing, and producing high-quality data visualizations for presentations and 
high-profile publications. The successful applicant will also have the 
opportunity to present at national and international academic conferences, 
participate in public engagement activities, and submit their work to 
high-impact and specialist peer-reviewed academic journals. Successful 
applicants may also have the opportunity to work with other interested 
parties (e.g., social robotics designers).

Affiliate labs. The Jack lab regularly interacts with and has joint lab 
meetings with the following labs:

Prof. Stacy Marsella
https://www.gla.ac.uk/researchinstitutes/neurosciencepsychology/staff/stacymarsella/

Prof. Philippe G. Schyns
http://www.gla.ac.uk/researchinstitutes/neurosciencepsychology/staff/philippeschyns/

Start date: May 2020 (negotiable)

Closing date: 20th February 2020

Reference number: 032999

Apply here: https://www.jobs.ac.uk/job/BYD853/research-assistant-associate