[visionlist] Frontiers Research Topic “Discrimination of Genuine and Posed Facial Expressions of Emotion”
Posted: November 29, 2018
Call for Papers:
Frontiers Research Topic
“Discrimination of Genuine and Posed Facial Expressions of Emotion”
About this Research Topic
Facial expressions convey our emotional states in interpersonal situations. Evidence shows that part of the facial display reflects the emotional experience actually felt by the expresser. Interestingly, human beings are also capable of posing facial expressions of emotions they do not feel, a form of intentional deception used to manage social interactions and to stage displays that win the support of others. Staged or posed facial expressions present an emotion that the expresser intends to convey, whereas genuine expressions accompany spontaneous emotional experience. The ability to differentiate genuine displays of emotional experience from posed ones is very important for navigating day-to-day social interactions.
Recent work has examined whether people can distinguish between posed and genuine displays of emotion. Although only a few studies have investigated this ability, most prior research suggests that people can judge whether a facial display is genuine or posed. Unfortunately, previous research has suffered from two major shortcomings: (1) a mixture of staged and genuine displays, due to a failure to account for possible effects of intentional manipulation, and (2) a failure to consider dynamic aspects when preparing facial stimuli for experimental investigation.
This Research Topic encourages the submission of theoretical and experimental work to broaden understanding of the importance of discriminating genuine and posed facial expressions of emotion. Contributions may include new theoretical approaches, approaches from other disciplines of psychology not usually applied to the discrimination of genuine and staged emotion, or new theories and experimental designs.
We seek articles that present new hypotheses, concepts, experimental observations, theories or models; demonstrate how theories adapted from other disciplines may be applied to emotion identification; or provide recommendations to improve current models or theories and enhance their capacity. We welcome papers that combine critical analysis of current models with syntheses of earlier empirical work based on single- or multi-modality analysis (e.g., video, ECG, EEG, fMRI, and single-cell recordings in humans and/or animals), and that explore their potential development in the context of emotion recognition.
Submissions of systematic reviews and meta-analyses should discuss and promote comprehensive approaches to updating and evolving concepts, hypotheses, and theories with potential applications in the community.
Keywords: facial expression, discrimination, genuine, posed, emotion
Topic Editors:
Huiyu Zhou, Department of Informatics, University of Leicester, United Kingdom.
Caroline Ling Li, School of Computing, University of Kent, United Kingdom.
Shiguang Shan, Institute of Computing Technology, Chinese Academy of Sciences, China.
Shuo Wang, Chemical and Biomedical Engineering, West Virginia University, United States.
Jian Liu, Centre for Systems Neuroscience, University of Leicester, United Kingdom.
The Center for Visual Science at the University of Rochester announces its 2019 Summer Research Fellowship Program in Vision Science.

This is an excellent program for outstanding undergraduate students seeking to expand their research experience. Our fellowship will cover travel expenses for US citizens/residents (and up to $500 for foreign students), cover the majority of on-campus housing expenses, and provide a $4,054 stipend for a summer of continuous training in a faculty member’s lab. The application deadline is February 4.
If you know qualified undergraduates, please share this link:
Here is a PDF poster that can be posted on bulletin boards:
of Voice and Music Processing
A postdoctoral position (PostDoc) and a PhD position are available
at the University of Zurich (Switzerland) in the Cognitive and
Affective Neuroscience Lab (Prof. Sascha Frühholz; https://ift.tt/2SlPjYh);
the lab is located at the Psychological Institute of the
University of Zurich.
Both positions are available for 3–4 years. The earliest possible
starting date for the positions is February 1st, 2019,
or later. Please see the attachments for further information.
The candidates will work on a project about neurofeedback
influences on the dynamic perception of auditory signals,
especially of voices and music, using real-time neurofeedback
fMRI. The project has two major parts: the PhD candidate will be
mainly responsible for conducting fMRI studies in the field of voice
processing, while the PostDoc candidate will be mainly
responsible for implementing and running fMRI studies in the
field of music processing. The PhD and PostDoc candidates are
expected to work closely together on the research projects.
The University of Zürich provides an excellent research environment
at the Department of Psychology (https://ift.tt/2E3nSze).
We also offer a multidisciplinary and interactive
neuroscientific environment within the framework of the
Neuroscience Center Zürich (https://ift.tt/2SlPl2l).
Applications and informal inquiries about the positions should be sent to
Prof. Sascha Frühholz (email@example.com). Applications will be considered until
the positions are filled.
I am looking for a highly motivated Postdoctoral Research Associate in Computer Vision and Computer Graphics to work on dynamic scene reconstruction for 6-DoF VR Video.
This is a full-time, fixed-term appointment until June 2021 (32 months from now). The starting salary will be £33,199–£39,609. The application deadline is Tuesday, 4 December 2018, with an anticipated interview date of Tuesday, 18 December 2018. The successful candidate is expected to start as soon as possible.

Project context: To feel truly immersed in virtual reality, one needs to be able to look around freely within a virtual environment and see it from the viewpoints of one’s own eyes. Immersion requires ‘freedom of motion’ in six degrees of freedom (‘6-DoF’), so that viewers see the correct views of an environment. As viewers move their heads, the objects they see should move relative to each other, at different speeds depending on their distance to the viewer. This effect, called motion parallax, is a vital depth cue for the human visual system that is entirely missing from existing 360° VR video.

To achieve 6-DoF VR video that enables photorealistic exploration of dynamic real environments in 360° virtual reality, this project will develop novel video-based capture, 3D reconstruction and rendering techniques. We will first explore different approaches for capturing static and (more challenging) dynamic 360° environments, including 360° cameras and multi-camera rigs. We will start with 6-DoF 360° VR photographs (i.e. static scenes) and then extend our approach to 6-DoF VR videos. Our aim is to publish the resulting research in leading computer vision and computer graphics venues, specifically CVPR/ICCV/ECCV, IJCV and/or SIGGRAPH (Asia)/Transactions on Graphics.

Job description: The main focus of this job is to translate research algorithms into plugins for industry-standard software such as Foundry’s CaraVR/Nuke, for example for reconstructing the 3D geometry of environments from captured 360° images and videos.
Extending image-based rendering to 360° environments will then enable 6-DoF motion within a photorealistic 360° environment with high visual fidelity, and will result in detailed 360° environments covering all possible viewing directions. Another focus of this job is to oversee software development within the project team (and potentially other CAMERA projects), ensuring best practices in software design, implementation, testing, version control and documentation.

You will join the vibrant Visual Computing group at Bath, which comprises around 30 doctoral students, 5 post-doctoral researchers and 8 academics, and offers many opportunities for collaborative work and shared publications.

To find out more about this position, the application procedure, the project and/or our group, please contact Christian Richardt (firstname.lastname@example.org). Please forward this email to anyone who might be interested in these positions.

Many thanks,
Dr Christian Richardt
Assistant Professor
University of Bath
United Kingdom