[visionlist] Postdoctoral fellow: Neural dynamics of scene processing

The labs of Michelle R Greene (Bates College) and Bruce C Hansen
(Colgate University) are seeking a highly motivated postdoctoral fellow for a
2-year NSF-funded project investigating the neural dynamics of scene
categorization. The project offers an opportunity to engage in cutting-edge research combining high-density EEG with machine learning, computational modeling, behavioral psychophysics, and neuromodulation.

This position is especially suited for a candidate interested in pursuing an academic career in a liberal arts setting. There will be ample opportunity to hone mentorship skills with undergraduate assistants. The candidate will interact extensively with both PIs, with opportunities for extended work at Colgate University as well.

The position is available starting August 1, 2018. Application materials should include a CV, a brief statement of research goals, and the names and contact information of three references. Please email all application materials to both PIs (mgreene2@bates.edu, bchansen@colgate.edu). Informal inquiries may be sent to either PI.

About Bates College: Founded in 1855, Bates is one of the nation’s
leading liberal arts colleges, with a long history of commitments to principles
of human dignity and diversity.

Since its inception, Bates has been a coeducational, nonsectarian, residential college with special commitments to academic rigor and to programs and opportunities for qualified learners of all backgrounds. Historically, Bates students and graduates have linked education with service, leadership, and obligations beyond themselves.

Admission to Bates is highly competitive; the college graduates over 90% of its entering students, and over half of its alumni earn graduate degrees. Bates has 1,700 students, 200 faculty members, and 550 staff and administrative employees.

The College is proud of its strong involvement in the Lewiston-Auburn communities, together Maine's second-largest urban area with a population of approximately 65,000. Bates is located on a beautiful, 109-acre, traditional New England campus in Lewiston, Maine, a lively small city enjoying an exciting economic and cultural revitalization. With an engaged citizenry, a lively arts scene, and a historic downtown, this walkable city is 35 miles north of Portland, 140 miles north of Boston, and 340 miles north of New York City.


Re: [visionlist] About help on eye-tracking of head free non-human primates

Dear all,

I am Stefan, a visiting assistant professor at the Department of Neurophysics at the University of Marburg and Chief Scientific Officer at Thomas RECORDING GmbH, Germany.
We faced the same problems in the past and are currently working on a solution that allows freely moving primates to perform behavioral tasks in their home cages or in special arenas, in combination with eye tracking and single-cell recordings.
We recently finished the first step: a training and experimental unit for freely moving primates, which is commercially available from Thomas RECORDING (please see: https://ift.tt/2zJOHHm).
You can find a demo video of the system on Youtube: https://youtu.be/yDOZauNSwqs

In short: the system consists of a ruggedized tablet computer, a flexible cage-mountable holding device, and an integrated reward unit. Currently, the built-in front-facing camera can be used to monitor the animal and its overall behavior. However, we are working on a software update to add basic eye-tracking features (fixation control, saccade/antisaccade tasks, etc.) to the system.
Furthermore, a trigger interface for synchronization with chronic recording devices (e.g., the wireless version of the AMEP system, https://ift.tt/2mrnz6J) is in development.

Taken together, I think this system should meet most of your requirements regarding eye tracking and single-unit recordings in freely moving primates. You can start training your animals with the system now and get them used to the new environment; in the near future, you will be able to upgrade the device with a new software package that lets you track the primate's eyes and synchronize your behavioral and eye-tracking data with your physiological recordings.

If you have further questions or suggestions, please feel free to
contact me anytime.

Best regards,

Dr. Stefan Dowiasch

On 14.07.2018 at 21:09, visionlist-request@visionscience.com wrote:

Date: Sat, 14 Jul 2018 12:00:45 +0530
From: Harish Katti
To: visionlist@visionscience.com
Subject: [visionlist] About help on eye-tracking of head free
non-human primates

Dear all,
I am Harish, a post-doctoral fellow in Dr SP Arun’s experimental
vision group at the Centre for Neuroscience, Indian Institute of Science.
I’m posting this to get feedback from researchers who have tried automated
eye-gaze/head-pose/body-pose tracking of freely moving non-human primates.

In our lab we are trying to set up eye tracking in monkeys without any head restraints. Our plan is to have a behavioural arena where the animal is not head-fixed and can come up to a touch screen and perform simple tasks in return for juice rewards. Since the animals are not head-fixed, the eye tracking needs to be done in a manner that can handle changes in body and head pose. We have been evaluating a few commercial eye-tracking systems but find that the trackers have difficulty finding the face/eyes. It would be great to have your input on the following issues:

1. Is there a good eye-tracking system that already has macaque face appearance templates built in?

2. Are there any novel ways of placing the screen and tracker that result in better eye tracking? We have tried placing trackers below the screen and at various distances from the animal.

3. Are there multi-camera eye-tracker systems that we can set up from different viewpoints so that one or more always has a clear view of the animal?

4. Do these systems have hardware inputs for behavioral event markers and analog/digital outputs of eye-gaze data so that we can sync them with our neural data acquisition?

best,
Harish


[visionlist] New application deadline: Neuroscience data analyst opening at European Neuroscience Institute Göttingen, Germany

Please note the new application deadline: July 31st, 2018.

The European Neuroscience Institute
Göttingen (ENI-G) is seeking a

Neuroscience data analyst

initially limited until 31.12.2019, with extension options; full-time | salary according to TV-L

About us

The University Medical Center Göttingen is a tertiary care center with great development potential. Its 7,700 employees work in over 65 departments and facilities to provide top-quality patient care, excellent research, and modern teaching. Located in Göttingen, the “City of Science” near the center of Germany, the University Medical Center is embedded in the city’s attractive network of scientific research facilities.

The European Neuroscience Institute is looking to fill a full-time data analyst position and seeks a candidate with interest and experience in systems neuroscience. Research at the European Neuroscience Institute ranges from molecular biology to human psychophysics and involves a range of model organisms (from Drosophila to non-human primates), techniques, and approaches (electrophysiology, two-photon imaging, fMRI, EEG, behavior). The data analyst will work closely with various research groups at the European Neuroscience Institute, supporting research efforts, e.g., through modelling and statistical analyses of high-dimensional data, image processing, and programming/development of experiments. S/he will have the opportunity to develop and publish, e.g., analytical tools that arise from this work.

The applicant should possess a university degree in a
relevant field, e.g., statistics, biostatistics, informatics, or similar.
Prior experience in the field of systems neuroscience is highly desired.

The applicant should have experience in a research setting utilizing quantitative methods and statistics, and should demonstrate strong analytical skills; knowledge of novel and emerging analysis techniques is highly desirable. Forward thinking in the area of big data analytics/informatics, and applying it to contribute to the research groups’ scientific process, is expected.

The applicant should be skilled in analyzing multivariate datasets to reveal patterns and build models; conducting exploratory data analysis and communicating with team leads and team members; identifying improvements to existing data management and recommending requirements for new systems; and contributing to replicability by identifying potential data integrity issues.

The applicant should be able to utilize programming languages such as Python, Matlab, and/or C++.

Good command of English is mandatory.

Women are especially encouraged to apply. Applicants with disabilities will be given preferential consideration in cases of equal qualification.

We look forward to receiving your application by July 31st, 2018.

University Medical Center Göttingen
European Neuroscience Institute Göttingen
Dr. Caspar Schwiedrzik
Group Leader
Grisebachstr. 5
37077 Göttingen
Tel.: 0551/39-61371
E-Mail: cschwie3@gwdg.de
Web: http://www.eni-g.de/

Please send your application via e-mail only, as a single PDF file.


[visionlist] Fwd: About help on eye-tracking of head free non-human primates

Hi,
   I’m forwarding this mail on behalf of Dr Goossens.

Best,

Harish


Re: [visionlist] About help on eye-tracking of head free non-human primates

Hi Harish, 

I thought I could offer my thoughts, since I am working on head-free portable eye tracking. Note: I have never worked with monkeys, so I do not know whether they tolerate bodily attachments such as wristbands, headbands, etc.

A potential (and the cheapest, IMO) solution would be to retrofit a cricket helmet with a ratchet add-on and place IR-based cameras on the inside of the helmet. This will ensure that the monkey won’t scratch the eye-tracking cameras and that the helmet remains snug. You could fit an IMU on the inside of the helmet – something like this:

[Image from my presentation showing the setup; please ignore the stereo cameras in it. (Note: no self-promotion intended.)]

Now, Pupil Labs has excellent open-source software that provides very good pupil detection. You could either use their cameras or make your own: https://pupil-labs.com/store/

The best part about the Pupil hardware is that the design is very easy to work with, or to break open so the eye and scene cameras can be used separately. (Note: I am not promoting Pupil Labs; this is simply a suggestion for your project.) You can easily detach the scene and eye cameras, place them in the cricket helmet, and their software will do the rest.
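
For illustration, here is a minimal sketch of reading live gaze data from the Pupil software over its network API. It assumes the defaults documented by Pupil Labs (Pupil Remote answering on local port 50020, msgpack-encoded messages on topics starting with “gaze.”); check the docs for your software version:

    import zmq
    import msgpack

    ctx = zmq.Context()

    # Ask Pupil Remote (default port 50020) for the data-publisher port.
    remote = ctx.socket(zmq.REQ)
    remote.connect("tcp://127.0.0.1:50020")
    remote.send_string("SUB_PORT")
    sub_port = remote.recv_string()

    # Subscribe to all gaze datums.
    sub = ctx.socket(zmq.SUB)
    sub.connect(f"tcp://127.0.0.1:{sub_port}")
    sub.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

    while True:
        topic, payload = sub.recv_multipart()
        gaze = msgpack.loads(payload)
        # norm_pos is the gaze point in normalized scene-camera coordinates;
        # confidence is useful for rejecting frames where the pupil was lost.
        print(gaze["norm_pos"], gaze["confidence"])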

By extracting the head pose from the IMU and gaze from the eye tracker, you can compute a gaze-in-world (GIW) vector. However, it is in the nature of IMUs to drift, so the GIW vector would require occasional correction (every 10 minutes or so) – which brings me to my last point.
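
As a minimal sketch of that GIW computation – the frame conventions and names here are my own illustrative assumptions, not from any particular product:

    import numpy as np
    from scipy.spatial.transform import Rotation

    def gaze_in_world(head_quat_xyzw, gaze_dir_head):
        # head_quat_xyzw: IMU orientation (head frame -> world frame), [x, y, z, w]
        # gaze_dir_head:  unit gaze direction in the head/helmet frame
        r_head_to_world = Rotation.from_quat(head_quat_xyzw)
        giw = r_head_to_world.apply(gaze_dir_head)
        return giw / np.linalg.norm(giw)

    # Example: head yawed 30 degrees, eyes looking straight ahead (+z in head frame).
    head_q = Rotation.from_euler("y", 30, degrees=True).as_quat()
    print(gaze_in_world(head_q, np.array([0.0, 0.0, 1.0])))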

I’m assuming you’d want accurate gaze tracking on the screen (in pixel coordinates). You could easily display fiducial markers at the corners of the screen and find the mapping between screen coordinates and point-of-regard values in the scene camera. This mapping (a 3×3 homography matrix) can also be used to estimate head position in 3D space relative to the screen. Every time a monkey arrives near a screen, a program can identify these markers, estimate head position in 3D space, and automatically re-align the IMU – though this last part can be a little difficult to implement! It took me ages to get quaternions and poses working.
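
The scene-to-screen mapping itself is standard; here is a minimal OpenCV sketch, in which the marker and screen-corner coordinates are placeholder assumptions (marker detection, e.g. with ArUco, is omitted):

    import numpy as np
    import cv2

    # Centers of four fiducial markers detected in the scene-camera image.
    scene_pts = np.array([[102, 80], [538, 95], [525, 410], [110, 398]], dtype=np.float32)

    # The same four corners in screen pixel coordinates (e.g., a 1920x1080 display).
    screen_pts = np.array([[0, 0], [1919, 0], [1919, 1079], [0, 1079]], dtype=np.float32)

    # 3x3 homography mapping scene-camera pixels to screen pixels.
    H, _ = cv2.findHomography(scene_pts, screen_pts)

    # Map a point of regard from the eye tracker into screen coordinates.
    por = np.array([[[320.0, 240.0]]], dtype=np.float32)  # shape (1, 1, 2)
    print(cv2.perspectiveTransform(por, H).ravel())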

If you want body tracking as well, then a second IMU, a little below the nape, would be a good idea.

If you have the ₹ for it, you could also fit the monkeys in a tracking suit and use a motion capture system such as PhaseSpace http://phasespace.com/ – We have one in our lab and it works great!

I hope this helps.

Rakshit

P.S. – You could also create your own MoCap system – a fun project for an engineering graduate – using patterns of IR emitters and IR cameras around the room to triangulate the 3D position of a marker!
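
The core of that DIY idea is two-view triangulation. A minimal sketch, where the projection matrices and marker detections are placeholder assumptions (in practice both come from camera calibration):

    import numpy as np
    import cv2

    # 3x4 projection matrices (intrinsics x extrinsics); here two cameras
    # 20 cm apart along x, expressed in normalized image coordinates.
    P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = np.hstack([np.eye(3), np.array([[-0.2], [0.0], [0.0]])])

    # The same IR marker seen by each camera (undistorted, normalized
    # coordinates; one column per marker).
    pts1 = np.array([[0.10], [0.05]])
    pts2 = np.array([[0.00], [0.05]])

    # Linear triangulation; the result is homogeneous (4xN).
    X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
    print((X_h[:3] / X_h[3]).ravel())  # 3D marker position, here (0.2, 0.1, 2.0)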

P.P.S. – There are many open-source implementations of skeleton tracking using a 3D camera, but I don’t know whether they can be adapted for monkeys.

On Sat, Jul 14, 2018 at 12:14 PM Harish Katti wrote:

[Quoted message trimmed; see the original posting above.]




[visionlist] Postdoc position at Purdue University