[visionlist] Postdoctoral Position in Theoretical and Computational Neuroscience at Yale


[visionlist] Deadline Extension: Second International Workshop on Context Based Affect Recognition CBAR13

***** Deadline for paper submission postponed to May 10, 2013 *****

***************************************************************************************
CBAR 2013: CALL FOR PAPERS
ACII 2013, 2nd International Workshop on Context Based Affect Recognition (CBAR 2013)
http://cbar2013.blogspot.com/
Submission Deadline: May 10, 2013
***************************************************************************************

The second international workshop on “Context Based Affect Recognition” (CBAR13, http://cbar2013.blogspot.com/) will be held in conjunction with the 2013 Affective Computing and Intelligent Interaction conference (ACII 2013), 2-5 September 2013, Geneva, Switzerland (http://www.acii2013.org/).

For details concerning the workshop program, paper submission guidelines, etc., please visit our workshop website at http://cbar2013.blogspot.com/

Best regards,
Zakia Hammal

Zakia Hammal, PhD
The Robotics Institute, Carnegie Mellon University
http://www.ri.cmu.edu/

Human-Machine Interaction
Facial Expression Recognition
Visual Perception
http://www.pitt.edu/~emotion/ZakiaHammal.html


[visionlist] (faculty) Biostatistics Research Faculty position at University of Houston


[visionlist] Postdoctoral position in Bayesian modeling for eye-writing at LPNC, Grenoble, France

Postdoctoral position: Bayesian modeling of on-line character recognition in an eye-writing software application

Candidates are invited to apply for a 12-month postdoctoral position (to start in September or October 2013) to study and develop a Bayesian computational model for on-line character recognition. Research will take place in the Laboratory of Psychology and NeuroCognition (LPNC; CNRS and Grenoble University) in Grenoble, France, under the supervision of Dr. Julien Diard.

The context of this project is the computational study of “eye writing” and the development of character-recognition software. We use a novel apparatus, based on a particular visual display and the reverse-phi illusory motion, which enables users to generate smooth-pursuit eye movements at will and in the direction of their choice. Coupled with eye tracking, the system allows participants to “write” cursive letters with their eyes. The application of the system to motor-impaired patients is central to the project.
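
As a rough illustration only, and not the project's actual pipeline, spurious saccades can be separated from the smooth-pursuit “writing” trace with a simple velocity threshold applied to the raw gaze samples. Every function name and threshold below is an assumption made for this sketch:

    # Hypothetical sketch: cleaning an eye-tracker trace before recognition.
    # The threshold and units are illustrative assumptions only.
    import numpy as np

    def remove_saccades(t, x, y, velocity_threshold=100.0):
        """Keep samples whose point-to-point gaze velocity (in the tracker's
        units per second) stays below `velocity_threshold`, so that fast
        saccadic jumps are dropped and the smooth-pursuit trace remains."""
        t, x, y = map(np.asarray, (t, x, y))
        dt = np.diff(t)
        speed = np.hypot(np.diff(x), np.diff(y)) / np.maximum(dt, 1e-9)
        keep = np.concatenate(([True], speed < velocity_threshold))
        return t[keep], x[keep], y[keep]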

Our objectives are twofold. The first is to adapt a previous model of cursive character recognition and production (Gilet et al., 2011) to the task of eye writing (Lorenceau, 2012). The main goal here is to provide rapid character recognition in an on-line manner, that is, while the character is being traced, and robust to the signal characteristics specific to eye writing (e.g., spurious saccades). The second objective is to expand and adapt the model, for instance toward disability assessment and toward code convergence for easier man-machine interaction (i.e., adapting the system’s vocabulary to symbols that are convenient for the patient to produce).
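
To make the on-line recognition idea concrete, here is a minimal, hypothetical Python sketch of a sequential Bayesian update over letter hypotheses as trace features arrive. It is not the Gilet et al. (2011) model; the feature choice (local stroke angle), the likelihood form, and all names below are illustrative assumptions:

    # Minimal sketch of on-line (incremental) Bayesian letter recognition.
    # Each letter class is assumed to provide a likelihood for the newest
    # trace feature; the posterior over letters is updated as the trace unfolds.
    import numpy as np

    def update_posterior(log_posterior, observation, likelihoods):
        """One recognition step.
        log_posterior : dict letter -> current log P(letter | trace so far)
        observation   : newest feature (e.g., local stroke angle)
        likelihoods   : dict letter -> function giving P(observation | letter)
        """
        for letter in log_posterior:
            log_posterior[letter] += np.log(likelihoods[letter](observation) + 1e-12)
        # Renormalize in log space for numerical stability.
        m = max(log_posterior.values())
        z = np.log(sum(np.exp(v - m) for v in log_posterior.values())) + m
        return {k: v - z for k, v in log_posterior.items()}

    # Usage: start from a uniform prior and feed stroke angles one by one;
    # a character can be reported as soon as one letter dominates.
    letters = ["a", "b", "c"]
    likelihoods = {
        k: (lambda obs, mu=mu: np.exp(-0.5 * ((obs - mu) / 0.5) ** 2))
        for k, mu in zip(letters, [0.0, 1.0, 2.0])  # toy angle templates
    }
    posterior = {k: np.log(1.0 / len(letters)) for k in letters}
    for angle in [0.1, -0.2, 0.05]:  # incoming stroke directions
        posterior = update_posterior(posterior, angle, likelihoods)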

Applicants must have recently obtained a PhD in computational cognitive modeling, artificial intelligence, signal processing, or a closely related field. The PhD must have been obtained no more than 24 months before the starting date of the postdoc (e.g., to start the postdoc on September 1st, 2013, you must have defended your PhD after September 1st, 2011). The applicant must not have previously held any position at the LPNC. The postdoc will be supported by the EOL (Eye On-Line) project, funded by the French National Research Agency (ANR), in collaboration with Dr. Jean Lorenceau. Gross salary will be 2,500 € per month.

Required skills include software development, signal processing, and probabilistic (Bayesian) modeling. The ability to communicate scientific ideas both orally and in writing is essential, and an interest in cognitive science and experimental psychology is desirable. High-quality publications will be expected during the postdoc. Support with administrative procedures is provided to international candidates by the ISSO of Grenoble University.

Please send your application to Dr. Julien Diard. Only applications received before 12:00 midday on June 14, 2013 can be considered (with a tentative decision date of June 28). You will be required to provide a cover letter, a CV including a publication list, and the names and contact information of two references with a brief description of your relationship to each reference. Applications may be written in French or English. The working language will likewise be either French or English. Informal enquiries before submitting a full application are welcome.

Contact Person: Dr. Julien Diard
Contact Phone: (+33) 4 76 82 58 93
Closing Date: 14 June 2013
Contact Email: julien.diard@upmf-grenoble.fr

Online resources for more information:
– Julien Diard’s website: http://diard.wordpress.com
– LPNC website: http://web.upmf-grenoble.fr/LPNC/
– More about the EOL project, from the general press:
http://health.usnews.com/health-news/news/articles/2012/07/26/writing-using-the-eyes-might-help-paralyzed-communicate
http://www.sciencedaily.com/releases/2012/07/120726122112.htm

References
Gilet, E., Diard, J. and Bessière, P. (2011) Bayesian action–perception computational model: Interaction of production and recognition of cursive letters. PLoS ONE, 6(6):e20387. http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0020387
Lorenceau, J. (2012) Cursive writing with smooth pursuit eye movements. Current Biology, 22:1506–1509. http://www.sciencedirect.com/science/article/pii/S0960982212006719


Re: [visionlist] #Question: 3D point cloud HDR

Dear Olivier,

I could offer some professional 3D scans with color and reflectance.
Please have a look at our youtube channel to see some data sets, e.g.,
http://youtu.be/CLwXeo2m83Y (http://www.youtube.com/AutomationAtJacobs)
A few data sets are available via the robotic 3D scan repository
(http://kos.informatik.uni-osnabrueck.de/3Dscans/)

Please drop me a note if you are interested.

Very best,
Andreas


[visionlist] 2013 VSS Meeting Information


[visionlist] Research Assistant Position in the McDermott Lab at MIT

Dear colleagues,

I am hoping to hire an RA in the near future. I’d be grateful if you could bring this to the attention of any promising undergraduates you may know who are about to graduate.

Thanks,
Josh

POSITION OPENING: Technical Assistant in the McDermott Lab, Department of Brain and Cognitive Sciences, MIT, to assist with all aspects of research on human audition.

RESPONSIBILITIES: Designing, programming, and conducting behavioral experiments; analyzing data; participant outreach and recruitment; implementation and maintenance of analysis software and computational models; technical support for lab personnel; participation in reading groups and research seminars; and some basic administrative duties including ordering equipment, tracking supplies, etc. The assistant will be encouraged to take an active role in scientific research. The position is ideal for individuals considering future graduate study in cognitive science or neuroscience.

REQUIREMENTS: A bachelor’s degree in cognitive science, neuroscience, computer science, engineering, physics, or math; strong math, statistics, and computer skills (e.g., MATLAB, Python, shell scripting); substantial programming experience, including experience using HTML/CSS/JavaScript and implementing simple web sites; Macintosh and Windows troubleshooting skills and comfort in a Unix environment; good people skills; and evidence of serious interest in a career in cognitive science or neuroscience. We seek an organized, self-motivated, independent, and reliable individual who is able to multitask efficiently in a fast-paced environment. The candidate must be able to work as part of a team. Research experience in cognitive science or neuroscience would be helpful, especially experience conducting behavioral experiments with human subjects.

To apply, please follow the instructions at this link: http://sh.webhire.com/servlet/av/jd?ai=631&ji=2681090&sn=I