03/09/2018
Doctoral thesis
Navigation aid for the visually impaired:
Virtual Reality acoustic simulations for interior navigation preparation

Laboratories IJLRA (Institut Jean le Rond d’Alembert, UMR 7190 CNRS – Sorbonne Université) and IRCAM (Institut de Recherche et Coordination Acoustique/Musique, UMR 9912 STMS IRCAM – CNRS – Sorbonne Université)
Doctoral school  École Doctorale Sciences Mécaniques, Acoustique, Électronique et Robotique (SMAER): ED 391
Discipline  Acoustics (Virtual Reality, Audio, Interaction, Disability Assistance)
Co-supervision  Brian KATZ (DR-CNRS, IJLRA) and Markus NOISTERNIG (CR, IRCAM)
Keywords  Virtual reality, 3D audio, spatial sound, spatial cognition, room acoustics, visual impairments, navigation aid

Research context  This thesis project is set in the context of the ANR 2018-2021 project RASPUTIN (Room Acoustic Simulations for Perceptually Realistic Uses in Real-Time Immersive and Navigation Experiences). In the domains of sound synthesis and virtual reality (VR), much effort has been placed on the quality and realism of sound source renderings, from text-to-speech to musical instruments to engine noise for driving and flight simulators. The same degree of effort has not gone into the spatial aspects of sound synthesis and virtual reality, particularly the acoustics of the surrounding environment. Room acoustic simulation algorithms have for decades been improving in their ability to predict acoustic metrics, such as reverberation time, from geometrical acoustic models, at the cost of ever higher computational requirements. Only recently, however, has the perceptual quality of these simulations been explored beyond musical applications. In real-time systems, where the sound source, listener, and room architecture can vary in unpredictable ways, investigation of perceptual quality or realism has been hindered by the simplifications these algorithms require. This project aims to improve real-time simulation quality towards perceptual realism.
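As a concrete point of reference for the kind of metric mentioned above, and purely as an illustration rather than part of the project, Sabine's classical formula estimates a room's reverberation time (RT60) from its volume and the absorption of its surfaces. The short Python sketch below applies it to a hypothetical room; the dimensions and absorption coefficients are assumed for the example.

def sabine_rt60(volume_m3, surfaces):
    """Sabine's estimate of reverberation time: RT60 = 0.161 * V / A,
    where A is the total absorption area, i.e. the sum over surfaces of
    (surface area in m^2) * (absorption coefficient)."""
    absorption_area = sum(area * alpha for area, alpha in surfaces)
    return 0.161 * volume_m3 / absorption_area

# Hypothetical 10 m x 8 m x 4 m exhibition room (all values assumed for illustration).
surfaces = [
    (2 * (10 * 4) + 2 * (8 * 4), 0.05),  # walls, lightly absorbing plaster
    (10 * 8, 0.04),                      # ceiling
    (10 * 8, 0.30),                      # carpeted floor
]
print(f"Estimated RT60: {sabine_rt60(10 * 8 * 4, surfaces):.2f} s")  # about 1.5 s

A real-time geometrical-acoustics engine of the kind targeted here goes well beyond such a single-number estimate, but the formula illustrates how room geometry and surface materials together determine the acoustic response a listener perceives.
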
The focus of the project is the capability of a real-time acoustic simulation to provide meaningful information to a visually impaired user through virtual reality exploration. Used as a preparatory tool prior to visiting a public building or museum, the virtual exploration should improve the user's knowledge of the space and their navigation confidence during the on-site visit, as compared to traditional methods such as tactile maps.
The thesis work entails participating in the creation and evaluation of a training-system application for visually impaired individuals. Tasks involve the development of an experimental prototype, in collaboration with project partners, with a simplified user interface for constructing the virtual environments to be explored. Working in conjunction with a selected panel of users, who will remain engaged for the duration of the project, the student will identify several test cases of interest for integration into the prototype and subsequent evaluations. The prototype will be developed by the thesis student in collaboration with Novelab (audio gaming) and IRCAM/STMS-CNRS (developers of the audio rendering engine). Design and evaluation will be carried out in collaboration with the Centre de Psychiatrie et Neurosciences and StreetLab/Institut de la Vision. The ability to communicate in French would be beneficial, but is not mandatory at the start of the project.
Evaluations will involve different experimental protocols to assess the accuracy of the mental representation of the learned environments. To assess how well metric spatial relations are preserved, participants will carry out experimental spatial memory tests as well as on-site navigation tasks.

Candidate profile  We are looking for dynamic, creative, and motivated candidates with scientific curiosity, strong problem-solving skills, the ability to work both independently and in a team, and the desire to push the limits of their knowledge and confidence into new domains. The candidate should have a Master's degree in Computer Science, Acoustics, Architectural Acoustics, Multimodal Interfaces, or Audio Signal Processing. A strong interest in spatial audio, room acoustics, and working with the visually impaired is necessary. Candidates are not expected to already have all the skills required by this multidisciplinary subject, so a willingness and ability to move quickly into new domains, including spatial cognition and psychoacoustics, will be appreciated.

Domain  Virtual reality, Audio, Interaction, Disability Assistance

Dates  Preferred starting date between 1-Nov-2018 and 20-Dec-2018, and no later than March 2019.

Application  Interested candidates should send a CV, a transcript of their Master's degree courses, a cover letter (maximum 2 pages) detailing their motivations for pursuing a PhD in general and for the project described above in particular, and contact information for two references whom the selection committee may contact. Incomplete applications will not be processed.

Application deadline  Complete application files should be submitted to brian.katz@sorbonne-universite.fr and markus.noisternig@ircam.fr before 1-Oct-2018.
 
