Immersive Multimedia Technologies I
Course teacher(s)
Gauthier LAFRUIT (Coordinator)
ECTS credits
5
Language(s) of instruction
English
Course content
Immersive 3D video and audio rendering technologies described in the Google Starline paper [1] for holographic video conferencing, as well as the visual volumetric rendering recently published in the MPEG-I standards (Moving Picture Experts Group – Immersive), will be revisited. This includes multi-camera calibration, stereo matching, point cloud alignment by Iterative Closest Point (ICP), and Structure-from-Motion (SfM) for 3D reconstruction.
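To give a flavour of the point cloud alignment topic above, here is a minimal ICP sketch in Python (the language allowed for the exercises). It is an illustrative toy, not course material: nearest neighbours are found by brute force and the rigid transform is re-estimated each iteration with the SVD-based (Kabsch) least-squares fit; all function names are our own.

```python
import numpy as np

def best_fit_transform(A, B):
    """Least-squares rigid transform (R, t) mapping point set A onto B (SVD/Kabsch)."""
    cA, cB = A.mean(axis=0), B.mean(axis=0)
    H = (A - cA).T @ (B - cB)              # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cB - R @ cA
    return R, t

def icp(src, dst, iters=20):
    """Iteratively match each src point to its nearest dst point, then re-fit."""
    cur = src.copy()
    for _ in range(iters):
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        nn = dst[d.argmin(axis=1)]         # brute-force nearest neighbours
        R, t = best_fit_transform(cur, nn)
        cur = cur @ R.T + t                # apply the incremental transform
    return best_fit_transform(src, cur)    # overall transform src -> aligned
```

Real pipelines accelerate the nearest-neighbour search with a k-d tree and add outlier rejection; the structure of the loop, however, is exactly this.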
The proposed multi-camera capturing approaches are also used in visual special effects studios, in recent automotive driving assistance systems that give a complete picture of the environment surrounding the car, and in immersive cinema theaters providing a 360-degree view of the movie. Extensions towards 3D light field displays (auto-stereoscopy, holography) will also be studied.
This course presents techniques that are predominantly image-based (e.g. giving a 3D perspective illusion by blending 2D images, extracting depth from stereo matching, etc.) and is therefore complementary to INFO-H502, which focuses on 3D graphics modeling and its OpenGL rendering. Both approaches can be used in VR applications like the Metaverse, but the former is preferred for free navigation in movies.
Objectives (and/or specific learning outcomes)
Understand volumetric rendering techniques based on images (and video).
Prerequisites and Corequisites
Required and Corequired knowledge and skills
C/C++ programming skills are recommended, though the exercises can be done entirely in Python (note, however, that the case study examples will mostly be in C, often without object-oriented concepts).
Teaching methods and learning activities
Theory lessons will explain the main ingredients of immersive 3D rendering technologies, complemented by exercises (mini-projects) for mastering their basics.
References, bibliography, and recommended reading
[1] Jason Lawrence et al., “Project Starline: A high-fidelity telepresence system,” ACM Trans. Graph., Vol. 40, No. 6, Article 242, December 2021, https://doi.org/10.1145/3478513.3480490
[2] G. Lafruit, M. Teratani, “Virtual Reality and Light Field Immersive Video Technologies for Real-World Applications,” Institution of Engineering & Technology, ISBN 978-1785615788, 2022.
Course notes
- Syllabus
 - Université virtuelle
 
Other information
Additional information
Slides and notes (close to a complete syllabus) inspired by [1,2] and their bibliographic references, as well as video clips for some subjects.
Contacts
Prof. Gauthier Lafruit, LISA-VR
Campus
Solbosch
Evaluation
Method(s) of evaluation
- Oral examination
 
Oral examination
A report on the exercises must be submitted before the exam period.
The oral examination covers one theory question, as well as punctual questions on the exercises report.
This is not an open-book exam, though a single A4 page of formulae is allowed (besides a copy of [1]).
Mark calculation method (including weighting of intermediary marks)
50% on the theory question, 50% on the exercise questions (and report).
Language(s) of evaluation
- English
 - (if applicable: French, Dutch)