Research Projects: Application areas / Data type
Our research projects are summarized in the table below, organized by target application domain (columns) and by the type of source image to be processed (rows).
Interdisciplinary Research
Our application areas range from deep-sky exploration to Earth observation. We aim at a better understanding of space imaging in order to provide new tools to specialists in these fields.
- Astronomy (star fields, galaxies, nebulae) → image enhancement and improved visualisation (e.g. denoising, color coding), data fusion for easier data manipulation (e.g. channel fusion, multi-source fusion, data reduction), automatic analysis and physics-based interpretation (e.g. segmentation, classification).
- Planetary Imaging (Solar System planets, satellites, small bodies) → image enhancement (e.g. deblurring, PSF estimation), model-based data fusion (e.g. 3D surface reconstruction, super-resolution), reflectance estimation and analysis (e.g. terrain classification).
- Remote Sensing (Earth observation at various resolutions) → shadow and vegetation segmentation in urban landscapes; formerly also: image enhancement (e.g. blind deconvolution), model-based data fusion (e.g. topography reconstruction, super-resolution, data reduction), bidirectional reflectance estimation and analysis (e.g. segmentation).
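As a minimal illustration of the image-enhancement tasks mentioned above (a sketch for intuition, not one of our actual pipelines), the simplest form of denoising averages each pixel with its neighbours. The function name and the synthetic test image are assumptions for the example; only `numpy` is required.

```python
import numpy as np

def mean_filter(image, k=3):
    """Denoise a 2-D image with a simple k x k mean filter.

    Edge pixels are handled by replicating the border ("edge" padding).
    This is an illustrative baseline, not a state-of-the-art denoiser.
    """
    pad = k // 2
    padded = np.pad(image, pad, mode="edge")
    out = np.zeros(image.shape, dtype=float)
    # Sum the k*k shifted copies of the image, then normalize.
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
    return out / (k * k)

# Synthetic example: a flat image corrupted by Gaussian noise.
rng = np.random.default_rng(0)
noisy = np.ones((16, 16)) + 0.1 * rng.standard_normal((16, 16))
denoised = mean_filter(noisy)
print(denoised.shape)  # (16, 16)
```

Averaging trades noise reduction against blur; the model-based methods listed above (e.g. blind deconvolution, PSF estimation) aim precisely at recovering the detail such simple filters lose.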
Analyzing Various Data Types & Modalities
Depending on the application, various types of input images are used, and the processing methods must adapt accordingly. We separate the methods into three main classes based on spectral depth (one, several, or many bands). Each class has two subclasses, depending on whether a single image or multiple images are required.
- Image sources: (single image vs. multiple images) → One input image is used in classical techniques (e.g. deblurring, segmentation). However, some tasks require at least two images (e.g. surface recovery), and others require even more (e.g. bidirectional reflectance inference). In some cases the task itself is dictated by the large number of available sources (e.g. data fusion). Modalities other than image-like data can also be taken into account (e.g. existing elevation models, radar measurements).
- Spectral depth: (single band vs. multi- or hyperspectral) → Single-band images are the simplest to handle, whereas multispectral images require accounting for band interactions and the increased dimensionality of the problem (the Hughes phenomenon). In addition, hyperspectral data usually require dimensionality reduction prior to further processing and analysis.
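The dimensionality reduction mentioned for hyperspectral data can be sketched with a principal component analysis via SVD. The cube shape (32 x 32 pixels, 100 bands) and the number of retained components are arbitrary assumptions for the example; PCA is one common choice among several reduction techniques, not necessarily the one used in any given project.

```python
import numpy as np

# Hypothetical hyperspectral cube: 32 x 32 pixels, each with a 100-band spectrum.
rng = np.random.default_rng(1)
cube = rng.random((32, 32, 100))

# Flatten to (pixels, bands) and center each band.
X = cube.reshape(-1, cube.shape[-1])
X = X - X.mean(axis=0)

# SVD of the centered data: rows of Vt are the principal spectral directions.
U, S, Vt = np.linalg.svd(X, full_matrices=False)

# Keep the k components carrying the most variance.
k = 5
reduced = X @ Vt[:k].T  # (1024, 5): each pixel described by 5 components
print(reduced.shape)
```

Reducing 100 correlated bands to a handful of components mitigates the Hughes phenomenon: classifiers see far fewer, largely decorrelated features, so less training data is needed per dimension.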