Introduction

We present a multi-sensor dataset for multi-view 3D surface reconstruction. It includes registered RGB and depth data from 7 sensors of different resolutions and modalities: smartphones, Intel RealSense, Microsoft Kinect, industrial cameras, and a structured-light scanner. The scenes are selected to emphasize a diverse set of material properties that are challenging for existing algorithms, such as featureless, highly specular with sharp reflections, or translucent surfaces. We provide around 1.4 million images of 107 different scenes acquired from 100 viewing directions under 14 lighting conditions. We expect our dataset to be useful for the evaluation and training of 3D reconstruction algorithms and for related tasks.

Changelog

2023 Dec 30

  • Added a script for generating download links.

2023 Aug 16

  • Initial release of the documentation.
  • Updated download instructions.
  • Added pre-rendered SL depth maps.
  • Added camera parameters in MVSNet and IDR formats.

2023 Mar 24

  • Initial release of the dataset.

Publication

Paper (6 MB) | Supplementary text (2 MB) | Supplementary results (101 MB) | arXiv | Publisher version

If you use the Skoltech3D dataset, please cite our work:

@InProceedings{voynov2022multi,
    title     = {Multi-sensor large-scale dataset for multi-view 3D surface reconstruction},
    author    = {Voynov, Oleg and Bobrovskikh, Gleb and Karpyshev, Pavel and Galochkin, Saveliy and Ardelean, Andrei-Timotei and Bozhenko, Arseniy and Karmanova, Ekaterina and Kopanev, Pavel and Labutin-Rymsho, Yaroslav and Rakhimov, Ruslan and Safin, Aleksandr and Serpiva, Valerii and Artemov, Alexey and Burnaev, Evgeny and Tsetserukou, Dzmitry and Zorin, Denis},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2023},
}

License

The Skoltech3D dataset is provided free of charge for non-commercial use under the Creative Commons Attribution-NonCommercial 4.0 International License.

Getting started

See the description of the dataset in the documentation and follow the download instructions to obtain the data.
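
As a minimal illustration of working with the downloaded data, the sketch below loads one registered RGB image and one depth map with OpenCV. The file paths, directory layout, scene and sensor names, and the depth encoding (16-bit PNG in millimeters) are assumptions made for this example only; the actual structure and units are described in the documentation.

```python
import cv2
import numpy as np

# Hypothetical paths: the actual directory layout and file naming
# are defined in the dataset documentation and may differ.
rgb_path = "skoltech3d/some_scene/tis_right/rgb/0000.png"
depth_path = "skoltech3d/some_scene/stl/depth/0000.png"

# Load the RGB image (OpenCV returns it in BGR channel order).
rgb = cv2.imread(rgb_path, cv2.IMREAD_COLOR)

# Load the depth map without conversion; depth is commonly stored as 16-bit PNG.
depth_raw = cv2.imread(depth_path, cv2.IMREAD_UNCHANGED)

# Assumed unit: millimeters, with 0 marking missing measurements.
# Check the documentation for the actual encoding.
depth_m = depth_raw.astype(np.float32) / 1000.0

valid = depth_m > 0
print("RGB shape:", rgb.shape, "depth shape:", depth_m.shape)
print("valid depth range, m:", depth_m[valid].min(), "-", depth_m[valid].max())
```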