
Trappolini, G., Cosmo, L., Moschella, L., Marin, R., Melzi, S., Rodola, E. (2021). Shape registration in the time of transformers. In 35th Conference on Neural Information Processing Systems, NeurIPS 2021 (pp.5731-5744). Neural information processing systems foundation.

Shape registration in the time of transformers

Trappolini G.; Cosmo L.; Moschella L.; Marin R.; Melzi S.; Rodola E.
2021

Abstract

In this paper, we propose a transformer-based procedure for the efficient registration of non-rigid 3D point clouds. The proposed approach is data-driven and is the first to adopt the transformer architecture for the registration task. Our method is general and applies to different settings. Given a fixed template with some desired properties (e.g., skinning weights or other animation cues), we can register raw acquired data to it, thereby transferring all the template properties to the input geometry. Alternatively, given a pair of shapes, our method can register the first onto the second (or vice versa), obtaining a high-quality dense correspondence between the two. In both contexts, the quality of our results enables us to target real applications such as texture transfer and shape interpolation. Furthermore, we show that including an estimate of the underlying surface density eases the learning process. By exploiting the potential of this architecture, we can train our model using only a sparse set of ground-truth correspondences (10–20% of the total points). The proposed model and our analysis pave the way for future exploration of transformer-based architectures for registration and matching applications. Qualitative and quantitative evaluations demonstrate that our pipeline outperforms state-of-the-art methods for deformable and unordered 3D data registration on different datasets and scenarios.
paper
Scientific
3D point cloud; Transformers; Data-driven; Non-rigid shape registration;
English
35th Conference on Neural Information Processing Systems, NeurIPS 2021 - 6 December 2021 through 14 December 2021
ISBN: 9781713845393
Trappolini, G; Cosmo, L; Moschella, L; Marin, R; Melzi, S; Rodola, E
Files in this item:
shape_registration_in_the_time (2).pdf
Description: Conference Paper
Attachment type: Author's Accepted Manuscript, AAM (Post-print)
Size: 410.92 kB
Format: Adobe PDF
Access: archive administrators only (a copy can be requested)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: http://hdl.handle.net/10281/389391
Citations
  • Scopus: 0
  • Web of Science (ISI): not available