Wang T Y, Su H, Huang Q, et al. Unsupervised texture transfer from images to model collections[J]. ACM Transactions on Graphics (TOG), 2016, 35(6): 177.

Abstract:

Large 3D model repositories of common objects are now ubiquitous and are increasingly being used in computer graphics and computer vision for both analysis and synthesis tasks. However, images of objects in the real world have a richness of appearance that these repositories do not capture, largely because most existing 3D models are untextured. In this work we develop an automated pipeline capable of transporting texture information from images of real objects to 3D models of similar objects. This is a challenging problem, as an object's texture as seen in a photograph is distorted by many factors, including pose, geometry, and illumination. These geometric and photometric distortions must be undone in order to transfer the pure underlying texture to a new object --- the 3D model. Instead of using problematic dense correspondences, we factorize the problem into the reconstruction of a set of base textures (materials) and an illumination model for the object in the image. By exploiting the geometry of the similar 3D model, we reconstruct certain reliable texture regions and correct for the illumination, from which a full texture map can be recovered and applied to the model. Our method allows for large-scale unsupervised production of richly textured 3D models directly from image data, providing high quality virtual objects for 3D scene design or photo editing applications, as well as a wealth of data for training machine learning algorithms for various inference tasks in graphics and vision.
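
The core idea in the abstract, undoing photometric distortion by factoring an observed region into an underlying texture (albedo) and an illumination component, can be illustrated with a toy Lambertian model. The sketch below is not the paper's actual pipeline: it assumes a simple multiplicative image-formation model I = R · S, a pixel-aligned normal map rendered from the similar proxy 3D model, and a single known light direction; all function names (e.g. `recover_texture`) are hypothetical.

```python
import numpy as np

def lambertian_shading(normals, light_dir, ambient=0.1):
    """Per-pixel Lambertian shading from a normal map (H, W, 3)
    rendered off the aligned proxy 3D model. `light_dir` is a
    3-vector; `ambient` is a simple additive ambient term."""
    light_dir = np.asarray(light_dir, dtype=float)
    light_dir /= np.linalg.norm(light_dir)
    # max(n . l, 0) + ambient, giving an (H, W) shading map
    return np.clip(normals @ light_dir, 0.0, None) + ambient

def recover_texture(image, shading, eps=1e-3):
    """Undo photometric distortion under I = R * S, i.e. R = I / S.
    Returns the recovered albedo plus a mask of reliable pixels;
    pixels with near-zero shading make the division unstable and
    are left out (analogous to keeping only reliable regions)."""
    reliable = shading > eps
    albedo = np.zeros_like(image, dtype=float)
    albedo[reliable] = image[reliable] / shading[reliable][:, None]
    return albedo, reliable

# Usage sketch: `image` (H, W, 3, in [0, 1]) and the normal map must
# already be pixel-aligned, i.e. pose estimation has been done.
# normals = render_normals(model, pose)   # hypothetical renderer
# shading = lambertian_shading(normals, light_dir=[0.3, 0.5, 0.8])
# albedo, mask = recover_texture(image, shading)
```

The actual method is considerably richer (it reconstructs a set of base materials and an illumination model jointly, then propagates texture from reliable regions to a full texture map), but the divide-out-the-shading step above conveys why geometric and photometric distortions must be removed before the texture can be transferred to a new object.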

BibTeX:

@article{wang2016unsupervised,
  title={Unsupervised texture transfer from images to model collections},
  author={Wang, Tuanfeng Y and Su, Hao and Huang, Qixing and Huang, Jingwei and Guibas, Leonidas and Mitra, Niloy J},
  journal={ACM Transactions on Graphics (TOG)},
  volume={35},
  number={6},
  pages={177},
  year={2016}
}