Berk Kaya

Master Thesis
Supervisor: Dr. Radu Timofte

Learning 3D Shape Reconstruction and Image Generation from Unpaired Data

Understanding the relation between 2D images and 3D objects is a challenging problem because it requires exploiting knowledge of shape priors. Existing methods usually depend on supervision in which 3D data is annotated and paired with RGB images. However, shape annotations are expensive and difficult to obtain, and for this reason, many methods rely on synthetic 3D models. With recent progress in generative models and shape modeling methods, unsupervised approaches have attracted more attention. In this thesis, we present a framework that fulfills three major tasks: (i) reconstructing the 3D shape from a single image; (ii) learning disentangled representations for shape, appearance and viewpoint; and (iii) generating a realistic RGB image from these independent factors. In contrast to existing approaches, our method does not require annotated data, i.e., image-shape pairs, for training. Instead, it uses unpaired image and shape datasets of the same object class. Experimental results show that our model achieves promising performance on single-view shape reconstruction and image generation. It also enables operations such as novel view synthesis, shape and texture editing, shape and texture transfer, and viewpoint estimation.