Idil Kanpolat

Semester Work
Supervisors: Eirikur Agustsson, Dr. Radu Timofte

Attribute-based Image Generation with GANs and Classifiers

Generative adversarial networks (GANs) were introduced as a successful way of training generative models. GANs generate high-quality images with satisfying diversity. Moreover, control over the generated images can be achieved with conditional GANs (cGANs), where the conditioning information describes the desired features of the output image, such as class labels or attributes. For cGANs, controlling the generated images successfully while maintaining diversity and image quality is a difficult task, and increasing the number of control parameters is a further challenge. Models combining Variational Autoencoders (VAEs) and GANs give good results in terms of image quality and diversity when conditioned on many attributes. However, a model not based on a VAE that can successfully control a large number of attributes is yet to be achieved. In this report, we propose a variant of GANs capable of generating realistic and diverse samples with respect to dozens of attributes. As a new approach, we introduce the side information to the generator at each hidden layer, and we use pre-trained classifiers to optimize the generator. We are able to generate faces with attribute configurations that do not appear in the dataset.
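The following is a minimal sketch, not the report's actual architecture, of the two ideas stated above: re-injecting the attribute vector at every hidden layer of the generator, and adding a loss term from a frozen pre-trained attribute classifier to the generator's adversarial objective. The framework (PyTorch), layer sizes, and the name `lambda_cls` are assumptions chosen for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttributeConditionedGenerator(nn.Module):
    """Toy fully connected generator; each hidden layer also sees the attributes."""

    def __init__(self, noise_dim=100, attr_dim=40, hidden_dim=256, out_dim=64 * 64 * 3):
        super().__init__()
        # The attribute vector is concatenated to the input of every layer,
        # not only to the initial noise vector.
        self.fc1 = nn.Linear(noise_dim + attr_dim, hidden_dim)
        self.fc2 = nn.Linear(hidden_dim + attr_dim, hidden_dim)
        self.fc3 = nn.Linear(hidden_dim + attr_dim, out_dim)

    def forward(self, z, attrs):
        h = F.relu(self.fc1(torch.cat([z, attrs], dim=1)))
        h = F.relu(self.fc2(torch.cat([h, attrs], dim=1)))
        return torch.tanh(self.fc3(torch.cat([h, attrs], dim=1)))


def generator_loss(d_fake_logits, clf_logits, target_attrs, lambda_cls=1.0):
    """Adversarial term plus an attribute term from a frozen, pre-trained classifier.

    d_fake_logits: discriminator logits for generated images.
    clf_logits:    pre-trained attribute classifier logits for generated images.
    target_attrs:  binary attribute vector the generator was conditioned on.
    """
    adv = F.binary_cross_entropy_with_logits(
        d_fake_logits, torch.ones_like(d_fake_logits)
    )
    cls = F.binary_cross_entropy_with_logits(clf_logits, target_attrs)
    return adv + lambda_cls * cls
```

Because the classifier is pre-trained and kept frozen, only the generator receives gradients from the attribute term, which is what lets the conditioning scale to dozens of attributes without retraining the discriminator for each one.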