Title: High-resolution generative adversarial neural networks applied to histological images generation
Authors: Diaz Rosado, Jose Carlos
Keywords: Deep learning; Diagnosis; Medical imaging; Neural networks; Diagnostic algorithms; Generative Adversarial Nets; High resolution; Histological images; Learning-based methods; Photo-realistic image synthesis; Photorealistic images; Statistical correlation; Image analysis
Abstract: For many years, synthesizing photo-realistic images has been a highly relevant task due to its multiple applications, from aesthetic or artistic to medical purposes [1, 6, 21]. In the medical area, this application has had a greater impact because most classification or diagnostic algorithms require a significant amount of highly specialized images for their training, yet obtaining them is far from easy. To solve this problem, many works analyze and interpret images of a specific topic in order to obtain a statistical correlation between the variables that define it. In this way, any set of variables close to the map generated in the previous analysis represents a similar image. Deep-learning-based methods have allowed the automatic extraction of feature maps, which has helped in the design of more robust models for photo-realistic image synthesis. This work focuses on obtaining the best feature maps for the automatic generation of synthetic histological images. To do so, we propose a Generative Adversarial Network (GAN) that generates the new sample distribution using the feature maps obtained by an autoencoder [14, 20] as the latent space instead of a completely random one. To corroborate our results, we compare the generated images against the real ones, reporting the respective results for different types of autoencoder used to obtain the feature maps. © Springer Nature Switzerland AG 2018.
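The abstract's central idea, feeding a GAN generator with an autoencoder's feature maps rather than random noise, can be sketched as a forward pass. This is a minimal illustration, not the authors' implementation: the layer sizes, single-linear-layer networks, and untrained random weights are all assumptions made for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear(x, w, b):
    return x @ w + b

# Hypothetical sizes: 64x64 grayscale patches flattened to 4096, latent dim 128.
IMG, LATENT = 64 * 64, 128

# Autoencoder encoder (randomly initialised here; in the paper it would be
# trained to reconstruct real histological images first).
enc_w, enc_b = rng.normal(0, 0.01, (IMG, LATENT)), np.zeros(LATENT)

def encode(x):
    """Map an image batch to its feature-map / latent code."""
    return np.tanh(linear(x, enc_w, enc_b))

# GAN generator: consumes the autoencoder's structured latent code
# instead of sampling z from a completely random distribution.
gen_w, gen_b = rng.normal(0, 0.01, (LATENT, IMG)), np.zeros(IMG)

def generate(z):
    return np.tanh(linear(z, gen_w, gen_b))

# GAN discriminator: scores patches as real (→1) or generated (→0).
disc_w, disc_b = rng.normal(0, 0.01, (IMG, 1)), np.zeros(1)

def discriminate(x):
    return 1.0 / (1.0 + np.exp(-linear(x, disc_w, disc_b)))

batch = rng.normal(0, 1, (8, IMG))   # stand-in for real histology patches
z = encode(batch)                    # latent space supplied by the autoencoder
fake = generate(z)                   # synthetic patches
scores = discriminate(fake)          # adversarial feedback for training
print(z.shape, fake.shape, scores.shape)
```

In an actual training loop the discriminator loss would update both the generator and, depending on the variant of autoencoder used, the encoder, so that the learned latent space stays informative for synthesis.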
Appears in Collections: Artículos de investigación
Files in This Item: There are no files associated with this item.