Sliced Generative Models
Publication date: 2018
Schedae Informaticae, 2018, Volume 27, pp. 69-79
https://doi.org/10.4467/20838476SI.18.006.10411
In this paper we discuss a class of autoencoder-based generative models built on a one-dimensional sliced approach. The idea is to reduce the discrimination between samples to the one-dimensional case.
Our experiments show that the methods can be divided into two groups. The first consists of modifications of standard normality tests, while the second is based on classical distances between samples.
It turns out that both groups yield correct generative models, but the second one gives a slightly faster decrease of the Fréchet Inception Distance (FID).
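The sliced comparison can be illustrated with a short sketch: latent codes are projected onto random one-dimensional directions and, along each slice, compared with the projected prior either by a classical distance between samples or by a standard normality test. The NumPy/SciPy sketch below assumes a standard normal prior and uses the one-dimensional Wasserstein distance and the Kolmogorov-Smirnov statistic as representatives of the two groups; the function name sliced_discrepancy and all parameters are illustrative and not taken from the paper.

# A minimal sketch (not the authors' code) of a sliced comparison between
# latent codes and a standard normal prior, using random 1-D projections.
import numpy as np
from scipy.stats import kstest


def sliced_discrepancy(latent, n_slices=50, seed=0):
    """Return two illustrative 1-D discrepancies averaged over random slices:
    a distance-based one (Wasserstein-1 between projected samples) and a
    normality-test-based one (Kolmogorov-Smirnov statistic of the projections).
    """
    rng = np.random.default_rng(seed)
    n, d = latent.shape
    prior = rng.standard_normal((n, d))      # samples from the N(0, I) prior

    w1_total, ks_total = 0.0, 0.0
    for _ in range(n_slices):
        v = rng.standard_normal(d)
        v /= np.linalg.norm(v)               # random unit direction (a "slice")

        p_latent = np.sort(latent @ v)       # 1-D projection of the codes
        p_prior = np.sort(prior @ v)         # 1-D projection of the prior

        # Group based on classical distances: Wasserstein-1 between the two
        # sorted 1-D samples.
        w1_total += np.mean(np.abs(p_latent - p_prior))

        # Group based on normality tests: the projection of N(0, I) onto a unit
        # direction is N(0, 1), so test the projected codes against N(0, 1).
        ks_total += kstest(p_latent, "norm").statistic

    return w1_total / n_slices, ks_total / n_slices


if __name__ == "__main__":
    z = np.random.default_rng(1).standard_normal((256, 8))   # toy latent codes
    w1, ks = sliced_discrepancy(z)
    print(f"sliced W1 ~ {w1:.4f}, sliced KS ~ {ks:.4f}")

In an actual sliced autoencoder such a term would be made differentiable and added to the reconstruction loss; the sketch only illustrates how the multivariate comparison reduces to one-dimensional statistics.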
Article type: Original article
Affiliation: Faculty of Mathematics and Computer Science, Jagiellonian University, ul. Łojasiewicza 6, 30-348 Kraków, Poland
Published at: 2018
Article status: Open
Licence: CC BY-NC-ND
Publication languages: English