Effectiveness of Unsupervised Training in Deep Learning Neural Networks
Publication date: 11.04.2016
Schedae Informaticae, 2015, Volume 24, pp. 41 - 51
https://doi.org/10.4467/20838476SI.15.004.3026
Deep learning is a field of research that currently attracts much attention, mainly because deep architectures yield outstanding results on many vision, speech, and natural language processing tasks. To make deep learning effective, an unsupervised pretraining phase is often applied. In this article, we present an experimental study evaluating the usefulness of this approach, testing on several benchmarks and with different percentages of labeled data how Contrastive Divergence (CD), one of the most popular pretraining methods, influences network generalization.
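For readers unfamiliar with the pretraining method the abstract refers to, the following is a minimal, hedged sketch of one Contrastive Divergence (CD-1) update for a binary restricted Boltzmann machine, the building block typically pretrained layer by layer in such studies. All names (`cd1_update`, `sigmoid`) and hyperparameters here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(W, b_vis, b_hid, v0, lr=0.01, rng=None):
    """One CD-1 step for a binary RBM (illustrative sketch, not the paper's code).

    W: (n_vis, n_hid) weight matrix; b_vis, b_hid: bias vectors;
    v0: (batch, n_vis) batch of binary training vectors.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    # Positive phase: hidden probabilities given the data
    h0_prob = sigmoid(v0 @ W + b_hid)
    h0 = (rng.random(h0_prob.shape) < h0_prob).astype(float)
    # Negative phase: one Gibbs step back to a visible reconstruction
    v1_prob = sigmoid(h0 @ W.T + b_vis)
    h1_prob = sigmoid(v1_prob @ W + b_hid)
    # Approximate gradient: <v h>_data - <v h>_reconstruction
    W += lr * (v0.T @ h0_prob - v1_prob.T @ h1_prob) / v0.shape[0]
    b_vis += lr * (v0 - v1_prob).mean(axis=0)
    b_hid += lr * (h0_prob - h1_prob).mean(axis=0)
    return W, b_vis, b_hid
```

After pretraining each layer this way, the learned weights initialize a feed-forward network that is then fine-tuned on whatever fraction of labeled data is available, which is the setting the study varies.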
Article type: Original article
Affiliations:
Wroclaw University of Technology, Poland
University of Bielsko-Biala, Department of Computer Science
Article status: Open
Licence: None
Publication languages: English