Institutional Repository
Technical University of Crete
On overfitting, generalization, and randomly expanded training sets

Karystinos Georgios, Pados Dimitris A.

URI: http://purl.tuc.gr/dl/dias/176CBB13-43FD-4D79-9F27-C297C4B1F452
Identifier: http://www.telecom.tuc.gr/~karystinos/paper_TNN.pdf
Identifier: https://doi.org/10.1109/72.870038
Language: en
Extent: 7
Title: On overfitting, generalization, and randomly expanded training sets
Creator: Karystinos Georgios
Creator: Καρυστινος Γεωργιος
Creator: Pados Dimitris A.
Publisher: Institute of Electrical and Electronics Engineers
Description: Publication in a scientific journal
Content Summary: An algorithmic procedure is developed for the random expansion of a given training set to combat overfitting and improve the generalization ability of backpropagation-trained multilayer perceptrons (MLPs). The training set is K-means clustered, and locally most entropic colored Gaussian joint input-output probability density function estimates are formed per cluster. The number of clusters is chosen such that the resulting overall colored Gaussian mixture exhibits minimum differential entropy upon global cross-validated shaping. Numerical studies on real data and synthetic data examples drawn from the literature illustrate and support these theoretical developments.
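The core procedure summarized above can be sketched in code. The sketch below is a simplified illustration, not the paper's exact method: it clusters the joint input-output vectors with plain K-means, fits a Gaussian per cluster, and draws synthetic training samples from the resulting mixture. The paper additionally selects the number of clusters by minimum differential entropy and applies global cross-validated shaping, which are omitted here; all function and parameter names are this sketch's own.

```python
import numpy as np

def expand_training_set(X, y, n_clusters=3, n_new=100, seed=0):
    """Randomly expand a training set (illustrative sketch):
    1) K-means cluster the joint input-output vectors,
    2) fit a Gaussian density estimate per cluster,
    3) sample new joint vectors from the Gaussian mixture."""
    rng = np.random.default_rng(seed)
    # Joint input-output vectors: each row is [x_i, y_i].
    Z = np.hstack([X, y.reshape(len(y), -1)])

    # --- plain K-means on the joint vectors ---
    centers = Z[rng.choice(len(Z), n_clusters, replace=False)]
    for _ in range(50):
        dists = np.linalg.norm(Z[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):
                centers[k] = Z[labels == k].mean(axis=0)

    # --- per-cluster Gaussian estimates, sampled in proportion
    #     to cluster size (mixture weights) ---
    counts = np.bincount(labels, minlength=n_clusters)
    new = []
    for k in range(n_clusters):
        Zk = Z[labels == k]
        if len(Zk) < 2:
            continue  # too few points for a covariance estimate
        mu = Zk.mean(axis=0)
        cov = np.cov(Zk.T) + 1e-6 * np.eye(Z.shape[1])  # regularized
        nk = int(round(n_new * counts[k] / len(Z)))
        new.append(rng.multivariate_normal(mu, cov, size=nk))
    Z_new = np.vstack(new)

    # Split the sampled joint vectors back into inputs and outputs.
    d_in = X.shape[1]
    return Z_new[:, :d_in], Z_new[:, d_in:]
```

The expanded set returned here would be appended to the original training data before backpropagation training; in the paper's setting, the sampling distribution is further tuned so the mixture generalizes rather than memorizes the original points.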
Type of Item: Peer-Reviewed Journal Publication
License: http://creativecommons.org/licenses/by/4.0/
Date of Item: 2015-10-23
Date of Publication: 2000
Subject: Backpropagation
Subject: clustering methods
Subject: entropy
Subject: Gaussian distributions
Subject: multilayer perceptrons (MLPs)
Subject: stochastic approximation
Subject: stochastic processes
Bibliographic Citation: G. N. Karystinos and D. A. Pados, "On overfitting, generalization, and randomly expanded training sets," IEEE Transactions on Neural Networks, vol. 11, no. 5, pp. 1050-1057, Sept. 2000. doi: 10.1109/72.870038
