| URI | http://purl.tuc.gr/dl/dias/176CBB13-43FD-4D79-9F27-C297C4B1F452 | - |
| Identifier | http://www.telecom.tuc.gr/~karystinos/paper_TNN.pdf | - |
| Identifier | https://doi.org/10.1109/72.870038 | - |
| Language | en | - |
| Extent | 7 | en |
| Title | On overfitting, generalization, and randomly expanded training sets | en |
| Creator | Karystinos Georgios | en |
| Creator | Καρυστινος Γεωργιος | el |
| Creator | Pados Dimitris A. | en |
| Publisher | Institute of Electrical and Electronics Engineers | en |
| Description | Δημοσίευση σε επιστημονικό περιοδικό | el |
| Content Summary | An algorithmic procedure is developed for the random expansion of a given training set to combat overfitting and improve the generalization ability of backpropagation-trained multilayer perceptrons (MLPs). The training set is K-means clustered and locally most entropic colored Gaussian joint input-output probability density function estimates are formed per cluster. The number of clusters is chosen such that the resulting overall colored Gaussian mixture exhibits minimum differential entropy upon global cross-validated shaping. Numerical studies on real data and synthetic data examples drawn from the literature illustrate and support these theoretical developments. (An illustrative code sketch of this expansion procedure appears after this record.) | en |
| Type of Item | Peer-Reviewed Journal Publication | en |
| Type of Item | Δημοσίευση σε Περιοδικό με Κριτές | el |
| License | http://creativecommons.org/licenses/by/4.0/ | en |
| Date of Item | 2015-10-23 | - |
| Date of Publication | 2000 | - |
| Subject | Backpropagation | en |
| Subject | clustering methods | en |
| Subject | entropy | en |
| Subject | Gaussian distributions | en |
| Subject | multilayer perceptrons (MLPs) | en |
| Subject | stochastic approximation | en |
| Subject | stochastic processes | en |
| Bibliographic Citation | G. N. Karystinos and D. A. Pados, “On overfitting, generalization, and randomly expanded training sets,” IEEE Transactions on Neural Networks, vol. 11, no. 5, pp. 1050-1057, Sept. 2000. doi: 10.1109/72.870038 | en |
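The Content Summary above outlines an algorithmic procedure: cluster the joint (input, output) training samples with K-means, fit a full-covariance ("colored") Gaussian per cluster, select the cluster count by a minimum-differential-entropy criterion under cross-validated shaping, and draw synthetic training pairs from the resulting mixture. The following Python sketch illustrates that general idea only; the function names, the candidate cluster counts, and the Monte Carlo entropy estimate used in place of the paper's cross-validated shaping criterion are all assumptions for illustration, not the authors' actual method.

```python
# Illustrative sketch (not the paper's algorithm): K-means clustering of the
# joint (input, output) samples, one colored Gaussian per cluster, an
# approximate entropy-based choice of cluster count, and sampling of synthetic
# training pairs from the fitted mixture. Assumes X is (N, d_in), Y is (N, d_out),
# and every cluster receives at least a few samples.
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.cluster import KMeans


def fit_colored_mixture(Z, n_clusters, reg=1e-6):
    """K-means cluster the joint samples Z and fit a full-covariance Gaussian per cluster."""
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(Z)
    weights, means, covs = [], [], []
    for k in range(n_clusters):
        Zk = Z[labels == k]
        weights.append(len(Zk) / len(Z))
        means.append(Zk.mean(axis=0))
        # small diagonal regularizer keeps each covariance positive definite
        covs.append(np.cov(Zk, rowvar=False) + reg * np.eye(Z.shape[1]))
    return np.asarray(weights), np.asarray(means), np.asarray(covs)


def mixture_entropy_mc(weights, means, covs, n_samples=5000, seed=0):
    """Monte Carlo estimate of the mixture's differential entropy; a crude
    stand-in for the paper's cross-validated minimum-entropy shaping criterion."""
    rng = np.random.default_rng(seed)
    comp = rng.choice(len(weights), size=n_samples, p=weights)
    samples = np.array([rng.multivariate_normal(means[k], covs[k]) for k in comp])
    dens = sum(w * multivariate_normal(m, c).pdf(samples)
               for w, m, c in zip(weights, means, covs))
    return -np.mean(np.log(dens))


def expand_training_set(X, Y, n_new, candidate_K=(2, 3, 4, 5), seed=0):
    """Draw n_new synthetic (input, output) pairs from the selected Gaussian mixture."""
    rng = np.random.default_rng(seed)
    Z = np.hstack([X, Y])  # joint input-output samples
    # pick the cluster count whose fitted mixture has the smallest estimated entropy
    best_K = min(candidate_K,
                 key=lambda K: mixture_entropy_mc(*fit_colored_mixture(Z, K), seed=seed))
    weights, means, covs = fit_colored_mixture(Z, best_K)
    comp = rng.choice(len(weights), size=n_new, p=weights)
    Z_new = np.array([rng.multivariate_normal(means[k], covs[k]) for k in comp])
    return Z_new[:, :X.shape[1]], Z_new[:, X.shape[1]:]
```

The synthetic pairs returned by `expand_training_set` would simply be appended to the original training set before backpropagation training; how much expansion helps, and the exact entropy-shaping procedure, are questions the cited paper addresses rather than this sketch.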