An Empirical Investigation of Brute Force to choose Features, Smoothers and Function Approximators
Journal Article, Computational Learning Theory and Natural Learning Systems, Vol. 3, pp. 361–379, April 1995
Abstract
The generalization error of a function approximator, feature set or smoother can be estimated directly by the leave-one-out cross-validation error. For memory-based methods, this is computationally feasible. We describe an initial version of a general memory-based learning system (GMBL): a large collection of learners brought into a widely applicable machine-learning family. We present ongoing investigations into search algorithms which, given a dataset, find the family members and features that generalize best. We also describe GMBL's application to two noisy, difficult problems: predicting car engine emissions from pressure waves, and controlling a robot billiards player with redundant state variables.
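As a minimal illustration of why leave-one-out cross-validation is cheap for memory-based learners, the Python sketch below scores a k-nearest-neighbor smoother: leaving out a point requires no retraining, only excluding that point from its own neighbor list. This is an assumed toy setup, not the paper's GMBL implementation; all names and parameters are illustrative.

```python
import numpy as np

def loocv_error(X, y, k=3):
    """Mean squared leave-one-out error of a k-NN regression smoother.

    Memory-based learners need no per-fold retraining: to 'leave out'
    point i we simply exclude it from its own candidate neighbors.
    """
    n = len(X)
    errors = np.empty(n)
    for i in range(n):
        d = np.linalg.norm(X - X[i], axis=1)  # distances to every stored point
        d[i] = np.inf                         # exclude the held-out point itself
        nearest = np.argsort(d)[:k]           # k nearest remaining neighbors
        pred = y[nearest].mean()              # unweighted k-NN prediction
        errors[i] = (y[i] - pred) ** 2
    return errors.mean()

# Brute-force model selection in the spirit of the paper's title:
# pick the smoother (here, the value of k) with the lowest LOO error.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 2))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=100)
best_k = min(range(1, 10), key=lambda k: loocv_error(X, y, k))
```

The same brute-force loop extends naturally to candidate feature subsets: score each subset by its leave-one-out error and keep the one that generalizes best.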
BibTeX
@article{Moore-1995-16071,
  author  = {Andrew Moore and D. J. Hill and M. P. Johnson},
  title   = {An Empirical Investigation of Brute Force to choose Features, Smoothers and Function Approximators},
  journal = {Computational Learning Theory and Natural Learning Systems},
  year    = {1995},
  month   = {April},
  volume  = {3},
  pages   = {361--379},
}
Copyright notice: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.