l1-norm penalized orthogonal forward regression

Full text: l1nofrIJSSREV.pdf - Accepted Version (405kB)

Hong, X. (ORCID: https://orcid.org/0000-0002-6832-2298), Chen, S., Guo, Y. and Gao, J. (2017) l1-norm penalized orthogonal forward regression. International Journal of Systems Science, 48 (10). pp. 2195-2201. ISSN 0020-7721. doi: 10.1080/00207721.2017.1311383

Abstract/Summary

An l1-norm penalized orthogonal forward regression (l1-POFR) algorithm is proposed based on the concept of the leave-one-out mean square error (LOOMSE). A new l1-norm penalized cost function is defined in the constructed orthogonal space, and each orthogonal basis is associated with an individually tunable regularization parameter. Owing to orthogonality, the LOOMSE can be computed analytically without actually splitting the data set; moreover, a closed form of the optimal regularization parameter is derived by greedily minimizing the LOOMSE at each incremental step. We also propose a simple formula for adaptively detecting regressors that can be moved to an inactive set, which significantly reduces the computational cost of the algorithm. Examples are included to demonstrate the effectiveness of this new l1-POFR approach.
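
The abstract describes the method only at a high level. As a rough orientation, the Python sketch below illustrates the general orthogonal forward regression idea it builds on: candidate regressors are orthogonalised against the bases already chosen, and at each step the candidate giving the smallest analytically computed leave-one-out error is added. It assumes a fixed, user-supplied shrinkage parameter lam and a standard PRESS-style leave-one-out formula; the paper's closed-form optimal regularization parameters, the l1 penalty, and the inactive-set pruning rule are not reproduced here, and all names (ofr_press, lam, h_acc) are illustrative only.

import numpy as np

def ofr_press(X, y, n_terms, lam=1e-3):
    """Greedy orthogonal forward regression sketch: at each step, orthogonalise
    every remaining candidate column against the bases chosen so far and keep
    the one whose shrunk weight gives the smallest PRESS-style leave-one-out
    error.  This is an illustration, not the paper's l1-POFR algorithm."""
    n, m = X.shape
    selected, W, g = [], [], []
    residual = y.astype(float).copy()
    h_acc = np.zeros(n)                      # accumulated hat-matrix diagonal
    for _ in range(n_terms):
        best = None
        for j in range(m):
            if j in selected:
                continue
            w = X[:, j].astype(float).copy()
            for wk in W:                     # Gram-Schmidt against chosen bases
                w -= (wk @ X[:, j]) / (wk @ wk) * wk
            d = w @ w
            if d < 1e-12:                    # numerically dependent column, skip
                continue
            gj = (w @ residual) / (d + lam)  # shrunk weight for this basis
            e = residual - gj * w            # residual if this basis is added
            h = h_acc + w * w / (d + lam)    # hat diagonal with this basis added
            loo = np.mean((e / (1.0 - np.clip(h, 0.0, 0.999))) ** 2)  # PRESS/LOO
            if best is None or loo < best[0]:
                best = (loo, j, w, gj)
        if best is None:
            break
        _, j, w, gj = best
        selected.append(j)
        W.append(w)
        g.append(gj)
        residual = residual - gj * w
        h_acc = h_acc + w * w / (w @ w + lam)
    return selected, np.array(g), W

# Tiny usage example on synthetic data: the true model uses columns 3 and 7.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 20))
y = 2.0 * X[:, 3] - 1.5 * X[:, 7] + 0.1 * rng.standard_normal(200)
sel, weights, _ = ofr_press(X, y, n_terms=5)
print("selected regressors:", sel)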


Item Type: Article
URI: https://reading-clone.eprints-hosting.org/id/eprint/72078
Refereed: Yes
Divisions: Science > School of Mathematical, Physical and Computational Sciences > Department of Computer Science
Publisher: Taylor & Francis