Zero attracting recursive least squares algorithms

07416639.pdf - Accepted Version (3MB)

Hong, X. (ORCID: https://orcid.org/0000-0002-6832-2298), Gao, J. and Chen, S. (2017) Zero attracting recursive least squares algorithms. IEEE Transactions on Vehicular Technology, 66 (1). pp. 213-221. ISSN 0018-9545. doi: 10.1109/TVT.2016.2533664

Abstract/Summary

The l1-norm sparsity constraint is a widely used technique for constructing sparse models. In this contribution, two zero-attracting recursive least squares algorithms, referred to as ZA-RLS-I and ZA-RLS-II, are derived by imposing an l1-norm constraint on the parameter vector to promote model sparsity. In order to achieve a closed-form solution, the l1-norm of the parameter vector is approximated by an adaptively weighted l2-norm, in which the weighting factors are set as the inverse of the absolute values of the associated parameter estimates, which are readily available in the adaptive learning environment. ZA-RLS-II is computationally more efficient than ZA-RLS-I, as it exploits known results from linear algebra as well as the sparsity of the system. The proposed algorithms are proven to converge, and adaptive sparse channel estimation is used to demonstrate the effectiveness of the proposed approach.
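The core idea in the abstract — approximating the l1 penalty by an adaptively reweighted l2 penalty, with weights taken from the previous parameter estimates — can be sketched as follows. This is a minimal illustration based only on the abstract, not the paper's actual ZA-RLS-I/II recursions: the function name `za_rls_sketch`, the parameters `lam` and `eps`, and the batch re-solve at each step (rather than the paper's efficient recursive updates) are all assumptions made for clarity.

```python
import numpy as np

def za_rls_sketch(X, y, lam=0.5, eps=1e-3):
    """Illustrative zero-attracting least squares (hypothetical sketch).

    The l1 penalty lam * ||theta||_1 is approximated by the adaptively
    weighted l2 term lam * theta^T D theta, where
        D = diag(1 / (|theta_i| + eps))
    is built from the previous estimate (eps guards against division by
    zero). At each time step we re-solve the regularized normal equations;
    the actual ZA-RLS algorithms update this solution recursively instead.
    """
    n, p = X.shape
    theta = np.zeros(p)
    for t in range(p, n):  # start once the system is (likely) well-posed
        # Adaptive weights from the previous estimate: small coefficients
        # receive a large penalty and are attracted toward zero.
        D = np.diag(1.0 / (np.abs(theta) + eps))
        Xt, yt = X[: t + 1], y[: t + 1]
        theta = np.linalg.solve(Xt.T @ Xt + lam * D, Xt.T @ yt)
    return theta
```

On a sparse channel-estimation style problem (a few nonzero taps, many zero taps), the reweighting drives the estimates of the zero taps very close to zero while leaving the nonzero taps nearly unbiased, which is the behavior the l1 constraint is meant to induce.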

Item Type: Article
URI: https://reading-clone.eprints-hosting.org/id/eprint/65511
Refereed: Yes
Divisions: Science > School of Mathematical, Physical and Computational Sciences > Department of Computer Science
Publisher: IEEE
