Sub-Nyquist sparse signal reconstruction techniques can significantly reduce the cost of hardware design. Many sub-Nyquist reconstruction algorithms (e.g., greedy and convex-optimization methods) have been developed to reconstruct a real frequency-sparse signal by exploiting its sparsity. However, greedy algorithms require a large memory size, and convex optimization algorithms incur a long calculation...
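As an illustration of the greedy family mentioned above, the following is a minimal sketch of orthogonal matching pursuit (OMP), a standard greedy sparse-reconstruction algorithm; the function name, matrix sizes, and sparsity level are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily recover a k-sparse x from y = A x.

    A: measurement matrix (m x n), y: measurements (m,), k: sparsity level.
    """
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares re-fit on the current support, then update the residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x
```

The per-iteration least-squares solve over the growing support is what drives the memory cost the abstract alludes to for greedy methods.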
The invariable step-size least-mean-square (ISS-LMS) algorithm is a very simple adaptive filtering algorithm and has therefore been widely used in many applications, such as adaptive channel estimation. It is well known that the convergence speed of ISS-LMS is fixed by the initial step size. In channel estimation scenarios, it is very hard to make a tradeoff between convergence speed...
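A minimal sketch of the fixed (invariable) step-size LMS update for FIR channel estimation, assuming a real-valued channel and white training input; the function name and the parameter values used below are illustrative:

```python
import numpy as np

def lms_estimate(x, d, n_taps, mu):
    """Fixed step-size LMS channel estimation.

    x: training input, d: observed channel output,
    n_taps: assumed channel length, mu: fixed step size.
    """
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]  # tap-input vector, newest sample first
        e = d[n] - w @ u                   # instantaneous estimation error
        w = w + mu * e * u                 # gradient-descent update
    return w
```

Because `mu` is fixed for the whole run, a large value speeds convergence but raises steady-state misadjustment, while a small value does the opposite — the tradeoff the abstract refers to.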
The normalized least mean square (NLMS) algorithm is considered one of the classical adaptive system identification algorithms. Because many systems are modeled as sparse, sparse NLMS algorithms have also been applied to improve identification performance by taking advantage of system sparsity. However, NLMS-type algorithms cannot achieve high identification performance,...
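For reference, the classical NLMS update normalizes the LMS step by the tap-input power, which makes convergence insensitive to the input scale. The sketch below assumes a real-valued FIR system; the function name, default step size, and regularizer are illustrative:

```python
import numpy as np

def nlms_identify(x, d, n_taps, mu=0.5, eps=1e-8):
    """NLMS system identification.

    x: input signal, d: system output, n_taps: assumed system length,
    mu: normalized step size (0 < mu < 2), eps: regularizer against
    division by zero for near-silent inputs.
    """
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]       # tap-input vector
        e = d[n] - w @ u                         # a-priori error
        w = w + mu * e * u / (u @ u + eps)       # power-normalized update
    return w
```

Note that this plain NLMS update treats all taps identically; it is the absence of any sparsity-aware term here that the sparse NLMS variants in the abstract address.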
Least mean square (LMS)-type adaptive sparse algorithms have attracted much attention for sparse multipath channel estimation (SMPC) due to two advantages: low computational complexity and reliability. By introducing an ℓ1-norm sparse constraint into the LMS algorithm, both zero-attracting least mean square (ZA-LMS) and reweighted zero-attracting least mean square (RZA-LMS) have been...
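The ℓ1-norm constraint mentioned above adds a sign-based "zero attractor" to the standard LMS update, pulling inactive taps toward zero. A minimal ZA-LMS sketch, with illustrative function name and parameter values:

```python
import numpy as np

def za_lms(x, d, n_taps, mu, rho):
    """Zero-attracting LMS for sparse channel estimation.

    mu: LMS step size; rho: zero-attractor strength, i.e. mu times the
    l1-penalty weight in the regularized cost.
    """
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]
        e = d[n] - w @ u
        # Subgradient of the l1 penalty gives the sign-based zero attractor.
        w = w + mu * e * u - rho * np.sign(w)
    return w
```

The attractor speeds convergence on zero taps but biases the large active taps, since `rho * np.sign(w)` pulls on every nonzero coefficient uniformly.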
Least mean square (LMS)-based adaptive algorithms have attracted much attention because of their low computational complexity and robust recovery capability. To exploit channel sparsity, LMS-based adaptive sparse channel estimation methods, e.g., zero-attracting LMS (ZA-LMS), reweighted zero-attracting LMS (RZA-LMS), and ℓp-norm sparse LMS (LP-LMS), have also been proposed. To take full advantage...
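Of the variants listed above, RZA-LMS refines the plain zero attractor by reweighting it so large taps are pulled less strongly than small ones. A minimal sketch, with illustrative function name and parameter values:

```python
import numpy as np

def rza_lms(x, d, n_taps, mu, rho, eps_r=10.0):
    """Reweighted zero-attracting LMS for sparse channel estimation.

    mu: LMS step size; rho: attractor strength; eps_r: reweighting
    constant controlling how quickly the attraction fades on large taps.
    """
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]
        e = d[n] - w @ u
        # Reweighted attractor: near-zero taps feel almost the full pull,
        # large active taps feel a pull attenuated by 1/(1 + eps_r*|w|).
        w = w + mu * e * u - rho * np.sign(w) / (1 + eps_r * np.abs(w))
    return w
```

Compared with the uniform sign attractor of ZA-LMS, this reweighting reduces the bias on the significant taps while keeping the sparsity-enforcing pull on the zero taps.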