This paper extends the existing fast RLS recursions, originally intended for exponentially windowed problems with general models, to a generalized sliding window formulation (GSWRLS). From a matrix algebra perspective, we show explicitly how the displacement rank of the underlying inverse covariance matrix, defined with respect to an arbitrary operator, depends on the number of window breakpoints, and how the fast GSWRLS computes these rank factors efficiently. The recursions hold regardless of the induced (first-order) data structure and show that fast fixed-order and order-recursive RLS algorithms can still be obtained for unwindowed data matrices whose successive regressors obey a fixed, arbitrary relation. Our approach highlights a degree of freedom inherent in structured data matrices induced by general models, showing that efficient representations of their inverse covariances are not limited to factor circulants but can be constructed from an arbitrary operator. These Bezoutians, usually expressed via reproducing-kernel relations, can be represented exactly in matrix form, with a precise correspondence to the variables of a GSWRLS algorithm. As a byproduct, we obtain a vector relation expressing the so-called minimality property for extended models and windows, in contrast to the analogous generating-function arguments used in the original approaches. These results pave the way to a more general framework of polynomial-Vandermonde covariance decompositions, which arise naturally through a proper choice of recurrence-related polynomials. This has further impact on several signal processing applications, including the superfast realization of equalizers in communications scenarios.
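As a minimal illustration of the breakpoint dependence (a standard single-rectangular-window sketch with respect to the lower shift operator $Z$, assuming shift-structured regressors rather than the general-operator setting of the paper): for regressors $u_i = [x(i),\, x(i-1),\, \dots,\, x(i-M+1)]^\top$ and a window of length $L$, the covariance $R_N = \sum_{i=N-L+1}^{N} u_i u_i^\top$ satisfies
$$
R_N - Z^\top R_N Z \;=\; u_N u_N^\top \;-\; u_{N-L}\, u_{N-L}^\top \;+\; e_M w_N^\top \;+\; w_N e_M^\top \;-\; d_N\, e_M e_M^\top ,
$$
where $e_M$ is the last unit vector, $w_N = \sum_{i=N-L+1}^{N} x(i-M)\, u_{i-1}$, and $d_N = \sum_{i=N-L+1}^{N} x(i-M)^2$. The first two terms contribute at most rank one each (one per window edge, i.e., per breakpoint), while the $e_M$ cross terms lie in $\operatorname{span}\{e_M, w_N\}$ and contribute at most rank two, so in this shift-structured example the displacement rank is at most two plus the number of breakpoints (here, four).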