Many techniques for time-series or sequential data work by taking a fixed-size window, that is, the last N items of the series, and treating each window as if it were a separate piece of data, either for training or execution. Examples include moving-average methods in time-series analysis, simple Markov models, and the application of neural networks to data windows. Windowing methods are always finite impulse response, as data before the window cannot affect the current response. That said, when used generatively a windowed model can create infinite output, since each predicted value can be appended to the end of the series and re-used as part of the next window – this is precisely how large language models work to create text.
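The generative use of a window described above can be sketched as follows. This is a minimal illustration, not taken from the text: the function names and the simple moving-average predictor are assumptions chosen for clarity; any model that maps a window of N values to a prediction could be substituted.

```python
from collections import deque

def generate(series, window_size, predict, n_steps):
    """Autoregressively extend a series using a fixed-size window.

    `predict` is any function mapping a window (a tuple of the last
    N values) to the next value; here it stands in for a trained model.
    """
    # Keep only the last `window_size` items; deque drops older values.
    window = deque(series[-window_size:], maxlen=window_size)
    out = []
    for _ in range(n_steps):
        nxt = predict(tuple(window))
        out.append(nxt)
        window.append(nxt)  # the predicted value re-enters the window
    return out

# Illustrative predictor: simple moving average of the window.
mean = lambda w: sum(w) / len(w)

extension = generate([1.0, 2.0, 3.0], window_size=3, predict=mean, n_steps=4)
```

Because each prediction feeds back into the window, the loop can run for as many steps as desired, giving an unbounded output from a finite-window model.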
Used in Chap. 8: page 123; Chap. 14: pages 223, 229, 233, 236