Contents
- 14.1 Overview
- 14.2 General Properties
- 14.2.1 Kinds of Temporal and Sequential Data
- 14.2.2 Looking through Time
- 14.2.3 Processing Temporal Data
- 14.2.3.1 Windowing
- 14.2.3.2 Hidden State
- 14.2.3.3 Non-time Domain Transformations
- 14.3 Probability Models
- 14.3.1 Markov Model
- 14.3.2 Higher-order Markov Model
- 14.3.3 Hidden Markov Model
- 14.4 Grammar and Pattern-Based Approaches
- 14.4.1 Regular Expressions
- 14.4.2 More Complex Grammars
- 14.5 Neural Networks
- 14.5.1 Window-based Methods
- 14.5.2 Recurrent Neural Networks
- 14.5.3 Long Short-Term Memory Networks
- 14.5.4 Transformer Models
- 14.6 Statistical and Numerical Techniques
- 14.6.1 Simple Data Cleaning Techniques
- 14.6.2 Logarithmic Transformations and Exponential Growth
- 14.6.3 ARMA Models
- 14.6.4 Mixed Statistics/ML Models
- 14.7 Multi-stage/Multi-scale
- 14.8 Summary
Glossary items referenced in this chapter
accuracy, AlphaFold, ARMA (Autoregressive Moving Average), attention mechanisms, auto-regressive model, Babbage, Charles, bias, bootstrapping, bottom-up reasoning, ChatGPT, computer chess, data cleaning, database, decibel, deep neural network, Difference Engine, discontinuous, ECG, event, event stream, expert knowledge, expert system, explainable AI, exponential decay, exponential growth, Fast Fourier Transform, feedback, finite impulse response, finite state machine, first-order difference, fitness function, Fourier transform, frequency domain, genetic algorithm, genetic programming, Google Gemini, grammar, Haar wavelets, handwriting recognition, heterogeneous events, heuristic evaluation function, hidden Markov model, hidden state, hierarchical grammars, homogeneous events, hybrid AI/statistical system, infinite impulse response, information preserving, Lego-style matching, linear regression, logarithm base, logarithmic transform, long short-term memory networks, long-term memory, long-term potentiation, machine learning, Markov model, Markov, Andrey, moving average, moving average model, n-gram, natural logarithm, neural network, neural-network architecture, non-overlapping windows, non-time domain transformations, Normal distribution, outliers, pattern matching, periodicity, phase, phase changes, pre-processing, probabilistic process, probability, probability theory, probability transition, quasi-periodic, recurrent neural network, regular expression, scale-related variability, seasonal adjustments, signal processing, smoothing, spectrogram, speech recognition, sporadic sample, stationarity, statistical methods, statistical techniques, substructure, supervised learning, surrogate expert, synapse weights, time domain, time series, time series analysis, time series data, transformer model, transition probabilities, trend removal, trigger, uniform sampling rate, unsupervised classifier, unsupervised learning, variable-order Markov models, wavelength, wavelet, wavelet transform, windowing