down-sampling


Down-sampling is a term from signal processing and time-series analysis. It refers to reducing the number of sample points, often by simply picking every n-th sample. For example, given the following data sampled every 20 seconds:
      [1.35, 1.37, 1.38, 1.45, 1.42, 1.48, 1.52, 1.51, 1.56],
we might take every 3rd item to give one-minute samples [1.35, 1.45, 1.52]. More sophisticated down-sampling might apply some form of moving average or other smoothing before the selection, for example taking the average of each one-minute group of three samples: [1.37, 1.45, 1.53].
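
As an illustration, here is a minimal Python sketch of both approaches, using the example data above; the function names are invented for this example, not taken from any particular library.

    def downsample_pick(samples, n):
        """Keep every n-th sample, starting with the first."""
        return samples[::n]

    def downsample_average(samples, n):
        """Average each non-overlapping block of n samples before selection."""
        return [round(sum(samples[i:i + n]) / n, 2)
                for i in range(0, len(samples) - n + 1, n)]

    data = [1.35, 1.37, 1.38, 1.45, 1.42, 1.48, 1.52, 1.51, 1.56]

    print(downsample_pick(data, 3))     # [1.35, 1.45, 1.52]
    print(downsample_average(data, 3))  # [1.37, 1.45, 1.53]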

Down-sampling is particularly useful as a data-reduction step for slowly changing data. In a window-based algorithm it effectively increases how far back the algorithm can look for a given window size.

Used on page 437

Also known as down-sample