Autocorrelation and Power Spectral Density
Autocorrelation Function (ACF)
The Autocorrelation Function (ACF), denoted $R_X(\tau)$, quantifies how a random process correlates with a time-shifted version of itself:
$$R_X(\tau) = E[X(t)\,X(t+\tau)]$$
If the process has a non-zero mean $\mu_X = E[X(t)]$, then the autocovariance is:
$$C_X(\tau) = R_X(\tau) - \mu_X^2$$
Key Properties of a Valid ACF (for a real-valued WSS process):
Maximum at Zero Lag: $|R_X(\tau)| \le R_X(0)$, where $R_X(0) = E[X^2(t)]$ is the average power.
Even Symmetry: $R_X(-\tau) = R_X(\tau)$
Non-Negative Power Spectrum: the Fourier transform of $R_X(\tau)$ is non-negative, $S_X(f) \ge 0$ for all $f$.
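These properties can be checked numerically on a simulated process. A minimal sketch, assuming NumPy; `sample_acf` is a hypothetical helper name, and unit-variance white noise is an illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)  # zero-mean, unit-variance white Gaussian noise

def sample_acf(x, max_lag):
    """Biased sample estimate of R_X(k) = E[X(t) X(t+k)]."""
    n = len(x)
    return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])

r = sample_acf(x, 20)

# Property 1: the ACF is maximal at zero lag, |R(k)| <= R(0).
assert np.all(np.abs(r[1:]) <= r[0])
# R(0) estimates the average power E[X^2(t)]; for unit-variance noise it is
# near 1, while R(k) for k != 0 is near 0 (white noise is uncorrelated).
```

Even symmetry does not need a separate check here: for a real process only one side of the ACF is computed, and the other side is its mirror image.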
Power Spectral Density (PSD)
The Power Spectral Density (PSD) describes how the power of a random process is distributed over frequency.
Defined as:
$$S_X(f) = \int_{-\infty}^{\infty} R_X(\tau)\, e^{-j 2\pi f \tau}\, d\tau$$
Inverse relation:
$$R_X(\tau) = \int_{-\infty}^{\infty} S_X(f)\, e^{j 2\pi f \tau}\, d f$$
The Wiener–Khinchin Theorem
The Wiener–Khinchin theorem states that, for a WSS process, the PSD and the ACF form a Fourier transform pair — exactly the pair of relations written above. It is what justifies computing the power spectrum of a random process from its autocorrelation.
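The theorem can be illustrated numerically in its discrete, circular form: the inverse DFT of the periodogram equals the circular sample autocorrelation. A sketch assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4096
x = rng.standard_normal(n)

# Periodogram: empirical PSD estimate S[k] = |X[k]|^2 / n.
X = np.fft.fft(x)
S = np.abs(X) ** 2 / n

# Discrete Wiener-Khinchin: the inverse DFT of the periodogram is the
# circular sample autocorrelation of the data.
r_from_psd = np.fft.ifft(S).real
r_direct = np.array([np.dot(x, np.roll(x, -k)) / n for k in range(n)])

assert np.allclose(r_from_psd, r_direct, atol=1e-9)
# At tau = 0 the inverse relation gives the total power (mean square).
assert np.isclose(r_from_psd[0], np.mean(x ** 2))
```

This mirrors the continuous-time inverse relation above: integrating the PSD over all frequencies recovers $R_X(0)$, the average power.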
Stationarity
Stationarity describes how the statistical properties of a random process behave over time. It tells us whether the process’s behavior is consistent (time-invariant) or whether it changes (time-varying).
A random process $X(t)$ is said to be stationary if its statistical characteristics — such as mean, variance, and correlation — do not change with time. Otherwise, it is non-stationary.
1. Strict-Sense Stationary (SSS)
A random process is Strict-Sense Stationary if all its joint probability distributions are invariant under time shifts.
That is, for any $n$, any time instants $t_1, t_2, \ldots, t_n$, and any time shift $\tau$:
$$F_X(x_1, \ldots, x_n;\, t_1, \ldots, t_n) = F_X(x_1, \ldots, x_n;\, t_1 + \tau, \ldots, t_n + \tau)$$
This means the entire probabilistic behavior of the process looks the same no matter when you observe it.
In simple terms: Shifting the observation window in time does not change the statistics of the process — not just the mean or variance, but all higher-order moments and dependencies too.
Examples:
- A pure sinusoid $X(t) = A\cos(\omega t + \phi)$ with random phase $\phi$ uniform on $[0, 2\pi)$ → SSS, because its distribution is independent of time.
2. Wide-Sense Stationary (WSS)
Since verifying full strict-sense stationarity is often impractical, we use a weaker but very useful notion: Wide-Sense Stationarity (WSS).
A process is WSS if it satisfies:
Constant Mean: $E[X(t)] = \mu_X$ for all $t$.
Autocorrelation depends only on time lag: $R_X(t_1, t_2) = R_X(\tau)$, where $\tau = t_2 - t_1$.
This means that the correlation between two values of $X(t)$ depends only on how far apart they are in time (the lag $\tau$), not on when they were measured.
In other words:
- The process has a stable average value.
- The level of similarity between samples depends only on time separation.
Examples:
- A zero-mean white noise process (flat PSD).
- A stationary Gaussian process with constant variance.
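A quick empirical check of the WSS conditions: estimate the mean and the lag-1 correlation over disjoint windows; for a WSS process both should be stable across windows. A sketch assuming NumPy, with arbitrary window sizes and white noise as the test process:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(50_000)  # candidate WSS process: white Gaussian noise

windows = x.reshape(50, 1000)    # 50 disjoint windows of 1000 samples
means = windows.mean(axis=1)
lag1 = np.array([np.dot(w[:-1], w[1:]) / (len(w) - 1) for w in windows])

# For a WSS process both statistics fluctuate only by sampling error
# (roughly 1/sqrt(window length)) and show no systematic drift.
assert means.std() < 0.1
assert lag1.std() < 0.1
```

A systematic drift in either statistic across windows would be evidence against wide-sense stationarity.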
3. Non-Stationary Processes
A process is non-stationary if it violates the WSS conditions:
- Mean is not constant, i.e. $E[X(t)]$ varies with $t$.
- Autocorrelation depends on the absolute time instants, not just the lag.
Examples:
- $X(t) = a t + N(t)$, with $N(t)$ zero-mean noise: mean grows linearly with time.
- $X(t) = A(t)\cos(\omega t + \phi)$, with a time-varying $A(t)$: amplitude envelope changes with time.
- Speech signals, stock prices, and weather patterns are typically non-stationary.
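The first example above is easy to reproduce: adding a linear trend to noise makes the window mean drift, violating the constant-mean condition. A sketch assuming NumPy (the slope 0.01 and window length are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(10_000)
x = 0.01 * t + rng.standard_normal(t.size)  # E[X(t)] = 0.01 t grows with time

early = x[:1000].mean()   # mean over the first window
late = x[-1000:].mean()   # mean over the last window

# The two window means differ drastically: the process is non-stationary.
assert late - early > 50
```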
Relationship Between SSS and WSS
- Every Strict-Sense Stationary process (with finite second-order moments) is also Wide-Sense Stationary.
- The converse is not necessarily true: a process may have a constant mean and an autocorrelation that depends only on the lag, yet still have higher-order moments that vary with time.
Hierarchy: SSS $\subset$ WSS $\subset$ all random processes.
Why Stationarity Matters
Stationarity is crucial in signal processing and stochastic analysis because many analytical tools — such as the Power Spectral Density (PSD), Fourier Transform relationships, and filtering techniques — assume that the process is WSS.
If the process is non-stationary, these tools may not be valid or may give misleading results.
In practice:
- Real-world signals (e.g., EEG, seismic data, audio) are often locally stationary — approximately WSS within short time windows. → This is why Short-Time Fourier Transform (STFT) and wavelet analysis are used for time-varying spectral analysis.
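The windowed-analysis idea behind the STFT can be sketched with plain FFTs. A minimal sketch assuming NumPy; the window length, hop size, and chirp test signal are all illustrative choices:

```python
import numpy as np

def stft_mag(x, win_len=256, hop=128):
    """Magnitude spectra of overlapping Hann-windowed frames (minimal STFT)."""
    window = np.hanning(win_len)
    starts = range(0, len(x) - win_len + 1, hop)
    frames = np.stack([x[i:i + win_len] * window for i in starts])
    return np.abs(np.fft.rfft(frames, axis=1))  # (n_frames, win_len // 2 + 1)

# A chirp: its instantaneous frequency rises with time, so a single global
# spectrum would blur it, but short windows track the change.
n = 8192
t = np.arange(n)
x = np.sin(2 * np.pi * (0.01 + 2e-5 * t) * t)

S = stft_mag(x)
peak_bins = S.argmax(axis=1)
# The dominant frequency bin climbs from early frames to late frames,
# which is exactly the time-varying structure the STFT is built to expose.
assert peak_bins[-1] > peak_bins[0] + 20
```

Within each short window the chirp is approximately WSS, which is the "locally stationary" assumption in action.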
Summary Table
| Type | Definition | Examples | Remarks |
|---|---|---|---|
| Strict-Sense Stationary (SSS) | All joint distributions are invariant under time shift | Sinusoid with random phase, white noise | Hard to verify; strongest form |
| Wide-Sense Stationary (WSS) | Mean constant, ACF depends only on lag | Gaussian noise, AR(1) process with stable parameters | Most common assumption in practice |
| Non-Stationary | Mean or correlation changes with time | Speech, climate data, trends | Often approximated as “locally stationary” |