This post is part of a series accompanying our recent document. It collects three parts into one page so readers can follow the full narrative from basic HMMs to a factorial...
HMM · time series · Bayesian
August 14, 2025 · 5 minute read
Autoregressive (AR) processes are a class of time series models used to describe a variable that is correlated with its past values. AR models are widely applied in various fields such as economics, engineering,...
autoregressive processes · time series
February 27, 2025 · 6 minute read
Probabilistic graphical models (PGMs) provide a compact, visual language for reasoning about joint distributions over many random variables. In directed acyclic graphs (DAGs), three elementary three-node structures — tail-to-tail, head-to-tail, and head-to-head — serve...
PGM · conditional independence · d-separation
January 25, 2025 · 6 minute read
In autonomous driving, modeling and understanding the interactions between the ego vehicle and its surrounding vehicles is crucial for safe and efficient navigation. One important challenge is dealing with varying traffic densities and dynamic...
Gaussian processes · driving behavior
January 10, 2025 · 7 minute read
Time-series forecasting is a critical application of Gaussian Processes (GPs), as they offer a flexible and probabilistic framework for predicting future values in sequential data. GPs not only provide point predictions but also quantify...
Gaussian processes · time series
December 22, 2024 · 5 minute read
In this post, we’ll explore four important concepts in time series modeling and stochastic processes: Autoregressive processes, Cochrane-Orcutt correction, Ornstein-Uhlenbeck (OU) processes, and Gaussian processes (GPs). After explaining each concept, we will also examine...
autocorrelation · autoregressive processes · Cochrane-Orcutt correction · Gaussian processes · Ornstein-Uhlenbeck processes
December 20, 2024 · 8 minute read
Autocorrelation is a key property of time series data, describing the dependency of a variable on its past values. Both the Fourier Transform (FT) and Gaussian Processes (GP) can model autocorrelation, but they operate...
autocorrelation · forecasting · time series · Gaussian processes · Fourier transform
December 18, 2024 · 4 minute read
Ordinary Least Squares (OLS) is one of the most widely used methods for linear regression. It provides unbiased estimates of the model parameters under the assumption that the error terms are independent and identically...
OLS · GLS · regression · autocorrelation · heteroskedasticity
December 17, 2024 · 4 minute read
Consider the linear regression model: \(\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon},\) where: \(\mathbf{y}\) is an \(n \times 1\) vector of observations. \(\mathbf{X}\) is an \(n \times p\) design matrix (full column rank). \(\boldsymbol{\beta}\) is a...
OLS · regression · proof
December 17, 2024 · less than 1 minute read
In many driving behavior studies, we model how a following vehicle responds to the movement of a lead vehicle. For example, the Intelligent Driver Model (IDM) uses a set of parameters \(\boldsymbol{\theta} = (v_0,...
hierarchical model · tricks · random effects
November 27, 2024 · 4 minute read
Hierarchical models are powerful tools in statistical modeling and machine learning, enabling us to represent data with complex dependency structures. These models are particularly useful in contexts where data is naturally grouped or exhibits...
hierarchical model · tricks · hyperparameters
November 24, 2024 · 4 minute read
The log-sum-exp trick is a critical technique in numerical computations involving logarithms and exponentials. It is widely used in machine learning, especially in algorithms like the forward-backward procedure in Hidden Markov Models (HMMs)....
logsumexp · tricks · HMM
November 20, 2024 · 3 minute read
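As a quick taste of what the post covers, here is a minimal sketch of the trick itself (illustrative only, not the post's own code):

```python
import numpy as np

def log_sum_exp(x):
    """Compute log(sum(exp(x))) without overflow.

    Uses the identity log(sum(exp(x))) = m + log(sum(exp(x - m)))
    with m = max(x), so every exponentiated term is <= 1.
    """
    m = np.max(x)
    return m + np.log(np.sum(np.exp(x - m)))

# A naive np.log(np.sum(np.exp(x))) overflows to inf for large x;
# the shifted version stays finite.
x = np.array([1000.0, 1000.0])
print(log_sum_exp(x))  # 1000 + log(2)
```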
Problem: Solve $\frac{\partial\left\|\boldsymbol{A}\circ(\boldsymbol{Y}-\boldsymbol{W}^\top\boldsymbol{X})\right\|_{F}^{2}}{\partial\boldsymbol{W}}$ and $\frac{\partial\left\|\boldsymbol{A}\circ(\boldsymbol{Y}-\boldsymbol{W}^\top\boldsymbol{X})\right\|_{F}^{2}}{\partial\boldsymbol{X}}$, where $\circ$ denotes the Hadamard product, and all variables are matrices.
matrix derivative · tricks · Hadamard product · Frobenius product
October 01, 2022 · less than 1 minute read
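For readers who want a head start on this problem, here is a sketch of the answer via differentials, under assumed shapes that the excerpt does not state ($\boldsymbol{X}\in\mathbb{R}^{p\times n}$, $\boldsymbol{W}\in\mathbb{R}^{p\times m}$, $\boldsymbol{Y},\boldsymbol{A}\in\mathbb{R}^{m\times n}$):

```latex
% Sketch under assumed shapes (not stated in the excerpt).
% Write E = Y - W^T X and f = \|A \circ E\|_F^2 = \sum_{ij} A_{ij}^2 E_{ij}^2.
\begin{aligned}
df &= 2\,\operatorname{tr}\!\left[(\boldsymbol{A}\circ\boldsymbol{A}\circ\boldsymbol{E})^\top\, d\boldsymbol{E}\right],
\qquad d\boldsymbol{E} = -\,d\boldsymbol{W}^\top \boldsymbol{X} \quad\text{(or } -\boldsymbol{W}^\top d\boldsymbol{X}\text{)},\\[4pt]
\frac{\partial f}{\partial \boldsymbol{W}}
&= -2\,\boldsymbol{X}\,(\boldsymbol{A}\circ\boldsymbol{A}\circ\boldsymbol{E})^\top,
\qquad
\frac{\partial f}{\partial \boldsymbol{X}}
= -2\,\boldsymbol{W}\,(\boldsymbol{A}\circ\boldsymbol{A}\circ\boldsymbol{E}).
\end{aligned}
```

The Hadamard product passes through the trace via $\operatorname{tr}[(\boldsymbol{A}\circ\boldsymbol{B})^\top\boldsymbol{C}] = \operatorname{tr}[\boldsymbol{B}^\top(\boldsymbol{A}\circ\boldsymbol{C})]$, which is what turns the masked residual into $\boldsymbol{A}\circ\boldsymbol{A}\circ\boldsymbol{E}$.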