Research Notes and Blog Posts

Short notes and write-ups I publish as I explore ideas — mostly on Bayesian methods, time-series modeling, and traffic behavior.

13 posts · 27 topics

2025

27 Feb

Introduction to Autoregressive (AR) Processes

Autoregressive (AR) processes are a class of time series models used to describe a variable that is correlated with its past values. AR models are widely applied in various fields such as economics, engineering,...

autoregressive processes · time series

February 27, 2025 · 6 minute read
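As a quick companion to this excerpt, here is a minimal sketch of simulating an AR(1) process and recovering its defining property — the lag-1 autocorrelation equals the coefficient \(\phi\). The function name and parameter values are illustrative, not from the post.

```python
import numpy as np

def simulate_ar1(phi, sigma, n, seed=0):
    """Simulate an AR(1) process x_t = phi * x_{t-1} + eps_t, eps_t ~ N(0, sigma^2)."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)
    return x

x = simulate_ar1(phi=0.8, sigma=1.0, n=50_000)
# For a stationary AR(1), the lag-1 sample autocorrelation should be close to phi.
lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]
```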

25 Jan

Fundamental Probabilistic Graphical Models: Tail-to-Tail, Head-to-Tail, and Head-to-Head

Probabilistic graphical models (PGMs) provide a compact, visual language for reasoning about joint distributions over many random variables. In directed acyclic graphs (DAGs), three elementary three-node structures — tail-to-tail, head-to-tail, and head-to-head — serve...

PGM · conditional independence · d-separation

January 25, 2025 · 6 minute read
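A hedged numeric sketch of the head-to-head (collider) structure mentioned in this excerpt: with \(x \to z \leftarrow y\), the parents are marginally independent, but conditioning on the collider induces dependence ("explaining away"). The data-generating choices below are my own illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
# Head-to-head (collider): x -> z <- y, with x and y generated independently.
x = rng.normal(size=n)
y = rng.normal(size=n)
z = x + y + 0.1 * rng.normal(size=n)

# Marginally, x and y are (near) uncorrelated ...
marginal_corr = np.corrcoef(x, y)[0, 1]
# ... but restricting to a thin slice of z (crude conditioning) makes them
# strongly negatively correlated, since z ~ x + y forces y ~ z - x.
mask = np.abs(z) < 0.1
conditional_corr = np.corrcoef(x[mask], y[mask])[0, 1]
```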

2024

22 Dec

Gaussian Processes (GP) for Time Series Forecasting

Time-series forecasting is a critical application of Gaussian Processes (GPs), as they offer a flexible and probabilistic framework for predicting future values in sequential data. GPs not only provide point predictions but also quantify...

Gaussian processes · time series

December 22, 2024 · 5 minute read
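A minimal from-scratch sketch of the GP regression machinery this excerpt describes: posterior mean for point predictions and posterior variance for uncertainty, which grows when extrapolating beyond the training range. Kernel choice, length scale, and the toy \(\sin\) data are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
    """Squared-exponential kernel k(a, b) = variance * exp(-(a-b)^2 / (2 l^2))."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean and pointwise variance of a zero-mean GP at x_test."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_test, x_train)
    K_ss = rbf_kernel(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)
    mean = K_s @ alpha
    cov = K_ss - K_s @ np.linalg.solve(K, K_s.T)
    return mean, np.diag(cov)

x_train = np.linspace(0.0, 6.0, 30)
y_train = np.sin(x_train)
x_test = np.array([1.5, 3.0, 7.5])   # 7.5 lies outside the training range
mean, var = gp_posterior(x_train, y_train, x_test)
```

Note how `var` quantifies the uncertainty the excerpt mentions: it is small inside the training range and larger at the extrapolation point.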

20 Dec

Connections among autoregressive (AR) processes, Cochrane-Orcutt correction, Ornstein-Uhlenbeck (OU) processes, and Gaussian Processes (GP)

In this post, we’ll explore four important concepts in time series modeling and stochastic processes: Autoregressive processes, Cochrane-Orcutt correction, Ornstein-Uhlenbeck (OU) processes, and Gaussian processes (GPs). After explaining each concept, we will also examine...

autocorrelation · autoregressive processes · Cochrane-Orcutt correction · Gaussian processes · Ornstein-Uhlenbeck processes

December 20, 2024 · 8 minute read
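One concrete connection among these topics can be sketched numerically: an OU process observed at spacing \(\Delta t\) is an AR(1) with \(\phi = e^{-\Delta t/\ell}\), so the exponential (OU) GP kernel and the stationary AR(1) autocovariance coincide at the sample lags. The specific parameter values below are illustrative.

```python
import numpy as np

# An OU / exponential-kernel GP sampled at spacing dt behaves as an AR(1)
# with phi = exp(-dt / ell); both give autocovariance sigma_f^2 * phi**h.
ell, dt, sigma_f = 2.0, 0.5, 1.3
phi = np.exp(-dt / ell)

lags = np.arange(10)
ou_cov = sigma_f**2 * np.exp(-lags * dt / ell)   # OU kernel at lag h * dt
# Stationary AR(1) autocovariance with matching innovation variance
# sigma_eps^2 = sigma_f^2 * (1 - phi^2):
sigma_eps2 = sigma_f**2 * (1 - phi**2)
ar_cov = sigma_eps2 * phi**lags / (1 - phi**2)
```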

18 Dec

Modeling Autocorrelation: FFT vs Gaussian Processes

Autocorrelation is a key property of time series data, describing the dependency of a variable on its past values. Both the Fourier Transform (FT) and Gaussian Processes (GP) can model autocorrelation, but they operate...

autocorrelation · forecasting · time series · Gaussian processes · Fourier transform

December 18, 2024 · 4 minute read
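The FFT side of this comparison can be sketched in a few lines via the Wiener-Khinchin theorem: the autocovariance is the inverse FFT of the power spectrum, which matches the direct \(O(n^2)\) computation. Function names and the zero-padding convention are my own choices, not from the post.

```python
import numpy as np

def autocorr_fft(x):
    """Autocovariance via Wiener-Khinchin: inverse FFT of the power spectrum."""
    n = len(x)
    x = x - x.mean()
    f = np.fft.fft(x, 2 * n)                      # zero-pad to avoid circular wrap-around
    return np.fft.ifft(f * np.conj(f)).real[:n] / n

def autocorr_direct(x):
    """Direct O(n^2) autocovariance for comparison."""
    n = len(x)
    x = x - x.mean()
    return np.array([np.dot(x[: n - k], x[k:]) / n for k in range(n)])

rng = np.random.default_rng(1)
x = rng.normal(size=512)
```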

17 Dec

From Ordinary Least Squares (OLS) to Generalized Least Squares (GLS)

Ordinary Least Squares (OLS) is one of the most widely used methods for linear regression. It provides unbiased estimates of the model parameters under the assumption that the error terms are independent and identically...

OLS · GLS · regression · autocorrelation · heteroskedasticity

December 17, 2024 · 4 minute read
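A minimal sketch of the OLS-to-GLS step this excerpt builds toward: when errors are correlated, GLS whitens the model with the Cholesky factor of the error covariance and runs OLS on the transformed data. The AR(1) error structure and all parameter values below are illustrative assumptions.

```python
import numpy as np

def gls(X, y, Sigma):
    """GLS estimate beta = (X' Sigma^-1 X)^-1 X' Sigma^-1 y,
    computed by whitening with the Cholesky factor of Sigma."""
    L = np.linalg.cholesky(Sigma)
    Xw = np.linalg.solve(L, X)    # whiten: L^-1 X
    yw = np.linalg.solve(L, y)
    beta, *_ = np.linalg.lstsq(Xw, yw, rcond=None)
    return beta

# Simulate a regression with AR(1)-correlated errors: Sigma_ij = rho^|i-j|.
rng = np.random.default_rng(0)
n, rho = 200, 0.7
idx = np.arange(n)
Sigma = rho ** np.abs(idx[:, None] - idx[None, :])
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
eps = np.linalg.cholesky(Sigma) @ rng.normal(size=n)
y = X @ beta_true + eps
beta_gls = gls(X, y, Sigma)
```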

17 Dec

Proof: unbiasedness of ordinary least squares (OLS)

Consider the linear regression model: \(\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon},\) where: \(\mathbf{y}\) is an \(n \times 1\) vector of observations. \(\mathbf{X}\) is an \(n \times p\) design matrix (full column rank). \(\boldsymbol{\beta}\) is a...

OLS · regression · proof

December 17, 2024 · less than 1 minute read
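From the model stated in the excerpt, the unbiasedness argument is one line, assuming \(\mathbf{X}\) is fixed and \(\mathbb{E}[\boldsymbol{\varepsilon}] = \mathbf{0}\):

```latex
\hat{\boldsymbol{\beta}}
  = (\mathbf{X}^\top\mathbf{X})^{-1}\mathbf{X}^\top \mathbf{y}
  = (\mathbf{X}^\top\mathbf{X})^{-1}\mathbf{X}^\top
      (\mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon})
  = \boldsymbol{\beta}
  + (\mathbf{X}^\top\mathbf{X})^{-1}\mathbf{X}^\top \boldsymbol{\varepsilon},
\qquad
\mathbb{E}[\hat{\boldsymbol{\beta}}]
  = \boldsymbol{\beta}
  + (\mathbf{X}^\top\mathbf{X})^{-1}\mathbf{X}^\top
      \mathbb{E}[\boldsymbol{\varepsilon}]
  = \boldsymbol{\beta}.
```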

27 Nov

Random Effects and Hierarchical Models in Driving Behaviors Modeling

In many driving behavior studies, we model how a following vehicle responds to the movement of a lead vehicle. For example, the Intelligent Driver Model (IDM) uses a set of parameters \(\boldsymbol{\theta} = (v_0,...

hierarchical model · tricks · random effects

November 27, 2024 · 4 minute read

2022

01 Oct

Matrix derivative of Frobenius norm involving Hadamard product

Problem: Solve $\frac{\partial\left\|\boldsymbol{A}\circ(\boldsymbol{Y}-\boldsymbol{W}^\top\boldsymbol{X})\right\|_{F}^{2}}{\partial\boldsymbol{W}}$ and $\frac{\partial\left\|\boldsymbol{A}\circ(\boldsymbol{Y}-\boldsymbol{W}^\top\boldsymbol{X})\right\|_{F}^{2}}{\partial\boldsymbol{X}}$, where $\circ$ denotes the Hadamard product, and all variables are matrices.

matrix derivative · tricks · Hadamard product · Frobenius norm

October 01, 2022 · less than 1 minute read
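A hedged sketch of this problem: working through the Frobenius inner-product identity gives the candidate closed forms \(\partial f/\partial\boldsymbol{W} = -2\boldsymbol{X}(\boldsymbol{A}\circ\boldsymbol{A}\circ\boldsymbol{E})^\top\) and \(\partial f/\partial\boldsymbol{X} = -2\boldsymbol{W}(\boldsymbol{A}\circ\boldsymbol{A}\circ\boldsymbol{E})\) with \(\boldsymbol{E} = \boldsymbol{Y}-\boldsymbol{W}^\top\boldsymbol{X}\); these are my own derivation (the post's solution may be written differently), checked here against central finite differences.

```python
import numpy as np

rng = np.random.default_rng(0)
p, m, n = 4, 3, 5
A = rng.normal(size=(m, n))
Y = rng.normal(size=(m, n))
W = rng.normal(size=(p, m))
X = rng.normal(size=(p, n))

def f(W, X):
    """f = || A o (Y - W^T X) ||_F^2, with o the Hadamard product."""
    return np.sum((A * (Y - W.T @ X)) ** 2)

# Candidate closed forms from the trace / Frobenius inner-product identity:
E = Y - W.T @ X
grad_W = -2.0 * X @ (A * A * E).T     # shape (p, m), matches W
grad_X = -2.0 * W @ (A * A * E)       # shape (p, n), matches X

def numeric_grad(fun, M, h=1e-6):
    """Central finite differences, one entry at a time."""
    G = np.zeros_like(M)
    for idx in np.ndindex(M.shape):
        Mp, Mm = M.copy(), M.copy()
        Mp[idx] += h
        Mm[idx] -= h
        G[idx] = (fun(Mp) - fun(Mm)) / (2 * h)
    return G

num_W = numeric_grad(lambda Wv: f(Wv, X), W)
num_X = numeric_grad(lambda Xv: f(W, Xv), X)
```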