Dr. Chengyuan Zhang — Ph.D., McGill University. Research on Trustworthy AI/ML, Bayesian inference, and stochastic driver models for autonomous driving.
This post is part of a series accompanying our recent work. It collects all three parts on one page so readers can follow the full narrative, from basic HMMs to a factorial model and finally to an interpretable driving framework.
Autoregressive (AR) processes are a class of time-series models describing a variable that is correlated with its own past values. AR models are widely applied in fields such as economics, engineering, finance, and traffic modeling. In this post, we introduce the concept of AR processes, their mathematical formulation, and the main types of AR models used in time-series forecasting.
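As a minimal sketch of the idea, the snippet below simulates an AR(1) process \(x_t = \phi\, x_{t-1} + \varepsilon_t\) with NumPy; the coefficient and noise scale are illustrative choices, not values from the post.

```python
import numpy as np

def simulate_ar1(phi, sigma, n, seed=0):
    """Simulate an AR(1) process: x_t = phi * x_{t-1} + eps_t, eps_t ~ N(0, sigma^2)."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)
    return x

x = simulate_ar1(phi=0.8, sigma=1.0, n=5000)
# For a stationary AR(1), the lag-1 sample autocorrelation estimates phi.
lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]
```

With a long enough series, `lag1` recovers the true coefficient `phi` to within sampling error.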
Probabilistic graphical models (PGMs) provide a compact, visual language for reasoning about joint distributions over many random variables. In directed acyclic graphs (DAGs), three elementary three-node structures — tail-to-tail, head-to-tail, and head-to-head — serve as the building blocks that determine when two variables are (conditionally) independent. Understanding these three patterns is the key to reading any larger Bayesian network.
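The head-to-head (collider) case is the least intuitive of the three: two marginally independent causes become dependent once their common effect is observed ("explaining away"). A small enumeration demo with a toy network (A and B independent coin flips, C = A OR B; these variables are illustrative, not from the post) makes this concrete:

```python
import numpy as np

# Head-to-head (collider) structure: A -> C <- B, with C = A OR B.
p_a, p_b = 0.5, 0.5
joint = np.zeros((2, 2, 2))  # joint[a, b, c]
for a in (0, 1):
    for b in (0, 1):
        c = a | b
        joint[a, b, c] = (p_a if a else 1 - p_a) * (p_b if b else 1 - p_b)

# Marginally, A and B are independent: P(A=1, B=1) = P(A=1) * P(B=1).
p_ab = joint.sum(axis=2)
marg_indep = np.isclose(p_ab[1, 1], p_ab[1, :].sum() * p_ab[:, 1].sum())

# Conditioned on C=1, they become dependent (explaining away).
p_given_c1 = joint[:, :, 1] / joint[:, :, 1].sum()
p_a1_c1 = p_given_c1[1, :].sum()
p_b1_c1 = p_given_c1[:, 1].sum()
cond_dep = not np.isclose(p_given_c1[1, 1], p_a1_c1 * p_b1_c1)
```

Here `marg_indep` is true and `cond_dep` is also true: observing the effect couples the two causes, the opposite behavior of the tail-to-tail and head-to-tail patterns.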
In autonomous driving, modeling and understanding the interactions between the ego vehicle and its surrounding vehicles is crucial for safe and efficient navigation. One important challenge is dealing with varying traffic densities and dynamic environments, where the number of surrounding vehicles can fluctuate. Traditional models, which rely on a fixed number of surrounding vehicles, may struggle in such conditions.
Time-series forecasting is a critical application of Gaussian Processes (GPs), as they offer a flexible and probabilistic framework for predicting future values in sequential data. GPs not only provide point predictions but also quantify uncertainty, making them particularly useful in scenarios where confidence in predictions is important.
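The core of GP regression is the posterior mean and variance at test points given noisy observations. Below is a self-contained NumPy sketch with a squared-exponential kernel; the kernel hyperparameters and the sine toy data are illustrative assumptions, not fitted values.

```python
import numpy as np

def rbf(x1, x2, length=1.0, var=1.0):
    """Squared-exponential kernel k(x, x') = var * exp(-(x - x')^2 / (2 * length^2))."""
    d = x1[:, None] - x2[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-2):
    """GP posterior mean and standard deviation at x_test (Cholesky-based)."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_test)
    Kss = rbf(x_test, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss) - np.sum(v ** 2, axis=0)
    return mean, np.sqrt(np.maximum(var, 0.0))

x_train = np.linspace(0, 2 * np.pi, 20)
y_train = np.sin(x_train)
x_test = np.array([np.pi / 2, np.pi])
mean, std = gp_predict(x_train, y_train, x_test)
```

The returned `std` is what makes GPs attractive for forecasting: predictions far from the training inputs automatically come with wider uncertainty bands.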
In this post, we’ll explore four important concepts in time series modeling and stochastic processes: autoregressive processes, the Cochrane-Orcutt correction, Ornstein-Uhlenbeck (OU) processes, and Gaussian processes (GPs). After explaining each concept, we examine their connections and differences. Finally, we survey some literature applying these models to driving behavior (car-following) modeling.
Autocorrelation is a key property of time series data, describing the dependency of a variable on its past values. Both the Fourier Transform (FT) and Gaussian Processes (GPs) can model autocorrelation, but they operate in fundamentally different domains: the FT in the frequency domain and GPs in the time domain. Despite their differences, the two methods are mathematically connected through the spectral representation theorem. This post explores the core concepts, their mathematical underpinnings, and practical differences.
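The frequency/time-domain connection can be checked numerically: by the Wiener-Khinchin relation, the autocovariance is the inverse Fourier transform of the power spectrum. The sketch below compares a direct lag-by-lag estimate against the FFT route on synthetic white noise (the data and estimator conventions are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1024
x = rng.normal(size=n)
x = x - x.mean()

# Direct (biased) autocovariance estimate for lags 0..n-1.
acov_direct = np.array([np.sum(x[: n - k] * x[k:]) / n for k in range(n)])

# Wiener-Khinchin route: inverse FFT of the power spectrum.
# Zero-padding to 2n avoids circular wrap-around, giving linear correlation.
X = np.fft.fft(x, 2 * n)
acov_fft = np.fft.ifft(np.abs(X) ** 2).real[:n] / n
```

Both estimates agree to machine precision; the FFT version is O(n log n) instead of O(n²), which is why spectral methods dominate for long series.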
Consider the linear regression model: \(\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon},\) where:
Ordinary Least Squares (OLS) is one of the most widely used methods for linear regression. It provides unbiased estimates of the model parameters under the assumption that the error terms are independent and identically distributed (i.i.d.) with constant variance. However, real-world data often violate these assumptions. When the errors exhibit heteroskedasticity (non-constant variance) or correlation, OLS estimates remain unbiased (see this post) but lose efficiency, leading to incorrect standard errors and confidence intervals.
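One standard remedy for the standard-error problem is the heteroskedasticity-robust (HC0 "sandwich") covariance estimator. A minimal NumPy sketch on synthetic data whose noise variance grows with the regressor (the data-generating process is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
x = rng.uniform(0, 2, n)
X = np.column_stack([np.ones(n), x])
# Heteroskedastic noise: variance grows with x, violating the i.i.d. assumption.
eps = rng.normal(scale=0.5 + x)
y = 1.0 + 2.0 * x + eps

# The OLS point estimate is still unbiased...
beta = np.linalg.solve(X.T @ X, X.T @ y)

# ...but classical standard errors are wrong; use the HC0 sandwich estimator:
# (X'X)^{-1} X' diag(e_i^2) X (X'X)^{-1}
resid = y - X @ beta
XtX_inv = np.linalg.inv(X.T @ X)
meat = X.T @ (X * resid[:, None] ** 2)
cov_robust = XtX_inv @ meat @ XtX_inv
se_robust = np.sqrt(np.diag(cov_robust))
```

The slope estimate lands near the true value of 2, while `se_robust` gives inference that remains valid under the non-constant error variance.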
In many driving behavior studies, we model how a following vehicle responds to the movement of a lead vehicle. For example, the Intelligent Driver Model (IDM) uses a set of parameters \(\boldsymbol{\theta} = (v_0, T, a_{\text{max}}, b, s_0)\) to describe a driver’s response in terms of desired speed, time headway, maximum acceleration, comfortable deceleration, and minimal spacing. A critical challenge, however, is that not all drivers behave the same way. Some maintain larger headways, others brake more aggressively, and still others prefer smoother accelerations.
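For reference, the standard IDM acceleration law can be sketched in a few lines; the parameter values below are common illustrative choices, not fitted driver parameters from the post.

```python
import numpy as np

def idm_acceleration(v, dv, s, v0=30.0, T=1.5, a_max=1.4, b=2.0, s0=2.0, delta=4):
    """Intelligent Driver Model acceleration.

    v  : following-vehicle speed [m/s]
    dv : approach rate v - v_lead [m/s]
    s  : gap to the lead vehicle [m]
    Parameters (v0, T, a_max, b, s0) match the theta in the text;
    the defaults here are illustrative, not calibrated values.
    """
    # Desired dynamic gap s*: standstill spacing + headway term + braking term.
    s_star = s0 + max(0.0, v * T + v * dv / (2.0 * np.sqrt(a_max * b)))
    return a_max * (1.0 - (v / v0) ** delta - (s_star / s) ** 2)

# Free driving far from any leader: accelerate toward the desired speed.
a_free = idm_acceleration(v=20.0, dv=0.0, s=1e6)
# Closing in fast on a nearby leader: brake.
a_brake = idm_acceleration(v=20.0, dv=5.0, s=15.0)
```

The two calls show the model's qualitative behavior: positive acceleration in free flow below the desired speed, strong deceleration when approaching a close leader.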
Hierarchical models are powerful tools in statistical modeling and machine learning, enabling us to represent data with complex dependency structures. These models are particularly useful in contexts where data is naturally grouped or exhibits multi-level variability. A critical aspect of hierarchical models lies in their hyperparameters, which control the relationships between different levels of the model.
The log-sum-exp trick is a critical technique in numerical computations involving logarithms and exponentials. It is widely used in machine learning, especially in algorithms like the forward-backward procedure in Hidden Markov Models (HMMs). In this post, we will cover:
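The trick itself fits in a few lines: shift by the maximum before exponentiating so that no term overflows, then add the shift back after the log. A minimal sketch:

```python
import numpy as np

def logsumexp(a):
    """Compute log(sum(exp(a))) stably by shifting by the maximum."""
    a = np.asarray(a, dtype=float)
    m = np.max(a)
    return m + np.log(np.sum(np.exp(a - m)))

a = np.array([1000.0, 1000.0])
# Naive evaluation overflows to inf; the shifted version does not.
with np.errstate(over="ignore"):
    naive = np.log(np.sum(np.exp(a)))   # inf
stable = logsumexp(a)                    # 1000 + log(2)
```

In an HMM forward pass, this is applied at every time step so that the log-likelihood stays finite even when the raw probabilities underflow.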
Problem: Solve $\frac{\partial\left\|\boldsymbol{A}\circ(\boldsymbol{Y}-\boldsymbol{W}^\top\boldsymbol{X})\right\|_{F}^{2}}{\partial\boldsymbol{W}}$ and $\frac{\partial\left\|\boldsymbol{A}\circ(\boldsymbol{Y}-\boldsymbol{W}^\top\boldsymbol{X})\right\|_{F}^{2}}{\partial\boldsymbol{X}}$, where $\circ$ denotes the Hadamard product, $\|\cdot\|_F$ the Frobenius norm, and all variables are matrices.
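Writing $\boldsymbol{R}=\boldsymbol{Y}-\boldsymbol{W}^\top\boldsymbol{X}$ and $\boldsymbol{G}=\boldsymbol{A}\circ\boldsymbol{A}\circ\boldsymbol{R}$, elementwise differentiation gives $\partial f/\partial\boldsymbol{W}=-2\,\boldsymbol{X}\boldsymbol{G}^\top$ and $\partial f/\partial\boldsymbol{X}=-2\,\boldsymbol{W}\boldsymbol{G}$ (my derivation, not quoted from the post). A central-difference check of these closed forms:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, p = 4, 5, 3  # W: (m, p), X: (m, n), Y and A: (p, n)
A = rng.normal(size=(p, n))
Y = rng.normal(size=(p, n))
W = rng.normal(size=(m, p))
X = rng.normal(size=(m, n))

def f(W, X):
    R = A * (Y - W.T @ X)          # A ∘ (Y - WᵀX)
    return np.sum(R ** 2)          # squared Frobenius norm

# Closed-form gradients with G = A ∘ A ∘ (Y - WᵀX).
G = A * A * (Y - W.T @ X)
grad_W = -2.0 * X @ G.T            # shape (m, p), matches W
grad_X = -2.0 * W @ G              # shape (m, n), matches X

# Central finite differences on one entry of each matrix.
eps = 1e-6
Wp, Wm = W.copy(), W.copy(); Wp[0, 0] += eps; Wm[0, 0] -= eps
Xp, Xm = X.copy(), X.copy(); Xp[0, 0] += eps; Xm[0, 0] -= eps
fd_W = (f(Wp, X) - f(Wm, X)) / (2 * eps)
fd_X = (f(W, Xp) - f(W, Xm)) / (2 * eps)
```

The finite differences match the closed-form entries, confirming the shapes and signs of the two gradients.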