Brownian motion
Brownian motion is defined as a stochastic process $(B_t)_{t \ge 0}$ with $B_0 = 0$ which is Gaussian, i.e. such that for any $0 \le t_1 < t_2 < \cdots < t_n$, the values $(B_{t_1}, \dots, B_{t_n})$ have a jointly Gaussian (normal) distribution, $E B_t = 0$ for all $t \ge 0$, and the covariance is $E[B_s B_t] = \min(s, t)$ for all $s, t \ge 0$. A normal distribution is uniquely determined by its mean vector and covariance matrix. Each $B_t$ has a normal distribution $N(0, t)$.
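As a quick numerical illustration (a sketch, not part of the original notes; the grid, path count, and seed are arbitrary choices), one can simulate Brownian paths by summing independent Gaussian increments and check that $B_t \sim N(0, t)$:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_bm(t_grid, n_paths, rng):
    """Sample Brownian paths at the times in t_grid (t_grid[0] == 0)."""
    dt = np.diff(t_grid)
    # independent Gaussian increments with variance dt
    incs = rng.normal(0.0, np.sqrt(dt), size=(n_paths, len(dt)))
    return np.concatenate(
        [np.zeros((n_paths, 1)), np.cumsum(incs, axis=1)], axis=1
    )

t_grid = np.linspace(0.0, 2.0, 201)
paths = simulate_bm(t_grid, 20000, rng)
mean_end = paths[:, -1].mean()   # should be near E[B_2] = 0
var_end = paths[:, -1].var()     # should be near Var(B_2) = 2
```

With 20,000 paths the sample mean and variance at $t = 2$ land close to the theoretical values $0$ and $2$.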


It follows from the definition that Brownian motion has independent increments; in other words, whenever $0 \le s < t \le u < v$, the increments $B_t - B_s$ and $B_v - B_u$ are independent. (Their covariance is $\min(t, v) - \min(t, u) - \min(s, v) + \min(s, u) = t - t - s + s = 0$, and jointly Gaussian uncorrelated variables are independent.)

A consequence of the definition is that for any $c > 0$, the process $\left(c^{-1/2} B_{ct}\right)_{t \ge 0}$ is also a Brownian motion: it is Gaussian with mean zero, and $E\left[c^{-1/2} B_{cs} \cdot c^{-1/2} B_{ct}\right] = c^{-1} \min(cs, ct) = \min(s, t)$.

By a theorem of Norbert Wiener, Brownian motion can be chosen such that with probability 1 (or, for all sample points $\omega$), $B_t$ is continuous as a function of $t$. A Brownian motion with this property is called the standard Brownian motion.


A Markov time $\tau$ for Brownian motion is defined as a random variable with values in $[0, \infty]$ such that for each $t \ge 0$, the event $\{\tau \le t\}$ is a function of $B_s$ for $s \le t$.
The strong Markov property of Brownian motion says that if $\tau$ is a Markov time such that $\tau < \infty$ with probability 1, then conditional on $\tau$, the process $B_{\tau + t} - B_\tau$ for $t \ge 0$ is itself a Brownian motion and is independent of $B_s$ for $s \le \tau$.
Using the strong Markov property one can prove the reflection principle for Brownian motion: for any $b > 0$ and $t > 0$, the probability that $B_s \ge b$ for some $s$ with $0 \le s \le t$ equals $2 \Pr(B_t \ge b) = 2\left(1 - \Phi\!\left(b/\sqrt{t}\right)\right)$.
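A minimal Monte Carlo sanity check of the reflection principle (a sketch; the time grid, path count, and seed are arbitrary, and the discrete-time maximum slightly undershoots the continuous one):

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)
n_paths, n_steps, t, b = 40000, 2000, 1.0, 1.0
dt = t / n_steps

pos = np.zeros(n_paths)          # current value of each path
running_max = np.zeros(n_paths)  # running maximum over the grid
for _ in range(n_steps):
    pos += rng.normal(0.0, sqrt(dt), size=n_paths)
    np.maximum(running_max, pos, out=running_max)

mc_prob = (running_max >= b).mean()        # slightly low: the grid misses excursions
Phi = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))
exact = 2.0 * (1.0 - Phi(b / sqrt(t)))     # = 0.3173... for b = t = 1
```

For $b = t = 1$ the exact value is $2(1 - \Phi(1)) \approx 0.317$, and the simulated frequency comes out within a few hundredths of it.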

What may be called “Wald’s identity” for Brownian motion says that if $\tau$ is a Markov time with $E\tau < \infty$, and so $\tau < \infty$ with probability 1, then $E B_\tau = 0$ and $E B_\tau^2 = E\tau$.

The Brownian bridge is a Gaussian process $(Y_t)$ defined for $0 \le t \le 1$ with $E Y_t = 0$ and covariance $E[Y_s Y_t] = \min(s, t) - st$ for $0 \le s, t \le 1$.

[Figure: a simulated Brownian bridge path. The path starts and ends at 0 (a.s.). The dashed curves plot the $\pm 2\sqrt{t(1-t)}$ band (two standard deviations), and the dotted line marks the zero mean. Note that $Y_0 = Y_1 = 0$ a.s., and the fluctuations are largest near $t = 1/2$, where the variance $t(1-t)$ is maximal.]
Relations between the Brownian motion and Brownian bridge:
- If $(B_t)$ is a Brownian motion then $Y_t = B_t - t B_1$ for $0 \le t \le 1$ is a Brownian bridge, which is independent of $B_1$. This shows that $Y_t$ also can (and will) be taken to be continuous as a function of $t$.
- Conversely, if $(Y_t)$ is a Brownian bridge and $Z$ is an $N(0, 1)$ variable independent of $(Y_t)$ then $B_t = Y_t + tZ$ has the distribution of Brownian motion for $0 \le t \le 1$.
- The conditional distribution of $(B_t)_{0 \le t \le 1}$ given that $|B_1| \le \epsilon$ converges to that of the Brownian bridge as $\epsilon \downarrow 0$.
- If $(Y_t)$ is a Brownian bridge and $B_t = (1 + t)\, Y_{t/(1+t)}$ then $(B_t)_{t \ge 0}$ is a Brownian motion.
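The first relation above is easy to check numerically. The following sketch (grid, sample size, and the test times $s = 0.3$, $t = 0.7$ are arbitrary choices) builds $Y_t = B_t - t B_1$ from simulated Brownian paths and compares the empirical covariance with $\min(s,t) - st$:

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, n_steps = 50000, 100
t_grid = np.linspace(0.0, 1.0, n_steps + 1)

incs = rng.normal(0.0, np.sqrt(1.0 / n_steps), size=(n_paths, n_steps))
B = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(incs, axis=1)], axis=1)
Y = B - t_grid * B[:, -1:]        # Y_t = B_t - t * B_1; Y[:, 0] = Y[:, -1] = 0

s_idx, t_idx = 30, 70             # s = 0.3, t = 0.7 on this grid
emp_cov = (Y[:, s_idx] * Y[:, t_idx]).mean()   # Y has mean zero
exact_cov = min(0.3, 0.7) - 0.3 * 0.7          # = 0.09
```

The paths pin both endpoints to 0 exactly, and the empirical covariance lands close to $0.09$.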

Brownian motion (BM) — where it shows up
- Physics & PDEs: Diffusion of particles; probabilistic solution of the heat equation; harmonic measure & potential theory.
- Finance: Geometric BM for asset prices; correlated BMs in stochastic volatility (e.g., Heston); first-passage ideas for default/credit models; Monte Carlo pricing.
- Statistics & sequential analysis: Likelihood-ratio tests/SPRT; drift-diffusion models for decision making; estimation of diffusion parameters; functional CLT (random walk ⇒ BM).
- Machine learning & MCMC: Langevin Monte Carlo; score-based/diffusion generative models (continuous-time SDE viewpoint).
- Biology & neuroscience: Wright–Fisher diffusion (genetic drift); membrane-potential and evidence-accumulation models (first-passage times).
- Queues, control, and signals: Heavy-traffic limits (Reflected BM); Kalman–Bucy filtering (continuous time); integrated white noise as a model for $1/f^2$-like signals.
Brownian bridge (BB) — why it’s special
A BB is BM on $[0, 1]$ conditioned to be 0 at $t = 1$: $Y_t = B_t - t B_1$.
- Goodness-of-fit tests: The exact null distributions of Kolmogorov–Smirnov, Cramér–von Mises, and Anderson–Darling statistics are functionals of a BB.
- Empirical process theory: Donsker’s theorem ⇒ centered empirical CDF processes converge to a BB; yields asymptotic CIs/bands for distribution functions and QQ-plots.
- Conditioned path sampling: “Diffusion bridges” in Bayesian inference, particle filters, and SDE calibration when you know endpoints (e.g., smoothing between observations).
- Finance (variance reduction): Brownian-bridge corrections in Monte Carlo for barrier options: between time steps, condition on endpoints to estimate barrier crossings and cut bias/variance.
- Boundary crossing & change-point: Many suprema/infima of centered cumulative sums have BB limits, giving p-values and thresholds for change-point and scan statistics.
- Random geometry/combinatorics: BB/bridges underpin Brownian excursions, which connect to random trees (Aldous’ CRT) and scaling limits in random maps.
Let $(Y_t)$ be a Brownian bridge and $b > 0$. Then,
(a) the probability that $Y_t \ge b$ for some $0 \le t \le 1$, which is the same as the probability that $Y_t = b$ for some $0 \le t \le 1$ by sample continuity and the intermediate value theorem, equals $e^{-2b^2}$.

(b) the probability that $|Y_t| \ge b$ for some $0 \le t \le 1$, which is the same as the probability that $|Y_t| = b$ for some $0 \le t \le 1$, equals $2 \sum_{k=1}^{\infty} (-1)^{k-1} e^{-2 k^2 b^2}$. The series converges quite fast except for small $b$.
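Part (a) can also be checked by simulation. This sketch (the value $b = 1$, grid, and chunked sampling are arbitrary implementation choices) samples bridges as $Y_t = B_t - t B_1$ and compares the crossing frequency with $e^{-2b^2}$:

```python
import numpy as np
from math import exp, sqrt

rng = np.random.default_rng(3)
n_paths, n_steps, b = 40000, 2000, 1.0
chunk = 2000
t_grid = np.linspace(0.0, 1.0, n_steps + 1)

hits = 0
for _ in range(n_paths // chunk):
    # sample a chunk of Brownian paths, then turn them into bridges
    incs = rng.normal(0.0, sqrt(1.0 / n_steps), size=(chunk, n_steps))
    B = np.concatenate([np.zeros((chunk, 1)), np.cumsum(incs, axis=1)], axis=1)
    Y = B - t_grid * B[:, -1:]
    hits += int((Y.max(axis=1) >= b).sum())

mc_prob = hits / n_paths
exact = exp(-2.0 * b * b)     # = e^{-2} = 0.1353... for b = 1
```

The discrete grid misses some excursions, so the Monte Carlo frequency sits slightly below $e^{-2} \approx 0.135$, but well within a few hundredths.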

For a Brownian motion $(B_t)$ and two real numbers $a$ and $b$, let $\tau$ be the least time $t$ such that $B_t = a$ or $B_t = b$, if such a $t$ exists, or $\tau = +\infty$ otherwise. Then $\tau$ is a Markov time.
(a) If $a < 0 < b$ then $\Pr(B_\tau = b) = \dfrac{-a}{b - a}$.

(b) If $0 < a < b$ then $\tau < \infty$ and $B_\tau = a$ with probability 1, but $E\tau = \infty$; otherwise we’d get a contradiction to the Wald identity $E B_\tau = 0$.

(c) If $a < 0 < b$ then $E\tau < \infty$ and $E\tau = E B_\tau^2 = -ab$.

(d) If $a < b < 0$ then, symmetrically to (b), $B_\tau = b$ with probability 1.
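Parts (a) and (c) can be verified with a simple random-walk embedding (a sketch, not the method of the notes): a symmetric walk with step $\pm h$ and time step $h^2$ has the same exit probabilities and mean exit time as Brownian motion when $a$ and $b$ are lattice points. The barriers $a = -1$, $b = 2$ and step $h = 0.1$ below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(4)
a, b, h = -1.0, 2.0, 0.1     # a < 0 < b; h divides both |a| and b
dt = h * h                   # time per step, so Var(increment) = dt
n_paths = 20000

pos = np.zeros(n_paths)
n_steps = np.zeros(n_paths)
active = np.ones(n_paths, dtype=bool)
for _ in range(10**6):       # safety cap; all paths exit long before this
    if not active.any():
        break
    k = int(active.sum())
    pos[active] += h * rng.choice((-1.0, 1.0), size=k)
    n_steps[active] += 1
    # half-step tolerance guards against float round-off at the barriers
    active &= (pos > a + h / 2) & (pos < b - h / 2)

p_hit_b = (pos > b - h / 2).mean()   # exact answer: -a/(b-a) = 1/3
mean_tau = (n_steps * dt).mean()     # exact answer: -a*b = 2
```

With these barriers the gambler’s-ruin answers are exactly $1/3$ and $2$, and the simulation reproduces both to within sampling error.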

Martingales
Let $X_1, X_2, \dots$ be a sequence of random variables. Let $\mathcal{F}_n$ be a collection of random variables such that $\mathcal{F}_n \subseteq \mathcal{F}_{n+1}$ for $n = 1, 2, \dots$ and $X_n \in \mathcal{F}_n$ for each $n$. Then, $(X_n, \mathcal{F}_n)$ is called a martingale sequence if whenever $m < n$, $E[X_n \mid \mathcal{F}_m] = X_m$, or a submartingale sequence if $E[X_n \mid \mathcal{F}_m] \ge X_m$, or a supermartingale sequence if $E[X_n \mid \mathcal{F}_m] \le X_m$.



To make the collection $\mathcal{F}_n$ as small as possible one can and often will take $\mathcal{F}_n = \{X_1, \dots, X_n\}$.
A stopping time for a martingale (or sub- or supermartingale) sequence is a positive integer-valued random variable $\tau$ such that for each $n$ the event $\{\tau = n\}$ is a function of the variables in $\mathcal{F}_n$.

Optional stopping. Let $(X_n, \mathcal{F}_n)$ be a martingale sequence for $n = 1, \dots, N$. Let $N < \infty$ and let $\sigma$ and $\tau$ be two stopping times for $(X_n)$ such that $\sigma \le \tau \le N$ with probability 1. Then $E[X_\tau \mid \mathcal{F}_\sigma] = X_\sigma$, and in particular $E X_\tau = E X_\sigma$. For a submartingale or supermartingale, the equality is replaced by $\ge$ or $\le$ respectively.
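A small numeric illustration of optional stopping (a sketch; the barriers $-3$, $5$ and horizon $N = 200$ are arbitrary): partial sums of fair $\pm 1$ steps form a martingale, and $\tau = \min(\text{first hit of } \{-3, 5\},\, N)$ is a bounded stopping time, so $E X_\tau = E X_1 = 0$.

```python
import numpy as np

rng = np.random.default_rng(5)
n_paths, N = 50000, 200
steps = rng.choice((-1, 1), size=(n_paths, N))
X = np.cumsum(steps, axis=1)                  # martingale X_n, n = 1..N

hit = (X <= -3) | (X >= 5)
# index of the first hit, or N-1 if the path never hits within the horizon
first = np.where(hit.any(axis=1), hit.argmax(axis=1), N - 1)
X_tau = X[np.arange(n_paths), first]          # X stopped at min(hit time, N)
mean_stopped = X_tau.mean()                   # should be close to 0
```

Although individual stopped values are mostly $-3$ or $5$, their average is near 0, as optional stopping predicts.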
Doob’s maximal inequality. Let $(X_n, \mathcal{F}_n)$, $n = 1, \dots, N$, be a submartingale sequence, let $b > 0$, and let $A$ be the event $\left\{\max_{1 \le n \le N} X_n \ge b\right\}$. Then
$$b \Pr(A) \le E[X_N \mathbf{1}_A] \le E[X_N^+].$$
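A quick numeric check of the inequality (a sketch; the submartingale $X_n = S_n^2$ for a fair $\pm 1$ walk and the threshold $b = 400$ are illustrative choices, for which $E[X_N^+] = E[X_N] = N$):

```python
import numpy as np

rng = np.random.default_rng(6)
n_paths, N, b = 100000, 100, 400.0

S = np.cumsum(rng.choice((-1, 1), size=(n_paths, N)), axis=1)
X = S.astype(np.float64) ** 2       # X_n = S_n^2 is a submartingale
p_A = (X.max(axis=1) >= b).mean()   # estimate of P(max_n X_n >= b)
lhs = b * p_A
rhs = X[:, -1].mean()               # estimate of E[X_N]; exactly N = 100
```

Here `lhs` comes out well below `rhs`, consistent with the bound $\Pr(A) \le E[X_N^+]/b$.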

Let $X_n \ge 0$ for all $n$ and suppose the $X_n$ form a supermartingale with respect to $(\mathcal{F}_n)$. Let $\sigma \le \tau$ be stopping times for $(\mathcal{F}_n)$. Then
$$E[X_\tau \mid \mathcal{F}_\sigma] \le X_\sigma.$$
Taking expectations of both sides gives
$$E X_\tau \le E X_\sigma.$$
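As a sketch of this last fact (the uniform factors and the deterministic stopping times $\sigma = 5 \le \tau = 20$ are illustrative choices): products $X_n = U_1 \cdots U_n$ of i.i.d. uniform $[0, 1.5]$ variables form a nonnegative supermartingale, since $E[U] = 0.75 < 1$.

```python
import numpy as np

rng = np.random.default_rng(7)
n_paths, N = 200000, 20
U = rng.uniform(0.0, 1.5, size=(n_paths, N))   # E[U] = 0.75 < 1
X = np.cumprod(U, axis=1)                      # X_n >= 0, supermartingale

mean_X5 = X[:, 4].mean()       # E[X_5]  = 0.75**5  ~ 0.237  (sigma = 5)
mean_X20 = X[:, -1].mean()     # E[X_20] = 0.75**20 ~ 0.0032 (tau = 20)
```

The simulated means decrease from $\sigma$ to $\tau$, matching $E X_\tau \le E X_\sigma$.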
This is an integration of notes from Stochastic Processes with AI. Pretty impressive.
