Brownian motion in a box

I decided to run a simple simulation of a one-dimensional random walk on a lattice with closed (reflecting) boundary conditions. To implement the boundary conditions, I forced the particle to "turn back" every time it hit a boundary of the segment.

I took the lattice sites to be the integers in $[-L,L]$, with the boundaries at $\pm L$. This is the result for $L=50$ and $10^8$ steps:
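A minimal sketch of this kind of simulation (assuming the "turn back" rule means a walker sitting on a boundary site is forced to step back toward the interior; the function and parameter names are my own, and I use fewer steps than the plot above):

```python
import random

def reflected_walk_counts(L=50, steps=10**6, seed=1):
    """Count visits to each site of a random walk on {-L, ..., L}
    that is forced to turn back whenever it sits on a boundary site."""
    rng = random.Random(seed)
    counts = [0] * (2 * L + 1)   # counts[i] is the number of visits to site i - L
    x = 0
    for _ in range(steps):
        if x == L:               # at the right wall: forced step left
            x -= 1
        elif x == -L:            # at the left wall: forced step right
            x += 1
        else:                    # free symmetric step
            x += rng.choice((-1, 1))
        counts[x + L] += 1
    return counts

counts = reflected_walk_counts()
```

For this particular "turn back" rule, detailed balance gives a stationary distribution that is flat on the interior sites, with the two boundary sites visited about half as often, which is consistent with the plot.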

[Plot: empirical distribution of the particle's position for $L=50$ and $10^8$ steps]

So, in this simple lattice model the distribution appears to be uniform everywhere except at the very boundaries of the segment, i.e. it is uniform on the open interval $(-L,L)$.

I don't know whether this still holds in a continuous model. The simulation also becomes trickier in the continuous setting, because the particle never exactly hits the boundary, so the result may depend on precisely how the boundary conditions are implemented.

Relevant literature

This process is commonly known as reflected Brownian motion (RBM). Many articles can be found on RBM in unbounded regions (like $[0,\infty)$), but here we are interested in RBM in bounded regions.

On this topic, I found the following relevant article:

The Stationary Distribution of Reflected Brownian Motion in a Planar Region - J. M. Harrison, H. J. Landau, L. A. Shepp

It gives an explicit (and rather complicated) expression for the stationary probability distribution. The article is quite technical, but as far as I understand, the stationary distribution is not uniform.


I suspect there is a natural way to treat the general case, but I will only deal with one choice of boundary condition. Say you are in the one-dimensional box $[-L,L]$ (we will see in a moment that the same argument works in the $n$-dimensional box). Brownian motion has infinitesimal generator equal to $1/2$ times the Laplacian. This is equivalent to saying that if we start our Brownian motion at the point $x$, then for any bounded continuous function $f$, the function $$u_t(x)=E_{x}[f(B_t)]=\int f(y)\, p_t(x,y)\, dy,$$ where $p_t(x,y)$ is the transition density of $B_t$, solves the equation $\partial_t u=\frac{1}{2} \Delta u$ with initial condition $u_0=f$. The subscript $x$ on the expectation means the Brownian motion $B_t$ starts at $x$ at time zero; more generally one can start from a probability distribution $\mu$, but it is simplest to think of a point mass.

We can formally write a solution to the differential equation which $u$ satisfies as $$u_t(x)=e^{\frac{t}{2} \Delta} u_0(x),$$ where we interpret $e^{\frac{t}{2} \Delta}$ as the power series expansion $$e^{\frac{t}{2} \Delta}=\sum_{k=0}^{\infty} \frac{(t \Delta/2)^k}{k!}.$$ We expect this to work because formally differentiating $e^{\frac{t}{2} \Delta}$ in $t$ produces a factor of $\frac{1}{2} \Delta$. If you are rightfully suspicious of this formal differentiation, you can check directly that $$\partial_t \sum_{k=0}^{\infty} \frac{(t \Delta/2)^k u_0}{k!}=\frac{1}{2}\Delta u_t.$$ Dealing with this exponential seems tricky at first sight, so let's diagonalize the Laplacian, keeping in mind that we can exponentiate a diagonal matrix by exponentiating each entry. Let's impose the boundary conditions $\partial_x u(L)=\partial_x u(-L)=0$, i.e. force the derivative to vanish on the boundary (these Neumann conditions are the ones corresponding to reflection). Then the Laplacian has eigenfunctions $$\psi_{2n}(x)=\cos\left(\frac{(2n)\pi}{2 L} x\right)$$ with eigenvalues $$\lambda_{2n}=-\left(\frac{(2n)\pi}{2L}\right)^2,$$ and $$\psi_{2n+1}(x)=\sin\left(\frac{(2n+1) \pi}{2L}x\right)$$ with eigenvalues $$\lambda_{2n+1}=-\left( \frac{(2n+1) \pi}{2L} \right)^2$$ for each $n \geq 0$. Note that every one of these eigenvalues is negative except for that of the constant function $\psi_0(x)$, which is $0$.
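As a sanity check on these formulas (the box size, finite-difference step, test point, and tolerance below are my own choices), one can verify numerically that each $\psi_n$ satisfies $\psi_n''=\lambda_n\psi_n$ in the interior and $\psi_n'(\pm L)\approx 0$:

```python
import math

L = 1.0     # half-width of the box (arbitrary choice)
h = 1e-5    # finite-difference step

def psi(n, x):
    """n-th Neumann eigenfunction on [-L, L]: cosine for even n, sine for odd n."""
    k = n * math.pi / (2 * L)
    return math.cos(k * x) if n % 2 == 0 else math.sin(k * x)

def lam(n):
    """Corresponding eigenvalue of the Laplacian."""
    return -(n * math.pi / (2 * L)) ** 2

for n in range(1, 5):
    x0 = 0.37 * L   # an interior test point
    d2 = (psi(n, x0 + h) - 2 * psi(n, x0) + psi(n, x0 - h)) / h**2
    assert abs(d2 - lam(n) * psi(n, x0)) < 1e-4     # psi'' = lambda * psi
    for b in (L, -L):
        d1 = (psi(n, b + h) - psi(n, b - h)) / (2 * h)
        assert abs(d1) < 1e-4                       # Neumann: psi'(+-L) = 0
```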

Fourier analysis tells us that we can write any reasonable (say, square-integrable) function $u_0(x)$ on the interval $[-L,L]$ as a sum $u_0(x)=\sum_{n \geq 0} a_n \psi_{n}(x)$; in particular this works for any bounded continuous function. Now we have $$u_t(x)=e^{\frac{t}{2} \Delta} \sum_{n \geq 0} a_n \psi_{n}(x)=\sum_{n=0}^{\infty} e^{\frac{t \lambda_n}{2}} a_n \psi_n(x).$$ Because all eigenvalues are negative except for $\lambda_0=0$, as $t \to \infty$ the function $u_t(x)$ approaches the constant $a_0$. In fact all the other terms decay exponentially in $t$, so $u_t(x)$ converges to a constant quite quickly.
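To illustrate this spectral solution numerically (the initial condition, truncation, integration grid, and test points are my own choices), one can compute the coefficients $a_n$ by numerical integration and watch $u_t$ collapse onto the constant $a_0$:

```python
import math

L = 1.0
M = 2000                                   # integration grid resolution
xs = [-L + 2 * L * i / M for i in range(M + 1)]

def psi(n, x):                             # Neumann eigenfunctions, as above
    k = n * math.pi / (2 * L)
    return math.cos(k * x) if n % 2 == 0 else math.sin(k * x)

def inner(f, g):                           # trapezoid rule for the L^2 inner product
    vals = [f(x) * g(x) for x in xs]
    return (2 * L / M) * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

u0 = lambda x: math.exp(-(x - 0.3) ** 2 / 0.02)    # a bump near x = 0.3

N = 40                                     # number of modes kept
norms = [inner(lambda x, n=n: psi(n, x), lambda x, n=n: psi(n, x)) for n in range(N)]
coeffs = [inner(u0, lambda x, n=n: psi(n, x)) / norms[n] for n in range(N)]

def u(t, x):
    """Truncated spectral solution e^{t Delta / 2} u0."""
    return sum(a * math.exp(-(n * math.pi / (2 * L)) ** 2 * t / 2) * psi(n, x)
               for n, a in enumerate(coeffs))

# deviation from the constant mode a_0 shrinks as t grows
dev = lambda t: max(abs(u(t, x) - coeffs[0]) for x in (-0.9, -0.3, 0.0, 0.5, 0.9))
assert dev(5.0) < dev(0.5) < dev(0.05)
```

Here `coeffs[0]` is exactly the spatial average of $u_0$, which is the constant the solution flattens out to.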

Now intuitively we would like to set $f(x)=\delta(x-y)$, so that $$u_t(x)=E_{x}[\delta(B_t-y)]=p_t(x,y)$$ approaches a constant quickly; the constant in question is $a_0=\frac{1}{2L}\int_{-L}^{L}\delta(x-y)\,dx=\frac{1}{2L}$, the uniform density. This would allow us to conclude that the distribution of the Brownian motion approaches the uniform distribution on $[-L,L]$, regardless of its initial distribution $\mu$. Unfortunately a delta function is not bounded and continuous, so we cannot do this directly, but we can approximate the delta function arbitrarily well by bounded continuous functions and thus get arbitrarily good approximations of $p_t(x,y)$, which converge to a constant independent of $x$ and $y$.
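Carrying this limit out term by term gives the standard spectral form of the Neumann heat kernel (I am assuming the normalization $\int_{-L}^{L}\psi_n^2\,dx = L$ for $n\geq 1$, and writing $\lambda_n=-\left(\frac{n\pi}{2L}\right)^2$ for the eigenvalues above):

```latex
% Spectral expansion of the transition density of reflected Brownian
% motion on [-L, L], with \lambda_n = -(n\pi/(2L))^2:
p_t(x, y) \;=\; \frac{1}{2L}
  \;+\; \frac{1}{L} \sum_{n=1}^{\infty} e^{\lambda_n t/2}\,\psi_n(x)\,\psi_n(y)
  \;\xrightarrow[\;t \to \infty\;]{}\; \frac{1}{2L}.
```

The sum over $n\geq 1$ decays like $e^{-\pi^2 t/(8L^2)}$, which quantifies how quickly the density becomes uniform.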

In a higher-dimensional box $\prod_{i=1}^n [-L_i, L_i]$ we can make a similar argument: the eigenfunctions of the $n$-dimensional Laplacian can be written in terms of our one-dimensional eigenfunctions as $\prod_{i=1}^n \psi_{m_i}(x_i)$, with eigenvalues $\sum_{i=1}^n \lambda_{m_i}$ (where $\psi_{m_i}$ and $\lambda_{m_i}$ are built with $L_i$ in place of $L$). The important observation is that the constant function still has eigenvalue $0$ and every other eigenvalue is still negative.
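A quick numerical check of this product structure in two dimensions (the box sizes, mode indices, test point, and tolerance are my own choices):

```python
import math

L1, L2 = 1.0, 1.5    # half-widths of the 2-D box [-L1, L1] x [-L2, L2]
h = 1e-4             # finite-difference step

def psi(n, L, x):    # 1-D Neumann eigenfunctions, as before
    k = n * math.pi / (2 * L)
    return math.cos(k * x) if n % 2 == 0 else math.sin(k * x)

def lam(n, L):
    return -(n * math.pi / (2 * L)) ** 2

def Psi(m, n, x, y):               # product eigenfunction of the 2-D Laplacian
    return psi(m, L1, x) * psi(n, L2, y)

m, n = 2, 3
x0, y0 = 0.31, -0.47               # an interior test point
# discrete Laplacian: sum of the two 1-D second differences
lap = ((Psi(m, n, x0 + h, y0) - 2 * Psi(m, n, x0, y0) + Psi(m, n, x0 - h, y0))
       + (Psi(m, n, x0, y0 + h) - 2 * Psi(m, n, x0, y0) + Psi(m, n, x0, y0 - h))) / h**2
# eigenvalue of the product is the sum of the 1-D eigenvalues
assert abs(lap - (lam(m, L1) + lam(n, L2)) * Psi(m, n, x0, y0)) < 1e-3
```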

The argument for periodic boundary conditions is almost identical, because the eigenfunctions of the Laplacian can still be written in terms of sines and cosines, all of which have negative eigenvalues unless they are constant. I have not thought much about eigenfunctions of the Laplacian with sticky boundary conditions.