Exploring the World of Markov Chain Monte Carlo: An Introduction to MCMC

Markov Chain Monte Carlo (MCMC) is a powerful computational method that allows us to generate samples from complex probability distributions, even when the distributions are difficult to sample from directly. It is used widely in fields such as statistics, physics, and computer science.

The MCMC algorithm works by constructing a Markov chain that has the desired distribution as its stationary distribution. At each step, the algorithm generates a new sample by proposing a new value from a candidate distribution and then accepting or rejecting the proposed value based on an acceptance probability.
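To make the idea of a stationary distribution concrete, here is a minimal sketch (with a made-up two-state transition matrix, not related to the example below) showing that iterating a Markov chain's transition matrix converges to a fixed distribution:

```python
import numpy as np

# A hypothetical 2-state Markov chain; row i gives the probabilities
# of moving from state i to each state.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Starting from any initial distribution, repeatedly applying P
# converges to the stationary distribution pi, which satisfies pi @ P = pi.
dist = np.array([1.0, 0.0])
for _ in range(100):
    dist = dist @ P

print(dist)  # converges to the stationary distribution [5/6, 1/6]
```

MCMC runs this logic in reverse: instead of starting from a transition rule and asking what its stationary distribution is, it constructs a transition rule whose stationary distribution is the target we want to sample from.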

The most common MCMC algorithm is the Metropolis-Hastings algorithm, first introduced in 1953 by Nicholas Metropolis and his collaborators and generalized by W. Keith Hastings in 1970. When the proposal distribution is symmetric, the acceptance probability reduces to the ratio of the target density at the proposed value to the target density at the current value.
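In its general form, the acceptance probability also includes a correction for an asymmetric proposal distribution q. Here is a minimal sketch of a single Metropolis-Hastings step with that correction; the function and parameter names are illustrative, not part of the example below:

```python
import numpy as np

def mh_step(x, target_pdf, propose, proposal_pdf, rng):
    """One Metropolis-Hastings step with the full Hastings correction.

    target_pdf      : unnormalized density of the distribution to sample
    propose(x, rng) : draws a candidate y given the current state x
    proposal_pdf(y, x) : density q(y | x) of the proposal distribution
    """
    y = propose(x, rng)
    # Acceptance probability: min(1, [p(y) q(x|y)] / [p(x) q(y|x)]).
    # The q terms cancel when the proposal is symmetric (plain Metropolis).
    alpha = min(1.0, (target_pdf(y) * proposal_pdf(x, y))
                     / (target_pdf(x) * proposal_pdf(y, x)))
    return y if rng.uniform() < alpha else x
```

With a symmetric proposal such as a normal distribution centered at the current value, `proposal_pdf(x, y) == proposal_pdf(y, x)` and the ratio simplifies to the Metropolis form used in the example below.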

Here is a simple example of the Metropolis-Hastings algorithm implemented in Python:

import numpy as np
import matplotlib.pyplot as plt

# Define the (unnormalized) target distribution: a standard normal density
def target(x):
    return np.exp(-x**2 / 2)

# Define the proposal distribution
def proposal(x, sigma):
    return np.random.normal(x, sigma)

# Set the initial value of x and the standard deviation of the proposal distribution
x = 0
sigma = 0.5

# Generate a large number of samples using the Metropolis-Hastings algorithm
samples = []
for _ in range(10000):
    y = proposal(x, sigma)
    # The proposal is symmetric, so the acceptance probability is just
    # the ratio of target densities, capped at 1
    alpha = min(1, target(y) / target(x))
    if np.random.uniform() < alpha:
        x = y
    samples.append(x)

# Plot the samples
plt.hist(samples, bins=50, density=True)
plt.show()

In this example, the target function defines the target density only up to a normalizing constant; the algorithm uses only ratios of the target, so the constant cancels and can be omitted. The proposal function draws a candidate from a normal distribution centered at the current value of x with standard deviation sigma.

We start from an initial value of x and run the chain for many iterations. At each step, we draw a candidate y from the proposal distribution and compute the acceptance probability alpha. We then draw a uniform random number between 0 and 1: if it is less than alpha, we accept the candidate and set x = y; otherwise we reject it and keep the current value. Note that the current value is appended to the sample list either way, so a rejected step repeats the previous sample.

Finally, we plot the resulting samples as a histogram. The histogram closely matches the shape of the target density, confirming that the chain is sampling from the intended distribution.
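Beyond eyeballing the histogram, one way to sanity-check the chain is to compare the sample mean and standard deviation against the known values 0 and 1 of the standard normal, after discarding an initial burn-in period. This is an illustrative, self-contained variant of the example above (the burn-in length of 1,000 is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(42)

def target(x):
    return np.exp(-x**2 / 2)  # unnormalized standard normal density

x = 0.0
samples = []
for _ in range(20000):
    y = rng.normal(x, 0.5)
    if rng.uniform() < min(1.0, target(y) / target(x)):
        x = y
    samples.append(x)

# Discard an initial burn-in period so early, unconverged samples
# don't bias the estimates
kept = np.array(samples[1000:])
print(kept.mean(), kept.std())  # should be close to 0 and 1, respectively
```

Because consecutive MCMC samples are correlated, these estimates converge more slowly than they would for independent draws, which is why a fairly long chain is used here.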

In conclusion, MCMC is a powerful computational method that allows us to generate samples from complex probability distributions. The Metropolis-Hastings algorithm is a widely used MCMC algorithm that can be easily implemented in Python.