The probability integral transform is a fundamental concept in statistics that connects the cumulative distribution function, the quantile function, and the uniform distribution. We motivate the need for a generalized inverse of the CDF and prove the result in this context.

# Definitions

Suppose $U$ is uniformly distributed on $[0,1]$, and $F$ is the cumulative distribution function of the random variable $Y$, i.e.

$$F(y) = P(Y \leq y),$$

and define the inverse cdf (the quantile function) as

$$F^{-1}(u) = \min\{y : F(y) \geq u\}.$$

Figure: an example cdf

Recall that cdfs are right continuous and monotonically non-decreasing. However, as the example shows, they may also be flat in places and discontinuous. Indeed, the flat sections motivate the definition of the inverse cdf as a minimum: in the example, although $F(x) = 0.5$ for all $x \in [0,1]$, we have $F^{-1}(0.5) = 0$.
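To make the flat-section example concrete, here is a short Python sketch. The specific two-point distribution (mass $0.5$ at $0$ and $0.5$ at $2$) is an assumption chosen to produce a cdf equal to $0.5$ on $[0,1]$; the grid-based minimum mirrors the definition above.

```python
# A concrete cdf consistent with the example: P(Y = 0) = P(Y = 2) = 0.5,
# so F(y) = 0.5 for all y in [0, 2), including the flat section on [0, 1].
def F(y):
    if y < 0:
        return 0.0
    if y < 2:
        return 0.5
    return 1.0

def F_inv(u):
    """Generalized inverse: the smallest grid point y with F(y) >= u."""
    grid = [k / 100 for k in range(-100, 301)]  # y values from -1.00 to 3.00
    return min(y for y in grid if F(y) >= u)

print(F_inv(0.5))  # 0.0: the left edge of the flat section, not any later point of it
```

Note that the minimum picks out the left edge of the flat section, exactly as in the figure.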

# The Main Result

The key result is:

$$F^{-1}(U) \stackrel{d}{=} Y.$$

That is, $F^{-1}(U)$ and $Y$ have the same distribution.
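The result can be demonstrated numerically. Here is a minimal sketch using a distribution whose inverse cdf has a closed form, the Exponential(1), where $F(y) = 1 - e^{-y}$ and $F^{-1}(u) = -\log(1-u)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Exponential(1): F(y) = 1 - exp(-y) has closed-form inverse F^{-1}(u) = -log(1 - u).
u = rng.uniform(size=100_000)   # draws from U
y = -np.log(1 - u)              # F^{-1}(U)

# If F^{-1}(U) and Y have the same distribution, the empirical cdf of the
# transformed draws should match F pointwise.
for t in (0.5, 1.0, 2.0):
    print(f"P(Y <= {t}): empirical {np.mean(y <= t):.3f}  vs exact {1 - np.exp(-t):.3f}")
```

This is exactly how inverse transform sampling generates draws from $Y$ using only a uniform random number generator.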

### Proof

The proof isn’t particularly complicated, but it relies on two identities that follow from the definition of the inverse cdf.

First,

$$F(F^{-1}(u)) \geq u.$$

This follows from the definition as $F^{-1}(u)$ is the smallest value of $y$ for which $F(y) \geq u$.

Second,

$$F^{-1}(F(y)) \leq y.$$

Using the definition, write the left hand side, with a change of index, as

$$F^{-1}(F(y)) = \min\{z : F(z) \geq F(y)\}.$$

The inequality follows from the following argument. As the example shows, $z$ can certainly be less than $y$ (on a flat section of $F$). On the other hand, $y$ itself satisfies $F(y) \geq F(y)$, so $y$ belongs to the set over which the minimum is taken, and hence the minimum cannot exceed $y$. (Right continuity of $F$ ensures the minimum is actually attained.)
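Both identities can be checked numerically. A sketch using a two-point distribution consistent with the example (mass $0.5$ at $0$ and at $2$, an assumed concrete choice), with $u$ restricted to the interior of $(0,1)$ and $y$ to the support and above, so that the generalized inverse stays finite:

```python
import numpy as np

# Two-point distribution: P(Y = 0) = P(Y = 2) = 0.5, so F is flat on [0, 2).
def F(y):
    return np.where(y < 0, 0.0, np.where(y < 2, 0.5, 1.0))

def F_inv(u):
    # min{y : F(y) >= u}: F(0) = 0.5 already covers u <= 0.5; otherwise the min is 2.
    return np.where(u <= 0.5, 0.0, 2.0)

u = np.linspace(0.01, 0.99, 99)   # interior of (0, 1); F_inv(0) would be -infinity
y = np.linspace(0.0, 3.0, 61)     # at or above the smallest point of the support

assert np.all(F(F_inv(u)) >= u)   # first identity:  F(F^{-1}(u)) >= u
assert np.all(F_inv(F(y)) <= y)   # second identity: F^{-1}(F(y)) <= y
print("both identities hold on the grid")
```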

The main result is a statement about probabilities. As such, we can proceed by showing the following equality of events:

$$\{\omega : F^{-1}(U(\omega)) \leq y\} = \{\omega : U(\omega) \leq F(y)\}.$$

Since $U$ is uniform on $[0,1]$, this gives $P(F^{-1}(U) \leq y) = P(U \leq F(y)) = F(y)$, as required.

In what follows, let $u = U(\omega)$.

#### The forward implication

We will show

$$F^{-1}(u) \leq y \implies u \leq F(y).$$

Suppose $F^{-1}(u) \leq y$.

Then, since $F$ is monotonic, we can apply it to both sides of the inequality without issue:

$$F(F^{-1}(u)) \leq F(y).$$

Combining this with the first of the above identities,

$$u \leq F(F^{-1}(u)) \leq F(y).$$

Hence,

$$u \leq F(y).$$

#### The reverse implication

The argument is similar. Note that $F^{-1}$ is also monotonic. Thus, $u \leq F(y)$ implies

$$F^{-1}(u) \leq F^{-1}(F(y)),$$

and the result follows after applying the second of the above identities:

$$F^{-1}(u) \leq F^{-1}(F(y)) \leq y.$$

Hence $F^{-1}(u) \leq y$, completing the proof.
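Finally, note that the result also covers discontinuous cdfs. Here is a sketch that samples the two-point distribution consistent with the example (mass $0.5$ at $0$ and at $2$, an assumed concrete choice) by pushing uniform draws through the generalized inverse:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two-point target: P(Y = 0) = P(Y = 2) = 0.5. Its generalized inverse maps
# u <= 0.5 to the first jump (at 0) and u > 0.5 to the second (at 2).
def F_inv(u):
    return np.where(u <= 0.5, 0.0, 2.0)

samples = F_inv(rng.uniform(size=100_000))
print("P(Y = 0) ~", np.mean(samples == 0.0))
print("P(Y = 2) ~", np.mean(samples == 2.0))
```

The flat section of $F$ never produces a sample (no mass lives there), while each jump receives probability equal to its height, exactly as the main result requires.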