# Markov Inequality

If the expectation value of a non-negative random variable is small, then the random variable must itself be small with high probability. Markov's inequality quantifies this observation.

Let $X$ be a non-negative random variable with finite expectation $\mathbb{E}[X]$. Then for any $\epsilon > 0$,

$$\mathbb{P}(X \geq \epsilon) \leq \frac{\mathbb{E}[X]}{\epsilon}. \tag{1}$$

### Proof

In two steps:

$$\mathbb{E}[X] = \int_0^\infty x \, dP(x) \geq \int_\epsilon^\infty x \, dP(x) \geq \epsilon \int_\epsilon^\infty dP(x) = \epsilon \, \mathbb{P}(X \geq \epsilon). \tag{2}$$

The first inequality uses the fact that the integrand is non-negative, the second that $\epsilon$ lower-bounds the new, shrunk integrand.
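As a quick numerical sanity check, the following sketch draws samples from an arbitrary non-negative distribution (exponential with mean 1; the distribution, sample size, and thresholds are illustrative choices, not part of the result) and confirms that the empirical tail probability never exceeds $\mathbb{E}[X]/\epsilon$:

```python
import random

random.seed(0)

# Non-negative samples: exponential with rate 1, so E[X] = 1.
samples = [random.expovariate(1.0) for _ in range(100_000)]
mean = sum(samples) / len(samples)

# Markov: P(X >= eps) <= E[X] / eps for every eps > 0.
for eps in [0.5, 1.0, 2.0, 4.0]:
    tail = sum(x >= eps for x in samples) / len(samples)
    bound = mean / eps
    print(f"eps={eps}: tail {tail:.4f} <= bound {bound:.4f}")
    assert tail <= bound
```

Note that the bound is loose for small $\epsilon$ (it is vacuous whenever $\epsilon \leq \mathbb{E}[X]$); its value is that it holds with no assumptions beyond non-negativity and a finite mean.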

## Derived Inequalities

Let $X$ be any random variable, and $f$ a non-negative, strictly increasing function. Then $X \geq \epsilon$ if and only if $f(X) \geq f(\epsilon)$. Supposing that $\mathbb{E}[f(X)] < \infty$, applying the basic Markov inequality to $f(X)$ gives

$$\mathbb{P}(X \geq \epsilon) = \mathbb{P}\left(f(X) \geq f(\epsilon)\right) \leq \frac{\mathbb{E}[f(X)]}{f(\epsilon)}. \tag{3}$$

### Relationship to the Chebyshev Inequality

Take $f(x) = (x - \mathbb{E}[X])^2$. This is non-negative, whether or not $X$ is, and its expectation, if it exists, is the variance $\mathbb{V}[X]$. Since $|X - \mathbb{E}[X]| \geq \epsilon$ exactly when $(X - \mathbb{E}[X])^2 \geq \epsilon^2$, applying the basic Markov inequality to $(X - \mathbb{E}[X])^2$ gives

$$\mathbb{P}\left(|X - \mathbb{E}[X]| \geq \epsilon\right) = \mathbb{P}\left((X - \mathbb{E}[X])^2 \geq \epsilon^2\right) \leq \frac{\mathbb{V}[X]}{\epsilon^2}, \tag{4}$$

which is the Chebyshev inequality. Similar bounds can be derived from higher moments, but they do not have eponyms.
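The same kind of numerical check works here. This sketch uses standard Gaussian samples (an arbitrary choice; any distribution with finite variance would do) and verifies the empirical two-sided tail against $\mathbb{V}[X]/\epsilon^2$:

```python
import random

random.seed(1)

# Standard normal: mean 0, variance 1, so the bound is 1 / eps^2.
samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]
n = len(samples)
mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n

# Chebyshev: P(|X - E[X]| >= eps) <= V[X] / eps^2.
for eps in [1.0, 2.0, 3.0]:
    tail = sum(abs(x - mean) >= eps for x in samples) / n
    bound = var / eps ** 2
    assert tail <= bound
```

For the Gaussian the true tails shrink like $e^{-\epsilon^2/2}$, so Chebyshev's polynomial bound is very conservative; the exponential bounds below do much better.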

### Exponential Inequalities

Picking a $t > 0$ and taking $f(x) = e^{tx}$,

$$\mathbb{P}(X \geq \epsilon) \leq \frac{\mathbb{E}[e^{tX}]}{e^{t\epsilon}} = e^{-t\epsilon}\,\mathbb{E}[e^{tX}] \tag{5}$$

when the expectation is finite.

The **cumulant generating function** is $K_X(t) = \log{\mathbb{E}[e^{tX}]}$. The above inequality gives

$$\mathbb{P}(X \geq \epsilon) \leq e^{-(t\epsilon - K_X(t))},$$

and since this holds for every $t > 0$, we may optimize over $t$:

$$\mathbb{P}(X \geq \epsilon) \leq \exp\left(-\sup_{t > 0}\left(t\epsilon - K_X(t)\right)\right),$$

the supremum in the exponent being the Legendre-Fenchel transform of the cumulant generating function.

This approach is often useful for getting exponential inequalities for unbounded variables, and is part of Cramér's theorem in large deviations theory.
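As a concrete sketch of the optimized exponential bound: for a standard Gaussian (an illustrative assumption, since its cumulant generating function is known in closed form), $K_X(t) = t^2/2$, the supremum of $t\epsilon - t^2/2$ is attained at $t = \epsilon$, and the bound becomes $e^{-\epsilon^2/2}$:

```python
import math
import random

random.seed(2)

# Standard Gaussian: K_X(t) = t^2 / 2, so
# sup_{t > 0} (t * eps - K_X(t)) = eps^2 / 2, attained at t = eps,
# and the optimized bound is P(X >= eps) <= exp(-eps^2 / 2).
samples = [random.gauss(0.0, 1.0) for _ in range(200_000)]

for eps in [1.0, 2.0, 3.0]:
    tail = sum(x >= eps for x in samples) / len(samples)
    chernoff = math.exp(-eps ** 2 / 2)
    assert tail <= chernoff  # empirical tail respects the bound
```

Unlike the Chebyshev bound, this one decays exponentially in $\epsilon^2$, which is why the exponential-moment approach dominates in large deviations theory.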