Start From Gaussian Distribution

A Gaussian random variable $X$ has the following pdf:

$$f(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right),$$

where $\mu$ and $\sigma$ are the expectation and standard deviation of $X$. It is easy to show that for a standard Gaussian $X \sim N(0,1)$, we have for any $\epsilon > 0$,

$$\left(\frac{1}{\epsilon}-\frac{1}{\epsilon^3}\right)\frac{e^{-\epsilon^2/2}}{\sqrt{2\pi}} \le \Pr[X \ge \epsilon] \le \frac{1}{\epsilon}\cdot\frac{e^{-\epsilon^2/2}}{\sqrt{2\pi}}.$$

(The upper bound can be improved to $\frac{1}{2}e^{-\epsilon^2/2}$.) Generally, we have
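As a sanity check, the sandwich above can be verified numerically: the exact tail of a standard Gaussian is $\frac{1}{2}\operatorname{erfc}(\epsilon/\sqrt{2})$. The function names below are ours, chosen for illustration:

```python
import math

def gauss_tail(eps):
    """Exact upper tail Pr[X >= eps] for X ~ N(0, 1), via erfc."""
    return 0.5 * math.erfc(eps / math.sqrt(2))

def mills_upper(eps):
    """Upper bound (1/eps) * exp(-eps^2/2) / sqrt(2*pi)."""
    return math.exp(-eps**2 / 2) / (eps * math.sqrt(2 * math.pi))

def mills_lower(eps):
    """Lower bound (1/eps - 1/eps^3) * exp(-eps^2/2) / sqrt(2*pi)."""
    return (1 / eps - 1 / eps**3) * math.exp(-eps**2 / 2) / math.sqrt(2 * math.pi)

for eps in (1.0, 2.0, 3.0):
    # The sandwich holds and tightens as eps grows.
    assert mills_lower(eps) <= gauss_tail(eps) <= mills_upper(eps)
```

At $\epsilon = 3$, for instance, the two sides already agree to within about 10%.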

Tail bound of Gaussian

If $X \sim N(\mu, \sigma^2)$, then for any $\epsilon > 0$,

$$\Pr[X - \mu \ge \epsilon] \le \exp\left(-\frac{\epsilon^2}{2\sigma^2}\right), \qquad \Pr[X - \mu \le -\epsilon] \le \exp\left(-\frac{\epsilon^2}{2\sigma^2}\right).$$

A related bound is for the moment generating function. If $X \sim N(0, 1)$, then its MGF is

$$\mathbb{E}\big[e^{\lambda X}\big] = e^{\lambda^2/2} \quad \text{for all } \lambda \in \mathbb{R}.$$

Scaling it implies that for $X \sim N(\mu, \sigma^2)$, we have

MGF of Gaussian

If $X \sim N(\mu, \sigma^2)$, then

$$\mathbb{E}\big[e^{\lambda(X-\mu)}\big] = e^{\sigma^2\lambda^2/2} \quad \text{for all } \lambda \in \mathbb{R}.$$
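The boxed identity follows by completing the square in the Gaussian integral; a one-line derivation (substituting $t = x - \mu$):

```latex
\mathbb{E}\big[e^{\lambda(X-\mu)}\big]
= \int_{-\infty}^{\infty} \frac{e^{\lambda t}}{\sqrt{2\pi}\,\sigma}\,
  e^{-t^2/(2\sigma^2)}\,dt
= e^{\sigma^2\lambda^2/2}
  \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma}\,
  e^{-(t-\sigma^2\lambda)^2/(2\sigma^2)}\,dt
= e^{\sigma^2\lambda^2/2},
```

where the last integral equals 1 because the integrand is the pdf of $N(\sigma^2\lambda, \sigma^2)$.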

Takeaway

The coefficient of $\epsilon^2/2$ in the tail bound is $\frac{1}{\sigma^2}$, and the coefficient of $\lambda^2/2$ in the MGF is $\sigma^2$; their product is 1. This is because of the Fenchel–Legendre duality: $\epsilon \mapsto \frac{\epsilon^2}{2\sigma^2}$ is the Legendre transform of $\lambda \mapsto \frac{\sigma^2\lambda^2}{2}$.

This rule applies to all concentration inequalities derived by the Cramér–Chernoff method.
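Concretely, the Cramér–Chernoff method turns the MGF bound into the tail bound via exactly this duality. For any $\lambda > 0$, Markov's inequality gives

```latex
\Pr[X - \mu \ge \epsilon]
= \Pr\big[e^{\lambda(X-\mu)} \ge e^{\lambda\epsilon}\big]
\le e^{-\lambda\epsilon}\,\mathbb{E}\big[e^{\lambda(X-\mu)}\big]
\le \exp\left(-\lambda\epsilon + \frac{\sigma^2\lambda^2}{2}\right),
```

and optimizing over $\lambda$ yields $\sup_{\lambda>0}\left(\lambda\epsilon - \frac{\sigma^2\lambda^2}{2}\right) = \frac{\epsilon^2}{2\sigma^2}$, attained at $\lambda = \epsilon/\sigma^2$. The optimal exponent is precisely the Legendre transform of the exponent in the MGF.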

Coefficients in Concentration Inequalities

Definition (Subgaussian Random Variable)

A random variable $X$ is $\sigma^2$-subgaussian if for all $\lambda \in \mathbb{R}$, it holds that

$$\mathbb{E}\big[e^{\lambda(X - \mathbb{E}X)}\big] \le e^{\sigma^2\lambda^2/2}.$$

On the parameter $\sigma^2$

Here, $\sigma^2$ can be regarded as the “variance” of $X$. Using the properties of variance, we can determine the parameter for other subgaussian random variables. For example, the sum of $n$ iid $\sigma^2$-subgaussian random variables is $n\sigma^2$-subgaussian.

Equivalent Definition:

  1. If $X$ is $\sigma^2$-subgaussian, then for any $\epsilon > 0$,
     $$\Pr[X - \mathbb{E}X \ge \epsilon] \le \exp\left(-\frac{\epsilon^2}{2\sigma^2}\right), \qquad \Pr[X - \mathbb{E}X \le -\epsilon] \le \exp\left(-\frac{\epsilon^2}{2\sigma^2}\right).$$
  2. Conversely, if the above two inequalities hold for any $\epsilon > 0$, then $X$ is $c\sigma^2$-subgaussian for an absolute constant $c$. The constant is loose and can be improved.

Bounded Random Variable

If a random variable $X$ is in $[a, b]$, then we know its variance satisfies

$$\mathrm{Var}(X) \le \frac{(b-a)^2}{4}.$$

We can prove that $X$ is $\frac{(b-a)^2}{4}$-subgaussian (Hoeffding's lemma), that is,

$$\mathbb{E}\big[e^{\lambda(X - \mathbb{E}X)}\big] \le e^{\lambda^2 (b-a)^2/8} \quad \text{for all } \lambda \in \mathbb{R}.$$
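For a Bernoulli variable the centered MGF can be computed in closed form, so Hoeffding's lemma can be checked exactly on a grid of $(p, \lambda)$ values. A small sketch; the helper name is ours:

```python
import math

def bernoulli_centered_mgf(p, lam):
    """E[exp(lam * (X - p))] for X ~ Bernoulli(p), computed exactly."""
    return (1 - p) * math.exp(-lam * p) + p * math.exp(lam * (1 - p))

# Hoeffding's lemma with [a, b] = [0, 1]: centered MGF <= exp(lam^2 / 8).
for p in (0.1, 0.3, 0.5, 0.9):
    for lam in (-4.0, -1.0, -0.25, 0.25, 1.0, 4.0):
        assert bernoulli_centered_mgf(p, lam) <= math.exp(lam**2 / 8) + 1e-12
```

At $p = 1/2$ the centered MGF is $\cosh(\lambda/2)$, and the bound $\cosh(x) \le e^{x^2/2}$ is nearly tight for small $\lambda$.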

This implies Hoeffding's inequality: If $X_1, \dots, X_n$ are independent random variables with $X_i \in [a_i, b_i]$, then for any $\epsilon > 0$,

$$\Pr\left[\sum_{i=1}^n (X_i - \mathbb{E}X_i) \ge \epsilon\right] \le \exp\left(-\frac{2\epsilon^2}{\sum_{i=1}^n (b_i - a_i)^2}\right).$$
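A quick Monte Carlo sketch (the parameter choices here are ours, for illustration) shows the bound is respected empirically for uniform variables on $[0, 1]$:

```python
import math
import random

random.seed(0)
n, trials, eps = 50, 20000, 5.0

# X_i ~ Uniform[0, 1]: a_i = 0, b_i = 1, E X_i = 1/2.
bound = math.exp(-2 * eps**2 / n)  # exp(-2 eps^2 / sum_i (b_i - a_i)^2)

hits = sum(
    sum(random.random() - 0.5 for _ in range(n)) >= eps
    for _ in range(trials)
)
freq = hits / trials
# Empirical tail frequency never exceeds Hoeffding's bound.
assert freq <= bound
```

With these numbers the bound is $e^{-1} \approx 0.37$, while the true tail probability is well below $0.01$; Hoeffding is far from tight here, but it holds for every bounded distribution.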

Some special cases of Hoeffding's Inequality

Suppose $X_1, \dots, X_n$ are independent $\sigma^2$-subgaussian random variables. Then for any $\epsilon > 0$,

$$\Pr\left[\sum_{i=1}^n (X_i - \mathbb{E}X_i) \ge \epsilon\right] \le \exp\left(-\frac{\epsilon^2}{2n\sigma^2}\right).$$

To remember this, note that the “variance” of $\sum_{i=1}^n X_i$ is $n\sigma^2$ and that the coefficient of $\epsilon^2/2$ is the inverse of the variance. Similarly, since the “variance” of the average is $\sigma^2/n$, we have

$$\Pr\left[\frac{1}{n}\sum_{i=1}^n (X_i - \mathbb{E}X_i) \ge \epsilon\right] \le \exp\left(-\frac{n\epsilon^2}{2\sigma^2}\right).$$

Commonly used inequalities

Choosing $\epsilon = \sigma\sqrt{\frac{2\log(1/\delta)}{n}}$ in the above, we get

$$\Pr\left[\frac{1}{n}\sum_{i=1}^n (X_i - \mathbb{E}X_i) \ge \sigma\sqrt{\frac{2\log(1/\delta)}{n}}\right] \le \delta.$$

This inequality shows that $\frac{1}{n}\sum_{i=1}^n (X_i - \mathbb{E}X_i)$ is of order $O(\sigma/\sqrt{n})$ with high probability, and the omitted $\sqrt{2\log(1/\delta)}$ term controls the failure probability.
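The $1/\sqrt{n}$ rate is easy to see numerically; `hoeffding_width` below is an illustrative helper name, not a standard library function:

```python
import math

def hoeffding_width(sigma2, n, delta):
    """High-probability deviation width sigma * sqrt(2 * log(1/delta) / n)."""
    return math.sqrt(2 * sigma2 * math.log(1 / delta) / n)

# The width shrinks at rate 1/sqrt(n)...
w1 = hoeffding_width(1.0, 100, 0.05)
w2 = hoeffding_width(1.0, 400, 0.05)
assert abs(w1 / w2 - 2.0) < 1e-9  # quadrupling n halves the width

# ...but grows only logarithmically as the failure probability delta shrinks.
w3 = hoeffding_width(1.0, 100, 0.0005)
assert w3 / w1 < 2.0  # 100x smaller delta costs well under a 2x wider interval
```

This is why halving the confidence-interval width requires four times as many samples, while demanding much higher confidence is comparatively cheap.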