Let $X_1, X_2, \dots, X_n$ be independent random variables with finite variances. Denote $S_j = \sum_{i=1}^{j} X_i$ for $j = 1, \dots, n$.
The well-known Kolmogorov inequality can be stated as follows: for all $\varepsilon > 0$,
\[
P\Bigl(\max_{1\le j\le n}\bigl|S_j - E(S_j)\bigr| \ge \varepsilon\Bigr) \le \frac{\operatorname{Var}(S_n)}{\varepsilon^2}.
\]
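As a numerical sanity check (not part of the argument), the bound can be verified exactly for a small symmetric random walk by enumerating all sign sequences. The sketch below is illustrative: the function name `kolmogorov_check` is ours, and we assume fair $\pm 1$ steps, so that $E(X_j) = 0$ and $\operatorname{Var}(S_n) = n$.

```python
from itertools import product

def kolmogorov_check(n, eps):
    """Exact P(max_j |S_j| >= eps) for a fair +/-1 walk, vs. Var(S_n)/eps^2.

    Enumerates all 2**n sign sequences; each has probability 2**-n,
    so the left-hand side is computed exactly, not by simulation.
    """
    hit = 0
    for signs in product((-1, 1), repeat=n):
        s = 0
        peak = 0
        for x in signs:
            s += x
            peak = max(peak, abs(s))  # running maximum of |S_j|
        if peak >= eps:
            hit += 1
    prob = hit / 2 ** n      # exact P(max_j |S_j| >= eps)
    bound = n / eps ** 2     # Var(S_n) = n for fair +/-1 steps
    return prob, bound
```

Calling `kolmogorov_check(6, 3)`, for example, confirms that the exact probability does not exceed $6/9$.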
The one-sided Kolmogorov-type inequality states that, for all $\varepsilon \ge 0$,
\[
P\Bigl(\max_{1\le j\le n}\bigl(S_j - E(S_j)\bigr) \ge \varepsilon\Bigr) \le \frac{\operatorname{Var}(S_n)}{\varepsilon^2 + \operatorname{Var}(S_n)}.
\]
We will prove this inequality in the following.
Proposition. Let $X$ be a random variable with $\operatorname{Var}(X) < \infty$. Then for all $\varepsilon \ge 0$,
\[
P\bigl(X - E(X) \ge \varepsilon\bigr) \le \frac{\operatorname{Var}(X)}{\varepsilon^2 + \operatorname{Var}(X)}.
\]
Proof. Without loss of generality, we may assume that $E(X) = 0$. Then
\[
\varepsilon = E(\varepsilon - X) = E\bigl\{(\varepsilon - X) I_{\{X < \varepsilon\}}\bigr\} + E\bigl\{(\varepsilon - X) I_{\{X \ge \varepsilon\}}\bigr\} \le E\bigl\{(\varepsilon - X) I_{\{X < \varepsilon\}}\bigr\},
\]
since $(\varepsilon - X) I_{\{X \ge \varepsilon\}} \le 0$.
By the Cauchy–Schwarz inequality, we have
\[
\varepsilon^2 \le \bigl[E\bigl\{(\varepsilon - X) I_{\{X < \varepsilon\}}\bigr\}\bigr]^2 \le E(\varepsilon - X)^2\, P(X < \varepsilon) = \bigl[\varepsilon^2 + \operatorname{Var}(X)\bigr]\bigl[1 - P(X \ge \varepsilon)\bigr],
\]
where $E(\varepsilon - X)^2 = \varepsilon^2 + \operatorname{Var}(X)$ because $E(X) = 0$. Therefore,
\[
P(X \ge \varepsilon) \le \frac{\operatorname{Var}(X)}{\varepsilon^2 + \operatorname{Var}(X)}. \qquad \square
\]
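The proposition (the one-sided Chebyshev, or Cantelli, inequality) can be checked exactly for any discrete distribution. The helper below is a sketch of ours, not part of the text: it computes both sides of the bound from a finite list of values and probabilities.

```python
def cantelli_bound_check(values, probs, eps):
    """Exact P(X - EX >= eps) and the bound Var(X)/(eps^2 + Var(X))
    for a discrete random variable X given by (values, probs)."""
    mean = sum(v * p for v, p in zip(values, probs))
    var = sum((v - mean) ** 2 * p for v, p in zip(values, probs))
    # exact upper-tail probability of the centered variable
    tail = sum(p for v, p in zip(values, probs) if v - mean >= eps)
    bound = var / (eps ** 2 + var)
    return tail, bound
```

For the symmetric two-point distribution $P(X = 1) = P(X = -1) = 1/2$ with $\varepsilon = 1$, both sides equal $1/2$, which shows the bound can be attained with equality.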
Proof of the one-sided Kolmogorov-type inequality. Let $\Lambda = \{\max_{1\le j\le n} S_j \ge \varepsilon\}$ and $\Lambda_k = \{\max_{1\le j<k} S_j < \varepsilon,\; S_k \ge \varepsilon\}$ for $k = 1, \dots, n$; then $\Lambda = \bigcup_{k=1}^{n} \Lambda_k$, and the $\Lambda_k$ are disjoint. Without loss of generality, we assume that $E(X_j) = 0$, $j = 1, \dots, n$.
Then, by the independence of the random variables,
\[
\begin{aligned}
\varepsilon = E(\varepsilon - S_n) &= E\bigl[(\varepsilon - S_n) I_{\Lambda}\bigr] + E\bigl[(\varepsilon - S_n) I_{\Lambda^c}\bigr] \\
&= \sum_{k=1}^{n} E\bigl[(\varepsilon - S_n) I_{\Lambda_k}\bigr] + E\bigl[(\varepsilon - S_n) I_{\Lambda^c}\bigr] \\
&= \sum_{k=1}^{n} E\bigl[\bigl\{(\varepsilon - S_k) - (S_n - S_k)\bigr\} I_{\Lambda_k}\bigr] + E\bigl[(\varepsilon - S_n) I_{\Lambda^c}\bigr] \\
&= \sum_{k=1}^{n} E\bigl[(\varepsilon - S_k) I_{\Lambda_k}\bigr] - \sum_{k=1}^{n} E\bigl[(S_n - S_k) I_{\Lambda_k}\bigr] + E\bigl[(\varepsilon - S_n) I_{\Lambda^c}\bigr] \\
&= \sum_{k=1}^{n} E\bigl[(\varepsilon - S_k) I_{\Lambda_k}\bigr] + E\bigl[(\varepsilon - S_n) I_{\Lambda^c}\bigr] \\
&\le E\bigl[(\varepsilon - S_n) I_{\Lambda^c}\bigr],
\end{aligned}
\]
where the second sum vanishes because $S_n - S_k$ is independent of $I_{\Lambda_k}$ and $E(S_n - S_k) = 0$, and the last inequality holds because $\varepsilon - S_k \le 0$ on $\Lambda_k$.
By the Cauchy–Schwarz inequality, we have
\[
\varepsilon^2 \le \bigl\{E\bigl[(\varepsilon - S_n) I_{\Lambda^c}\bigr]\bigr\}^2 \le E(\varepsilon - S_n)^2\, P(\Lambda^c) = \bigl[\varepsilon^2 + \operatorname{Var}(S_n)\bigr]\bigl[1 - P(\Lambda)\bigr].
\]
Therefore,
\[
P\Bigl(\max_{1\le j\le n} S_j \ge \varepsilon\Bigr) \le \frac{\operatorname{Var}(S_n)}{\varepsilon^2 + \operatorname{Var}(S_n)},
\]
as claimed. $\square$
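The maximal inequality just proved can likewise be verified exactly for a small symmetric walk. The sketch below is ours (hypothetical name `one_sided_max_check`), again assuming fair $\pm 1$ steps so that $E(X_j) = 0$ and $\operatorname{Var}(S_n) = n$.

```python
from itertools import accumulate, product

def one_sided_max_check(n, eps):
    """Exact P(max_j S_j >= eps) for a fair +/-1 walk,
    vs. the one-sided bound Var(S_n)/(eps^2 + Var(S_n))."""
    hit = sum(
        1
        for signs in product((-1, 1), repeat=n)
        # accumulate(signs) yields the partial sums S_1, ..., S_n
        if max(accumulate(signs)) >= eps
    )
    prob = hit / 2 ** n          # exact P(max_j S_j >= eps)
    bound = n / (eps ** 2 + n)   # Var(S_n) = n for fair +/-1 steps
    return prob, bound
```

Note that for small $\varepsilon$ the one-sided bound always stays below $1$, whereas the two-sided bound $\operatorname{Var}(S_n)/\varepsilon^2$ can exceed $1$ and become vacuous.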
The inequality also holds for martingale difference sequences; the proof is similar.
On One-Sided Kolmogorov-Type Inequalities