Expectation and Variance of the Product of Random Variables

Mathematical Proofs

Expectation of a product of random variables: if two random variables $x_1$ and $x_2$ are independent, then the expectation of $x_1\cdot x_2$ is
$$\mathbb{E}(x_1\cdot x_2)=\mathbb{E}(x_1)\cdot \mathbb{E}(x_2).$$

Proof: by the definition of covariance, $\mathrm{Cov}(x_1,x_2)=\mathbb{E}(x_1\cdot x_2)-\mathbb{E}(x_1)\cdot\mathbb{E}(x_2)$, so the expectation of $x_1\cdot x_2$ can be written as
$$\mathbb{E}(x_1\cdot x_2)=\mathbb{E}(x_1)\cdot\mathbb{E}(x_2)+\mathrm{Cov}(x_1,x_2).$$
Since $x_1$ and $x_2$ are independent, $\mathrm{Cov}(x_1,x_2)=0$, and therefore
$$\mathbb{E}(x_1\cdot x_2)=\mathbb{E}(x_1)\cdot\mathbb{E}(x_2)+0=\mathbb{E}(x_1)\cdot\mathbb{E}(x_2).$$
This completes the proof.
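As a quick sanity check, the identity can also be verified numerically. The sketch below is illustrative only: it assumes NumPy is available and uses arbitrarily chosen independent distributions (an exponential and a uniform) with nonzero means.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two independent samples with nonzero means (distributions chosen arbitrarily for illustration).
x1 = rng.exponential(scale=2.0, size=n)   # E(x1) = 2
x2 = rng.uniform(1.0, 3.0, size=n)        # E(x2) = 2

# Empirical E(x1 * x2) should be close to E(x1) * E(x2) = 4.
print(np.mean(x1 * x2))            # ~4.0
print(np.mean(x1) * np.mean(x2))   # ~4.0
```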

Variance of a product of random variables: if two random variables $x_1$ and $x_2$ are independent, then the variance of $x_1\cdot x_2$ is
$$\mathrm{Var}(x_1\cdot x_2)=\mathrm{Var}(x_1)\cdot\mathrm{Var}(x_2)+\mathrm{Var}(x_1)\cdot \mathbb{E}(x_2)^2+\mathrm{Var}(x_2)\cdot \mathbb{E}(x_1)^2.$$

Proof: since $x_1$ and $x_2$ are independent, so are $x_1^2$ and $x_2^2$, hence $\mathbb{E}(x_1^2\cdot x_2^2)=\mathbb{E}(x_1^2)\cdot\mathbb{E}(x_2^2)$. The variance of $x_1\cdot x_2$ is therefore
$$\begin{aligned}\mathrm{Var}(x_1\cdot x_2)&=\mathbb{E}\left((x_1\cdot x_2-\mathbb{E}(x_1\cdot x_2))^2\right)\\&=\mathbb{E}(x^2_1\cdot x_2^2)-\mathbb{E}(x_1\cdot x_2)^2\\&=\mathbb{E}(x^2_1)\cdot \mathbb{E}(x^2_2)-\mathbb{E}(x_1)^2\cdot \mathbb{E}(x_2)^2\\&=(\mathrm{Var}(x_1)+\mathbb{E}(x_1)^2)\cdot(\mathrm{Var}(x_2)+\mathbb{E}(x_2)^2)-\mathbb{E}(x_1)^2\cdot \mathbb{E}(x_2)^2\\&=\mathrm{Var}(x_1)\cdot \mathrm{Var}(x_2)+\mathrm{Var}(x_1)\cdot \mathbb{E}(x_2)^2+\mathrm{Var}(x_2)\cdot \mathbb{E}(x_1)^2.\end{aligned}$$
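The variance identity can be checked the same way. The following sketch is a numerical illustration under the same assumptions (NumPy, arbitrarily chosen independent distributions), comparing the empirical variance of the product with the right-hand side of the formula evaluated from sample moments.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

x1 = rng.exponential(scale=2.0, size=n)   # Var(x1) = 4,   E(x1) = 2
x2 = rng.uniform(1.0, 3.0, size=n)        # Var(x2) = 1/3, E(x2) = 2

# Empirical variance of the product.
lhs = np.var(x1 * x2)

# Right-hand side of the formula, evaluated with sample moments.
rhs = (np.var(x1) * np.var(x2)
       + np.var(x1) * np.mean(x2) ** 2
       + np.var(x2) * np.mean(x1) ** 2)

print(lhs, rhs)  # both ~18.67 for these particular distributions
```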

A Concrete Example

Given two i.i.d. random variables $x_1$ and $x_2$ with $x_1,x_2\sim \mathcal{N}(0,1)$, the product expectation formula above gives
$$\mathbb{E}(x_1\cdot x_2)=\mathbb{E}(x_1)\cdot \mathbb{E}(x_2)=0\times 0=0,$$
and the product variance formula gives
$$\begin{aligned}\mathrm{Var}(x_1\cdot x_2)&=\mathrm{Var}(x_1)\cdot\mathrm{Var}(x_2)+\mathrm{Var}(x_1)\cdot \mathbb{E}(x_2)^2+\mathrm{Var}(x_2)\cdot \mathbb{E}(x_1)^2\\&=1\times 1+1\times 0+1\times 0\\&=1.\end{aligned}$$
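A minimal simulation of this example, again assuming NumPy is available, confirms that the product of two independent standard normal variables has mean close to 0 and variance close to 1:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two i.i.d. standard normal samples.
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)

prod = x1 * x2
print(np.mean(prod))  # ~0.0
print(np.var(prod))   # ~1.0
```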
