Basic facts on Gaussian vectors
A random vector [math]Z=(Z_1,...,Z_n)[/math] is said to be Gaussian if for all [math]\lambda_1,...,\lambda_n\in\R[/math] the linear combination

[math]\sum_{j=1}^n\lambda_jZ_j[/math]

is Gaussian. Moreover, [math]Z[/math] is called centered if [math]\E[Z_j]=0[/math] for all [math]1\leq j\leq n[/math]. Let [math]Z[/math] be a centered Gaussian vector. Then for all [math]\xi\in\R^n[/math] we get

[math]\E\left[e^{i\langle\xi,Z\rangle}\right]=e^{-\frac{1}{2}\langle C_Z\xi,\xi\rangle},[/math]
where [math]C_Z:=(C_{ij})[/math] and [math]C_{ij}=\E[Z_iZ_j][/math]. If [math]Cov(Z_i,Z_j)=0[/math], then [math]Z_i[/math] and [math]Z_j[/math] are independent. More generally, if [math](X_1,...,X_{i_2})[/math] is a Gaussian vector, then the Gaussian vectors

[math]Y_1=(X_1,...,X_{i_1}),\qquad Y_2=(X_{i_1+1},...,X_{i_2})[/math]

are independent if and only if [math]Cov(X_j,X_k)=0[/math] for all [math]1\leq j\leq i_1[/math] and [math]i_1+1\leq k\leq i_2[/math]. If [math]Z_1,...,Z_n[/math] are independent Gaussian r.v.'s, we have that

[math]Z=(Z_1,...,Z_n)[/math]

is a Gaussian vector. If [math]Z[/math] is a Gaussian vector and [math]A\in\mathcal{M}(m\times n,\R)[/math], we get that [math]AZ[/math] is again a Gaussian vector.
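As a quick check of the last claim, for any [math]\xi\in\R^m[/math] we have

[math]\langle\xi,AZ\rangle=\langle A^T\xi,Z\rangle=\sum_{j=1}^n(A^T\xi)_jZ_j,[/math]

which is a linear combination of the [math]Z_j[/math]'s and hence Gaussian; moreover, if [math]Z[/math] is centered, then [math]C_{AZ}=AC_ZA^T[/math].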
Let [math](\Omega,\F,\p)[/math] be a probability space. Let [math]X\in L^1(\Omega,\F,\p)[/math] and [math]Y_1,...,Y_p\in L^1(\Omega,\F,\p)[/math] and let [math](X,Y_1,...,Y_p)[/math] be a centered Gaussian vector. Then

[math]\E[X\mid Y_1,...,Y_p]=\lambda_1Y_1+\dotsm+\lambda_pY_p,[/math]

where [math]\lambda_1,...,\lambda_p\in\R[/math] are such that the right-hand side is the orthogonal projection of [math]X[/math] onto [math]span\{Y_1,...,Y_p\}[/math] in [math]L^2[/math].
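For example, for [math]p=1[/math] with [math]\E[Y_1^2]>0[/math], the orthogonality condition [math]\E[(X-\lambda_1Y_1)Y_1]=0[/math] determines [math]\lambda_1[/math], and we obtain

[math]\E[X\mid Y_1]=\frac{\E[XY_1]}{\E[Y_1^2]}Y_1.[/math]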
Moreover, for a measurable map [math]h:\R\to\R_+[/math] we get

[math]\E[h(X)\mid Y_1,...,Y_p]=\int_\R h(\tilde X+\sigma y)\frac{1}{\sqrt{2\pi}}e^{-\frac{y^2}{2}}\,\mathrm{d}y,[/math]

where [math]\tilde X:=\E[X\mid Y_1,...,Y_p][/math] and [math]\sigma^2:=\E[(X-\tilde X)^2][/math].
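For instance, taking [math]h=\mathbf{1}_{[a,\infty)}[/math] for some [math]a\in\R[/math] (and assuming [math]\sigma>0[/math]) gives the conditional tail probability

[math]\p(X\geq a\mid Y_1,...,Y_p)=\int_{\frac{a-\tilde X}{\sigma}}^\infty\frac{1}{\sqrt{2\pi}}e^{-\frac{y^2}{2}}\,\mathrm{d}y.[/math]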
\begin{proof}[Proof of Theorem] Let [math]\tilde X=\lambda_1 Y_1+\dotsm+\lambda_pY_p[/math] be the orthogonal projection of [math]X[/math] onto [math]span\{Y_1,...,Y_p\}[/math], meaning that for all [math]1\leq j\leq p[/math]

[math]\E[(X-\tilde X)Y_j]=0.[/math]

Note that this condition gives us the [math]\lambda_j[/math]'s explicitly. Since [math](Y_1,...,Y_p,X-\tilde X)[/math] is a linear image of the Gaussian vector [math](X,Y_1,...,Y_p)[/math], it is again a Gaussian vector. Moreover, we get [math]\E[(X-\tilde X)Y_j]=Cov(X-\tilde X,Y_j)=0[/math] and thus [math]X-\tilde X[/math] is independent of [math]Y_1,...,Y_p[/math]. Hence

[math]\E[X\mid Y_1,...,Y_p]=\E[\tilde X\mid Y_1,...,Y_p]+\E[(X-\tilde X)\mid Y_1,...,Y_p]=\tilde X+\E[X-\tilde X]=\tilde X.[/math]
\end{proof}
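As a quick numerical illustration of the theorem (a minimal sketch, not from the source; the covariance matrix [math]C[/math] and the sample size below are arbitrary assumptions), one can sample a centered Gaussian vector [math](X,Y_1,Y_2)[/math], compute the [math]\lambda_j[/math]'s from the orthogonality conditions, and check empirically that the residual [math]X-\tilde X[/math] is uncorrelated with the [math]Y_j[/math]'s:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical covariance matrix of the centered Gaussian vector (X, Y1, Y2);
# any symmetric positive-definite matrix would do.
C = np.array([[2.0, 0.8, 0.3],
              [0.8, 1.0, 0.2],
              [0.3, 0.2, 1.5]])

N = 200_000
sample = rng.multivariate_normal(np.zeros(3), C, size=N)
X, Y = sample[:, 0], sample[:, 1:]

# The orthogonality conditions E[(X - sum_j lambda_j Y_j) Y_k] = 0 are the
# normal equations C_YY @ lam = C_YX.
lam = np.linalg.solve(C[1:, 1:], C[1:, 0])
X_tilde = Y @ lam              # orthogonal projection of X onto span{Y1, Y2}
residual = X - X_tilde

# Empirical covariances of the residual with Y1 and Y2: both should be ~ 0.
print(np.mean(residual[:, None] * Y, axis=0))
```

Since [math](Y_1,Y_2,X-\tilde X)[/math] is a Gaussian vector, the vanishing covariances are equivalent to independence, which is exactly the step used in the proof.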
General references
Moshayedi, Nima (2020). "Lectures on Probability Theory". arXiv:2010.16280 [math.PR].
Notes
- This is done similarly to the proof of the theorem above.