We start this chapter by explaining a general technique to obtain bounds for [math]\P\bigl[Y_1+\cdots+Y_d\geqslant a\bigr][/math] where the [math]Y_i[/math] are independent random variables. A bound that arises from this ‘recipe’ is often referred to as a Chernoff bound. The basic idea is given in the following lemma, which establishes the ‘generic’ Chernoff bound.
Let [math](\Omega,\Sigma,\P)[/math] be a probability space and let [math]Y_1,\dots,Y_n\colon\Omega\rightarrow\mathbb{R}[/math] be mutually independent random variables. Then

[math]
\P\bigl[Y_1+\cdots+Y_n\geqslant a\bigr]\leqslant\exp(-ta)\cdot\E\bigl(\exp(tY_1)\bigr)\cdots\E\bigl(\exp(tY_n)\bigr)
[/math]

holds for every [math]a\in\mathbb{R}[/math] and every [math]t \gt 0[/math].
For fixed [math]t \gt 0[/math] the function [math]\exp(t\cdot -)[/math] is increasing. We put [math]Y=Y_1+\cdots+Y_n[/math] and get

[math]
\P\bigl[Y\geqslant a\bigr]=\P\bigl[\exp(tY)\geqslant\exp(ta)\bigr]\leqslant\frac{\E(\exp(tY))}{\exp(ta)}=\exp(-ta)\cdot\E\bigl(\exp(tY_1)\bigr)\cdots\E\bigl(\exp(tY_n)\bigr),
[/math]

where the estimate is Markov's inequality applied to the non-negative random variable [math]\exp(tY)[/math] and the last equality uses the independence of the [math]Y_i[/math], as desired.
The classical Chernoff bound recipe is now to compute or estimate the right-hand side by exploiting additional information on the [math]Y_i[/math] that is available in concrete situations. Readers unfamiliar with Chernoff bounds can find a classical example of how the recipe works in the Problem where the [math]Y_i[/math] are independent Bernoulli random variables. In the Theorem below the additional knowledge comes from the assumption that the moments of each [math]Y_i[/math] grow at most factorially.
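To give a quick sketch of the recipe here (this is only an illustration; the Problem mentioned above treats the Bernoulli case in detail), assume in addition that the [math]Y_i[/math] are symmetric signs, i.e. [math]\P[Y_i=1]=\P[Y_i=-1]=1/2[/math]. Then [math]\E(\exp(tY_i))=\cosh(t)\leqslant\exp(t^2/2)[/math], so the generic bound reads

[math]
\P\bigl[Y_1+\cdots+Y_n\geqslant a\bigr]\leqslant\exp\Bigl(\frac{nt^2}{2}-ta\Bigr)
[/math]

for every [math]t \gt 0[/math], and choosing [math]t=a/n[/math] gives the bound [math]\exp(-a^2/(2n))[/math] for every [math]a \gt 0[/math].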
Let [math](\Omega,\Sigma,\P)[/math] be a probability space and let [math]Y_1,\dots,Y_d\colon\Omega\rightarrow\mathbb{R}[/math] be mutually independent random variables with expectation zero and [math]|\E(Y_i^k)|\leqslant k!/2[/math] for [math]i=1,\dots,d[/math] and [math]k\geqslant2[/math]. Then we have

[math]
\P\bigl[|Y_1+\cdots+Y_d|\geqslant a\bigr]\leqslant2\exp\Bigl(-\frac{a^2}{4d}\Bigr)
[/math]

for every [math]0\leqslant a\leqslant d[/math].
Let [math]Y[/math] be one of the random variables [math]\pm Y_1,\dots,\pm Y_d[/math] and [math]0 \lt t\leqslant 1/2[/math]. Then we estimate

[math]
\E\bigl(\exp(tY)\bigr)=1+t\E(Y)+\sum_{k=2}^{\infty}\frac{t^k\E(Y^k)}{k!}\leqslant1+\sum_{k=2}^{\infty}\frac{t^k}{2}=1+\frac{t^2}{2(1-t)}\leqslant1+t^2\leqslant\exp(t^2),
[/math]

where we used [math]\E(Y)=0[/math], [math]|\E(Y^k)|\leqslant k!/2[/math] and [math]1/(1-t)\leqslant2[/math] for [math]0 \lt t\leqslant1/2[/math].
This implies

[math]
\P\bigl[\pm(Y_1+\cdots+Y_d)\geqslant a\bigr]\leqslant\exp(-ta)\cdot\E\bigl(\exp(\pm tY_1)\bigr)\cdots\E\bigl(\exp(\pm tY_d)\bigr)\leqslant\exp(dt^2-ta)
[/math]

by the generic Chernoff bound. For [math]0 \lt a\leqslant d[/math] we may choose [math]t=a/(2d)\in(0,1/2][/math], which gives [math]\exp(dt^2-ta)=\exp(-a^2/(4d))[/math]; for [math]a=0[/math] the estimate holds trivially. Adding the two estimates for [math]+[/math] and [math]-[/math] yields

[math]
\P\bigl[|Y_1+\cdots+Y_d|\geqslant a\bigr]\leqslant\P\bigl[Y_1+\cdots+Y_d\geqslant a\bigr]+\P\bigl[-(Y_1+\cdots+Y_d)\geqslant a\bigr]\leqslant2\exp\Bigl(-\frac{a^2}{4d}\Bigr)
[/math]

as claimed.
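The following is a minimal numerical sketch (in Python with NumPy; it is not part of the notes) illustrating the Theorem. Rademacher variables, i.e. [math]Y_i\in\{-1,+1\}[/math] with equal probabilities, satisfy the hypotheses since [math]\E(Y_i)=0[/math] and [math]|\E(Y_i^k)|\leqslant1\leqslant k!/2[/math] for [math]k\geqslant2[/math], so a Monte Carlo estimate of the tail can be compared with the bound derived above.

<syntaxhighlight lang="python">
import numpy as np

# Monte Carlo sanity check of the tail bound above for Rademacher variables:
# E(Y_i) = 0 and |E(Y_i^k)| <= 1 <= k!/2 for k >= 2, so the Theorem applies.
rng = np.random.default_rng(0)
d, trials = 200, 100_000
Y = rng.choice([-1.0, 1.0], size=(trials, d))   # independent Rademacher samples
S = Y.sum(axis=1)                               # S = Y_1 + ... + Y_d per trial

for a in [10, 20, 30, 40]:                      # values with 0 <= a <= d
    empirical = np.mean(np.abs(S) >= a)         # estimate of P[|Y_1+...+Y_d| >= a]
    bound = 2 * np.exp(-a**2 / (4 * d))         # bound from the Theorem above
    print(f"a = {a:3d}: empirical {empirical:.5f}  <=  bound {bound:.5f}")
</syntaxhighlight>

Note that for small values of [math]a[/math] the bound exceeds one and is therefore trivially satisfied.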
General references
Wegner, Sven-Ake (2024). "Lecture Notes on High-Dimensional Data". arXiv:2101.05841 [math.FA].