We start this chapter by explaining a general technique to obtain bounds for [math]\P\bigl[Y_1+\cdots+Y_d\geqslant a\bigr][/math] where the [math]Y_i[/math] are independent random variables. A bound that arises from this ‘recipe’ is often referred to as a Chernoff bound. The basic idea is given in the following lemma, which establishes the ‘generic’ Chernoff bound.
Let [math](\Omega,\Sigma,\P)[/math] be a probability space and let [math]Y_1,\dots,Y_n\colon\Omega\rightarrow\mathbb{R}[/math] be mutually independent random variables. Then

[math]\P\bigl[Y_1+\cdots+Y_n\geqslant a\bigr]\leqslant\exp(-ta)\cdot\prod_{i=1}^{n}\E\bigl(\exp(tY_i)\bigr)[/math]

holds for every [math]a \gt 0[/math] and every [math]t \gt 0[/math].
For fixed [math]t \gt 0[/math] the function [math]\exp(t\cdot -)[/math] is increasing. We put [math]Y=Y_1+\cdots+Y_n[/math] and get

[math]\P[Y\geqslant a]=\P\bigl[\exp(tY)\geqslant\exp(ta)\bigr]\leqslant\frac{\E(\exp(tY))}{\exp(ta)}=\exp(-ta)\cdot\prod_{i=1}^{n}\E\bigl(\exp(tY_i)\bigr),[/math]

where we applied Markov's inequality and then used that the [math]Y_i[/math] are independent, so the expectation of the product [math]\prod_{i=1}^{n}\exp(tY_i)[/math] factors.
The classical Chernoff bound recipe is now to compute or estimate the right-hand side by exploiting additional information on the [math]Y_i[/math] that is available in concrete situations. Readers unfamiliar with Chernoff bounds can find a classical example of how the recipe works in Problem, where the [math]Y_i[/math] are independent Bernoulli random variables. In Theorem below, the additional knowledge comes from the assumption that the moments of each [math]Y_i[/math] grow at most factorially.
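To make the recipe concrete, the following Python sketch (not part of the notes; NumPy and the values of n, p and a are illustrative assumptions) minimizes the generic bound [math]\exp(-ta)\cdot\prod_{i=1}^{n}\E(\exp(tY_i))[/math] over a grid of [math]t[/math]-values for a sum of independent Bernoulli variables and compares the result with a Monte Carlo estimate of the tail probability.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not from the notes):
# sum of n independent Bernoulli(p) variables, threshold a.
n, p, a = 100, 0.5, 65

# Generic Chernoff bound: inf over t > 0 of
#   exp(-t*a) * prod_i E(exp(t*Y_i)) = exp(-t*a) * (1 - p + p*exp(t))**n,
# minimized here over a grid of t-values.
ts = np.linspace(1e-3, 3.0, 1000)
bounds = np.exp(-ts * a) * (1 - p + p * np.exp(ts)) ** n
chernoff = bounds.min()

# Monte Carlo estimate of P[Y_1 + ... + Y_n >= a] for comparison.
samples = rng.binomial(n, p, size=1_000_000)
empirical = np.mean(samples >= a)

print(f"Chernoff bound : {chernoff:.3e}")
print(f"Empirical tail : {empirical:.3e}")
</syntaxhighlight>

With these values the grid minimum lands close to the optimal choice of [math]t[/math], and the printed bound exceeds the empirical tail, as it must.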
Let [math](\Omega,\Sigma,\P)[/math] be a probability space and let [math]Y_1,\dots,Y_d\colon\Omega\rightarrow\mathbb{R}[/math] be mutually independent random variables with expectation zero and [math]|\E(Y_i^k)|\leqslant k!/2[/math] for [math]i=1,\dots,d[/math] and [math]k\geqslant2[/math]. Then we have

[math]\P\bigl[|Y_1+\cdots+Y_d|\geqslant a\bigr]\leqslant2\exp\Bigl(-\frac{a^2}{4d}\Bigr)[/math]

for every [math]0 \lt a\leqslant d[/math].
Let [math]Y[/math] be one of the random variables [math]\pm Y_1,\dots,\pm Y_d[/math] and [math]0 \lt t\leqslant 1/2[/math]. Then we estimate

[math]\E(\exp(tY))=1+t\,\E(Y)+\sum_{k=2}^{\infty}\frac{t^k\E(Y^k)}{k!}\leqslant1+\sum_{k=2}^{\infty}\frac{t^k}{2}=1+\frac{t^2}{2(1-t)}\leqslant1+t^2\leqslant\exp(t^2),[/math]

where we used [math]\E(Y)=0[/math], the moment assumption, the geometric series, and [math]1-t\geqslant1/2[/math].
This implies, by the Lemma,

[math]\P\bigl[Y_1+\cdots+Y_d\geqslant a\bigr]\leqslant\exp(-ta)\cdot\prod_{i=1}^{d}\E\bigl(\exp(tY_i)\bigr)\leqslant\exp(-ta+dt^2).[/math]

Since [math]0 \lt a\leqslant d[/math], we may choose [math]t=a/(2d)\leqslant1/2[/math], which yields the bound [math]\exp(-a^2/(4d))[/math]. Applying the same argument to [math]-Y_1,\dots,-Y_d[/math] bounds [math]\P\bigl[Y_1+\cdots+Y_d\leqslant-a\bigr][/math] by the same quantity, and adding the two estimates completes the proof.
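As a numerical sanity check (again a hedged sketch, not from the notes; NumPy and the choices of d, a and the trial count are assumptions), Laplace variables with scale [math]1/\sqrt{2}[/math] satisfy the moment condition of the theorem, so a Monte Carlo experiment can compare the empirical two-sided tail with the bound [math]2\exp(-a^2/(4d))[/math].

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions); the theorem needs 0 < a <= d.
d, a, trials = 50, 20, 200_000

# Laplace variables with scale 1/sqrt(2) have mean zero, vanishing odd
# moments and E(Y^(2m)) = (2m)! * 2**(-m) <= (2m)!/2, so they satisfy
# the moment condition |E(Y_i^k)| <= k!/2 of the theorem.
Y = rng.laplace(scale=1 / np.sqrt(2), size=(trials, d))
empirical = np.mean(np.abs(Y.sum(axis=1)) >= a)

bound = 2 * np.exp(-a ** 2 / (4 * d))   # 2*exp(-a^2/(4d))
print(f"Theorem bound  : {bound:.3e}")
print(f"Empirical tail : {empirical:.3e}")
</syntaxhighlight>

The empirical tail comes out well below the printed bound, which is expected: the theorem trades tightness for generality across all distributions satisfying the moment condition.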
General references
Wegner, Sven-Ake (2024). "Lecture Notes on High-Dimensional Data". arXiv:2101.05841 [math.FA].