Revision as of 22:52, 22 August 2022

The delta method is a result that gives the approximate probability distribution of a function of an asymptotically normal statistical estimator, given knowledge of that estimator's limiting variance.

==Method==

While the delta method generalizes easily to a multivariate setting, the technique is more easily motivated in univariate terms. Roughly, if there is a sequence of random variables [math]X_n[/math] satisfying

[[math]]{\sqrt{n}[X_n-\theta]\,\xrightarrow{D}\,\mathcal{N}(0,\sigma^2)},[[/math]]

where [math]\theta[/math] and [math]\sigma^2[/math] are finite-valued constants and [math]\xrightarrow{D}[/math] denotes convergence in distribution, then

[[math]] {\sqrt{n}[g(X_n)-g(\theta)]\,\xrightarrow{D}\,\mathcal{N}(0,\sigma^2[g'(\theta)]^2)} [[/math]]

for any function [math]g[/math] such that [math]g'(\theta)[/math] exists and is non-zero.
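As an informal check, the univariate statement can be verified by simulation. The sketch below (assuming NumPy; the Exponential(1) population and the choice [math]g(x) = x^2[/math] are illustrative assumptions, not from the text) compares the empirical variance of [math]\sqrt{n}[g(X_n)-g(\theta)][/math] against the predicted value [math]\sigma^2[g'(\theta)]^2[/math]:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: X_n is the sample mean of n i.i.d. Exponential(1) draws,
# so theta = 1 and sigma^2 = 1; we take g(x) = x^2, giving g'(theta) = 2.
theta, sigma2 = 1.0, 1.0
def g(x): return x ** 2
def g_prime(x): return 2 * x

n, reps = 2_000, 2_000
samples = rng.exponential(scale=1.0, size=(reps, n))
x_bar = samples.mean(axis=1)

# sqrt(n) * (g(X_n) - g(theta)) should be approximately N(0, sigma^2 * g'(theta)^2)
z = np.sqrt(n) * (g(x_bar) - g(theta))
predicted = sigma2 * g_prime(theta) ** 2      # = 4
print(predicted, z.var())                     # empirical variance near the prediction
```

The empirical variance of the rescaled, transformed estimator should settle near [math]\sigma^2[g'(\theta)]^2 = 4[/math] as [math]n[/math] grows.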


The method extends to the multivariate case. By definition, a consistent estimator [math]B[/math] converges in probability to its true value [math]\beta[/math], and often a central limit theorem can be applied to obtain asymptotic normality:

[[math]]\sqrt{n} (B-\beta )\,\xrightarrow{D}\,\mathcal{N}(0, \Sigma ),[[/math]]

where [math]n[/math] is the number of observations and [math]\Sigma[/math] is a covariance matrix. The multivariate delta method yields the following asymptotic property of a function [math]h[/math] of the estimator [math]B[/math], under the assumption that the gradient [math]\nabla h(\beta)[/math] is non-zero:

[[math]]\sqrt{n}(h(B)-h(\beta))\,\xrightarrow{D}\,\mathcal{N}(0, \nabla h(\beta)^T \cdot \Sigma \cdot \nabla h(\beta)).[[/math]]
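The multivariate statement can likewise be checked numerically. In the sketch below (assuming NumPy; the bivariate normal population and the choice [math]h(b) = b_1 b_2[/math] are illustrative assumptions), the quadratic form [math]\nabla h(\beta)^T \Sigma \, \nabla h(\beta)[/math] is compared with the empirical variance of [math]\sqrt{n}(h(B)-h(\beta))[/math]:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative population: i.i.d. bivariate normal draws with mean beta and
# covariance Sigma; B is the sample mean and h(b) = b[0] * b[1].
beta = np.array([2.0, 3.0])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 2.0]])
def h(b): return b[0] * b[1]
def grad_h(b): return np.array([b[1], b[0]])  # gradient of h

# Asymptotic variance predicted by the multivariate delta method
predicted = grad_h(beta) @ Sigma @ grad_h(beta)

# Monte Carlo estimate of Var(sqrt(n) * (h(B) - h(beta)))
n, reps = 2_000, 1_000
draws = rng.multivariate_normal(beta, Sigma, size=(reps, n))  # shape (reps, n, 2)
B = draws.mean(axis=1)                                        # shape (reps, 2)
z = np.sqrt(n) * (np.array([h(b) for b in B]) - h(beta))
print(predicted, z.var())  # empirical variance should be close to `predicted`
```

Here the quadratic form evaluates to [math]20.6[/math], and the simulated variance of the rescaled statistic should concentrate near that value.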

<proofs page = "guide_proofs:3285677816" section = "proof" label = "The Delta Method" />

==References==

*{{cite web |url = https://en.wikipedia.org/w/index.php?title=Delta_method&oldid=885377245 |title = Delta method |author = Wikipedia contributors |website = Wikipedia |access-date = 30 May 2019}}