Mill's Inequality

Mill's inequality is a useful tail bound for standard normally distributed random variables.

Mill's Inequality: Let $Z \sim N(0,1)$. Then, for all $t > 0$,

$$\operatorname{P}(|Z| > t) \;\leq\; \sqrt{\frac{2}{\pi}}\,\frac{\exp(-t^{2}/2)}{t} \;\leq\; \frac{\exp(-t^{2}/2)}{t}.$$
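
As a quick numerical check (a minimal Python sketch; the helper names normal_tail and mills_bound are illustrative and not from the cited sources), the exact two-sided tail P(|Z| > t) equals erfc(t/√2), so it can be compared directly against the bound:

import math

def normal_tail(t: float) -> float:
    # Exact two-sided tail P(|Z| > t) for Z ~ N(0, 1), via the complementary error function.
    return math.erfc(t / math.sqrt(2.0))

def mills_bound(t: float) -> float:
    # Mill's inequality bound: sqrt(2/pi) * exp(-t^2 / 2) / t, valid for t > 0.
    return math.sqrt(2.0 / math.pi) * math.exp(-t * t / 2.0) / t

for t in (1.0, 2.0, 3.0):
    print(f"t = {t}: exact tail = {normal_tail(t):.5f}, Mill's bound = {mills_bound(t):.5f}")

For example, at t = 2 the exact tail is about 0.0455 while Mill's bound gives about 0.0540, so the bound is fairly tight for moderate t.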

The looser, right-hand bound makes the exponential decay in t explicit. Compare this with the Chernoff bound for the standard normal:

$$\operatorname{P}(|Z| > t) \;\leq\; 2\exp(-t^{2}/2).$$
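
Dividing the two bounds shows that the Chernoff bound exceeds Mill's bound by a factor of 2t·√(π/2) ≈ 2.5t, so Mill's inequality is tighter whenever t is larger than about 0.4. A minimal Python sketch comparing the two (function names are illustrative, not from the cited sources):

import math

def mills_bound(t: float) -> float:
    # Mill's inequality bound: sqrt(2/pi) * exp(-t^2 / 2) / t, for t > 0.
    return math.sqrt(2.0 / math.pi) * math.exp(-t * t / 2.0) / t

def chernoff_bound(t: float) -> float:
    # Chernoff bound for the standard normal: 2 * exp(-t^2 / 2).
    return 2.0 * math.exp(-t * t / 2.0)

for t in (1.0, 2.0, 4.0):
    print(f"t = {t}: Mill's = {mills_bound(t):.3e}, Chernoff = {chernoff_bound(t):.3e}")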

References

  1. Wasserman, Larry (2004). All of Statistics: A Concise Course in Statistical Inference. Springer Texts in Statistics. Springer. p. 65. doi:10.1007/978-0-387-21736-9. ISBN 978-1-4419-2322-6. ISSN 1431-875X.
  2. Ma, Xuezhe. "Probability Inequalities" (PDF). 10/36-705 Intermediate Statistics, Lecture Notes 2.