Theorem in probability
In mathematics, the Khintchine inequality, named after Aleksandr Khinchin and spelled in multiple ways in the Latin alphabet, is a theorem from probability, and is also frequently used in analysis.
Consider $N$ complex numbers $x_1, \dots, x_N \in \mathbb{C}$, which can be pictured as vectors in a plane. Now sample $N$ random signs $\varepsilon_1, \dots, \varepsilon_N \in \{-1, +1\}$, each equal to $+1$ or $-1$ with probability $1/2$, independently of one another. The inequality states that, for every $0 < p < \infty$,

$$A_p \left( \sum_{n=1}^{N} |x_n|^2 \right)^{1/2} \le \left( \operatorname{E} \left| \sum_{n=1}^{N} \varepsilon_n x_n \right|^p \right)^{1/p} \le B_p \left( \sum_{n=1}^{N} |x_n|^2 \right)^{1/2}$$

for some constants $A_p, B_p > 0$ depending only on $p$ (see Expected value for notation). The sharp values of the constants were found by Haagerup (Ref. 2; see Ref. 3 for a simpler proof). It is a simple matter to see that $A_p = 1$ when $p \ge 2$, and $B_p = 1$ when $0 < p \le 2$.
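The statement lends itself to a quick numerical check. Below is a minimal Monte Carlo sketch (not part of the original article); the coefficient vector, the sample size, and the values of $p$ shown are arbitrary illustration choices. It estimates $(\operatorname{E}|\sum_n \varepsilon_n x_n|^p)^{1/p}$ for a fixed $x$ and prints its ratio to $(\sum_n |x_n|^2)^{1/2}$, which by the inequality must lie between $A_p$ and $B_p$.

```python
# Minimal Monte Carlo sketch of Khintchine's inequality (illustrative choices only).
import numpy as np

rng = np.random.default_rng(0)

x = np.array([3.0 + 1.0j, -2.0, 0.5j, 1.0, -0.25])   # arbitrary complex coefficients
l2_norm = np.sqrt(np.sum(np.abs(x) ** 2))

num_samples = 200_000
# Random signs eps_n in {-1, +1}, independent and equally likely.
signs = rng.choice([-1.0, 1.0], size=(num_samples, x.size))
sums = signs @ x                                      # sum_n eps_n x_n for each sample

for p in (1.0, 2.0, 4.0):
    pth_norm = np.mean(np.abs(sums) ** p) ** (1.0 / p)
    # By Khintchine's inequality the ratio lies between A_p and B_p;
    # up to Monte Carlo error it is <= 1 for p <= 2 and >= 1 for p >= 2.
    print(f"p = {p}: (E|sum|^p)^(1/p) / l2 norm = {pth_norm / l2_norm:.4f}")
```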
The uses of this inequality are not limited to applications in probability theory. One example of its use in analysis is the following: if we let $T$ be a linear operator between two $L^p$ spaces $L^p(X, \mu)$ and $L^p(Y, \nu)$, $1 < p < \infty$, with bounded norm $\|T\| < \infty$, then one can use Khintchine's inequality to show that

$$\left\| \left( \sum_{n=1}^{N} |Tf_n|^2 \right)^{1/2} \right\|_{L^p(Y, \nu)} \le C_p \left\| \left( \sum_{n=1}^{N} |f_n|^2 \right)^{1/2} \right\|_{L^p(X, \mu)}$$

for some constant $C_p > 0$ depending only on $p$ and $\|T\|$.
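As an illustration only, the sketch below replaces the $L^p$ spaces with finite-dimensional $\ell^p$ spaces (functions on finite sets with counting measure), takes an arbitrary matrix for $T$ and random vectors for the $f_n$, and prints the two square-function norms appearing in the inequality. The dimensions m, k, N and the exponent p are made-up parameters, not anything fixed by the article.

```python
# Finite-dimensional sketch of the square-function inequality (illustrative only).
import numpy as np

rng = np.random.default_rng(1)

p = 3.0
m, k, N = 6, 4, 5
T = rng.standard_normal((k, m))          # a fixed linear operator l^p(m) -> l^p(k)
f = rng.standard_normal((N, m))          # the functions f_1, ..., f_N

def lp_norm(v, p):
    return np.sum(np.abs(v) ** p) ** (1.0 / p)

# Pointwise square functions (sum_n |f_n|^2)^(1/2) and (sum_n |T f_n|^2)^(1/2).
sq_f = np.sqrt(np.sum(f ** 2, axis=0))
Tf = f @ T.T                              # rows are T f_n
sq_Tf = np.sqrt(np.sum(Tf ** 2, axis=0))

lhs = lp_norm(sq_Tf, p)
rhs = lp_norm(sq_f, p)
print(f"||(sum |Tf_n|^2)^(1/2)||_p = {lhs:.4f}")
print(f"||(sum |f_n|^2)^(1/2)||_p  = {rhs:.4f}")
# Khintchine's inequality guarantees this ratio is bounded by a constant C_p
# depending only on p and the operator norm of T.
print(f"ratio = {lhs / rhs:.4f}")
```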
Generalizations
For the case of Rademacher random variables, Pawel Hitczenko showed that the sharpest version is:

$$A \left( \sqrt{p} \left( \sum_{n=b+1}^{N} x_n^2 \right)^{1/2} + \sum_{n=1}^{b} x_n \right) \le \left\| \sum_{n=1}^{N} \varepsilon_n x_n \right\|_p \le B \left( \sqrt{p} \left( \sum_{n=b+1}^{N} x_n^2 \right)^{1/2} + \sum_{n=1}^{b} x_n \right)$$

where $b = \lfloor p \rfloor$, and $A$ and $B$ are universal constants independent of $p$. Here we assume that the $x_n$ are non-negative and non-increasing.
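As with the basic inequality, this sharp form can be illustrated numerically. The sketch below (not from the article; the vector $x$ and the exponents are arbitrary choices) computes $\|\sum_n \varepsilon_n x_n\|_p$ exactly by enumerating all $2^N$ sign patterns for a small non-increasing, non-negative $x$, and compares it with the quantity $\sqrt{p}\,(\sum_{n>b} x_n^2)^{1/2} + \sum_{n \le b} x_n$; Hitczenko's result says the ratio of the two stays between universal constants for every $p$.

```python
# Exact enumeration sketch of Hitczenko's sharp Khintchine bound (illustrative only).
from itertools import product
import numpy as np

x = np.array([4.0, 2.0, 1.0, 0.5, 0.5, 0.25])   # non-negative and non-increasing
N = x.size

signs = np.array(list(product([-1.0, 1.0], repeat=N)))   # all 2^N sign patterns
sums = signs @ x

for p in (1.0, 3.0, 5.0):
    b = int(np.floor(p))
    pth_norm = np.mean(np.abs(sums) ** p) ** (1.0 / p)
    hitczenko = np.sqrt(p) * np.sqrt(np.sum(x[b:] ** 2)) + np.sum(x[:b])
    # The ratio lies between the universal constants A and B for every p.
    print(f"p = {p}: ||sum eps_n x_n||_p = {pth_norm:.4f}, "
          f"sqrt(p)*tail_l2 + head_l1 = {hitczenko:.4f}, "
          f"ratio = {pth_norm / hitczenko:.4f}")
```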
References
1. Thomas H. Wolff, "Lectures on Harmonic Analysis", American Mathematical Society, University Lecture Series vol. 29, 2003. ISBN 0-8218-3449-5.
2. Uffe Haagerup, "The best constants in the Khintchine inequality", Studia Math. 70 (1981), no. 3, 231–283 (1982).
3. Fedor Nazarov and Anatoliy Podkorytov, "Ball, Haagerup, and distribution functions", Complex analysis, operators, and related topics, 247–267, Oper. Theory Adv. Appl., 113, Birkhäuser, Basel, 2000.