Dvoretzky–Kiefer–Wolfowitz inequality
In the theory of probability and statistics, the Dvoretzky–Kiefer–Wolfowitz inequality bounds how close an empirically determined distribution function will be to the distribution function from which the empirical samples are drawn. It is named after Aryeh Dvoretzky, Jack Kiefer, and Jacob Wolfowitz, who in 1956 proved the inequality with an unspecified multiplicative constant C in front of the exponent on the right-hand side.^{[1]} In 1990, Pascal Massart proved the inequality with the sharp constant C = 2,^{[2]} confirming a conjecture due to Birnbaum and McCarty.^{[3]}
The DKW inequality
Given a natural number n, let X_{1}, X_{2}, …, X_{n} be real-valued independent and identically distributed random variables with cumulative distribution function F(·). Let F_{n} denote the associated empirical distribution function defined by

$$F_n(x) = \frac{1}{n} \sum_{i=1}^{n} \mathbf{1}_{\{X_i \le x\}}, \qquad x \in \mathbb{R}.$$

So F(x) is the probability that a single random variable X is at most x, and F_{n}(x) is the fraction of the random variables X_{1}, X_{2}, …, X_{n} that are at most x.
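The empirical distribution function above is straightforward to compute. The following is a minimal sketch in plain Python (the function names are illustrative, not from any particular library): sorting the sample once lets each evaluation of F_{n} count the points at most x by binary search.

```python
from bisect import bisect_right

def empirical_cdf(samples):
    """Return F_n for the given sample: the fraction of points <= x."""
    ordered = sorted(samples)
    n = len(ordered)

    def F_n(x):
        # Number of sample points <= x, divided by the sample size n.
        return bisect_right(ordered, x) / n

    return F_n

F5 = empirical_cdf([3.1, 0.4, 2.2, 5.0, 1.7])
# F5(2.2) counts the three points 0.4, 1.7, 2.2, giving 3/5 = 0.6.
```

By construction F_{n} is a right-continuous step function that jumps by 1/n at each (distinct) sample point.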
The Dvoretzky–Kiefer–Wolfowitz inequality bounds the probability that the random function F_{n} differs from F by more than a given constant ε > 0 anywhere on the real line. More precisely, there is the one-sided estimate

$$\Pr\Bigl(\sup_{x\in\mathbb{R}} \bigl(F_n(x) - F(x)\bigr) > \varepsilon\Bigr) \le e^{-2n\varepsilon^2} \qquad \text{for every } \varepsilon \ge \sqrt{\tfrac{1}{2n}\ln 2},$$

which also implies a two-sided estimate^{[4]}

$$\Pr\Bigl(\sup_{x\in\mathbb{R}} \bigl|F_n(x) - F(x)\bigr| > \varepsilon\Bigr) \le 2e^{-2n\varepsilon^2} \qquad \text{for every } \varepsilon > 0.$$
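Setting the two-sided bound 2e^{−2nε²} equal to a target error probability α and solving for ε gives ε = √(ln(2/α)/(2n)), a distribution-free confidence-band half-width around F_{n}. A small sketch of this inversion (the function name is illustrative):

```python
import math

def dkw_epsilon(n, alpha):
    """Half-width eps such that sup_x |F_n(x) - F(x)| <= eps holds
    with probability at least 1 - alpha, by the two-sided DKW bound."""
    # Solve 2 * exp(-2 * n * eps**2) = alpha for eps.
    return math.sqrt(math.log(2.0 / alpha) / (2.0 * n))

# With n = 1000 samples, a 95% band is roughly +/- 0.043 around F_n.
eps = dkw_epsilon(1000, 0.05)
```

The band [F_{n}(x) − ε, F_{n}(x) + ε] then covers F(x) simultaneously for all x with probability at least 1 − α, and shrinks at the rate 1/√n.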
This strengthens the Glivenko–Cantelli theorem by quantifying the rate of convergence as n tends to infinity. It also estimates the tail probability of the Kolmogorov–Smirnov statistic. The inequalities above follow from the case where F is the uniform distribution on [0,1], in view of the fact^{[5]} that F_{n} has the same distribution as G_{n}(F), where G_{n} is the empirical distribution of U_{1}, U_{2}, …, U_{n}, where these are independent and Uniform(0,1), and noting that

$$\sup_{x\in\mathbb{R}} \bigl|G_n(F(x)) - F(x)\bigr| \le \sup_{0\le t\le 1} \bigl|G_n(t) - t\bigr|,$$

with equality if and only if F is continuous.
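Since the bound reduces to the uniform case, it can be sanity-checked by simulation. The following sketch (pure standard library; names and parameter choices are illustrative) estimates the probability that the Kolmogorov–Smirnov statistic sup_t |G_{n}(t) − t| exceeds ε for Uniform(0,1) samples and compares it with 2e^{−2nε²}:

```python
import math
import random

def ks_statistic_uniform(n, rng):
    """sup_t |G_n(t) - t| for n independent Uniform(0,1) draws."""
    u = sorted(rng.random() for _ in range(n))
    # The supremum is attained at the jumps, i.e. at the order statistics.
    return max(max((i + 1) / n - u[i], u[i] - i / n) for i in range(n))

rng = random.Random(0)  # fixed seed for reproducibility
n, eps, trials = 50, 0.2, 2000
exceed = sum(ks_statistic_uniform(n, rng) > eps for _ in range(trials)) / trials
bound = 2 * math.exp(-2 * n * eps ** 2)  # = 2e^{-4}, about 0.0366
# Up to Monte Carlo noise, `exceed` should not go above `bound`.
```

Massart's constant C = 2 is sharp, so for moderate n the empirical exceedance frequency sits close to, but below, the bound.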
See also
 Concentration inequality – a summary of bounds on sets of random variables.
References
 ^ Dvoretzky, A.; Kiefer, J.; Wolfowitz, J. (1956), "Asymptotic minimax character of the sample distribution function and of the classical multinomial estimator", Annals of Mathematical Statistics, 27 (3): 642–669, doi:10.1214/aoms/1177728174, MR 0083864
 ^ Massart, P. (1990), "The tight constant in the Dvoretzky–Kiefer–Wolfowitz inequality", The Annals of Probability, 18 (3): 1269–1283, doi:10.1214/aop/1176990746, MR 1062069
 ^ Birnbaum, Z. W.; McCarty, R. C. (1958), "A distribution-free upper confidence bound for Pr{Y < X}, based on independent samples of X and Y", Annals of Mathematical Statistics, 29 (2): 558–562, doi:10.1214/aoms/1177706631, MR 0093874, Zbl 0087.34002
 ^ Kosorok, M.R. (2008), "Chapter 11: Additional Empirical Process Results", Introduction to Empirical Processes and Semiparametric Inference, Springer, p. 210, ISBN 9780387749778
 ^ Shorack, G.R.; Wellner, J.A. (1986), Empirical Processes with Applications to Statistics, Wiley, ISBN 047186725X