Is there a reverse Chernoff bound showing that the tail probability is at least so large?
That is, if $X_1, X_2, \ldots, X_n$ are independent 0/1 random variables and $\mu = \mathbb{E}[\sum_i X_i]$, can we lower-bound $\Pr[\sum_i X_i \ge (1+\delta)\mu]$ by some function of $\mu$, $\delta$ and $n$?
Answers:
Here is an explicit proof that a standard Chernoff bound is tight up to constant factors in the exponent for a particular range of the parameters. (Specifically, whenever the variables are 0 or 1, each is 1 with probability 1/2 or less, and $\epsilon \in (0, 1/2)$.)
If you find a mistake, please let me know.
Lemma 1. (tightness of the Chernoff bound)
Let $X$ be the average of $k$ independent 0/1 random variables (r.v.). For any $\epsilon \in (0,1/2]$ and $p \in (0,1/2]$, assuming $\epsilon^2 pk \ge 3$:
(i) If each r.v. is 1 with probability at most $p$, then $\Pr[X \le (1-\epsilon)p] \ \ge\ \exp(-9\epsilon^2 pk)$.
(ii) If each r.v. is 1 with probability at least $p$, then $\Pr[X \ge (1+\epsilon)p] \ \ge\ \exp(-9\epsilon^2 pk)$.
Proof. We use the following observation:
Claim 1. If $1 \le \ell \le k-1$, then
$$\binom{k}{\ell} \ \ge\ \frac{1}{e\sqrt{2\pi\ell}}\,\Big(\frac{k}{\ell}\Big)^{\ell}\Big(\frac{k}{k-\ell}\Big)^{k-\ell}.$$
Proof of Claim 1.
By Stirling's approximation, $i! = \sqrt{2\pi i}\,(i/e)^i e^{\lambda}$ where $\lambda \in [\frac{1}{12i+1}, \frac{1}{12i}]$.

Thus $\binom{k}{\ell}$, which is $\frac{k!}{\ell!\,(k-\ell)!}$, is at least
$$\frac{\sqrt{2\pi k}\,(k/e)^k}{\sqrt{2\pi \ell}\,(\ell/e)^{\ell}\ \sqrt{2\pi(k-\ell)}\,\big((k-\ell)/e\big)^{k-\ell}}\; e^{-1/6}
\ \ge\ \frac{1}{e\sqrt{2\pi\ell}}\,\Big(\frac{k}{\ell}\Big)^{\ell}\Big(\frac{k}{k-\ell}\Big)^{k-\ell},$$
using $\sqrt{k/(k-\ell)} \ge 1$ and $e^{-1/6} \ge 1/e$ to absorb the error terms. QED
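As a quick numerical sanity check of Claim 1 (with the $1/(e\sqrt{2\pi\ell})$ constant as reconstructed above), the following Python snippet verifies the bound at a few arbitrary $(k,\ell)$ pairs.

```python
# Numerical check of Claim 1's Stirling-based lower bound on C(k, l).
from math import comb, e, pi, sqrt

def claim1_lower_bound(k, l):
    return (1.0 / (e * sqrt(2 * pi * l))) * (k / l)**l * (k / (k - l))**(k - l)

for k, l in [(10, 3), (50, 10), (200, 75), (1000, 999)]:
    assert comb(k, l) >= claim1_lower_bound(k, l)
print("Claim 1 bound holds on the sample points.")
```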
Proof of Lemma 1 Part (i).
Without loss of generality assume each 0/1 random variable in the sum $X$ is 1 with probability exactly $p$. Note $\Pr[X \le (1-\epsilon)p] = \sum_{i=0}^{\lfloor(1-\epsilon)pk\rfloor} \Pr[X = i/k]$, and $\Pr[X = i/k] = \binom{k}{i} p^i (1-p)^{k-i}$.

Fix $\ell = \lfloor(1-2\epsilon)pk\rfloor + 1$. The terms in the sum are increasing, so the terms with index $i \ge \ell$ each have value at least $\Pr[X = \ell/k]$, and there are at least $\epsilon pk - 2$ of them. The assumptions $\epsilon^2 pk \ge 3$ and $\epsilon \le 1/2$ give $\epsilon pk \ge 6$, so $\epsilon pk - 2 \ge \frac{2}{3}\epsilon pk$ and therefore
$$\Pr[X \le (1-\epsilon)p] \ \ge\ \tfrac{2}{3}\epsilon pk\,\binom{k}{\ell}p^{\ell}(1-p)^{k-\ell} \ \ge\ A\,B,$$
where, by Claim 1, $A = \frac{2}{3}\epsilon pk \big/ \big(e\sqrt{2\pi\ell}\big)$ and $B = (k/\ell)^{\ell}\,\big(k/(k-\ell)\big)^{k-\ell}\, p^{\ell}(1-p)^{k-\ell}$.

To finish we show $A \ge \exp(-\epsilon^2 pk)$ and $B \ge \exp(-8\epsilon^2 pk)$.
Claim 2. $A \ge \exp(-\epsilon^2 pk)$.
Proof of Claim 2.
The assumptions $\epsilon^2 pk \ge 3$ and $\epsilon \le 1/2$ imply (i) $pk \ge 12$.
By definition, $\ell \le pk + 1$, which with (i) gives (ii) $\ell \le 1.1\,pk$.
Substituting the right-hand side of (ii) for $\ell$ in the definition of $A$ gives (iii) $A \ge \frac{2}{3}\epsilon pk\big/\big(e\sqrt{2.2\pi pk}\big)$.
The assumption $\epsilon^2 pk \ge 3$ gives $\epsilon\sqrt{pk} \ge \sqrt{3}$, which with (iii) and (i) gives (iv) $A \ge \frac{2}{3}\sqrt{3}\,\sqrt{12}\big/\big(e\sqrt{2.2\pi}\big) > 1/2$.
From $\epsilon^2 pk \ge 3$ it also follows that (v) $\exp(-\epsilon^2 pk) \le e^{-3} < 1/2$.
(iv) and (v) together give the claim. QED
Claim 3. $B \ge \exp(-8\epsilon^2 pk)$.
Proof of Claim 3.
Fix $\delta \in [0, 2\epsilon]$ such that $\ell = (1-\delta)pk$. The choice of $\ell$ guarantees such a $\delta$ exists: $\ell > (1-2\epsilon)pk$ forces $\delta < 2\epsilon$, and $\ell \le pk$ (since $2\epsilon pk \ge 1$) forces $\delta \ge 0$. Substituting $\ell = (1-\delta)pk$ into the definition of $B$ gives
$$\ln B \ =\ -\ell\ln(1-\delta) \;-\; (k-\ell)\ln\Big(1 + \frac{\delta p}{1-p}\Big)
\ \ge\ \ell\delta \;-\; (k-\ell)\frac{\delta p}{1-p}
\ =\ -\frac{\delta^2 pk}{1-p}
\ \ge\ -2\delta^2 pk \ \ge\ -8\epsilon^2 pk,$$
using $-\ln(1-z) \ge z$, $\ln(1+z) \le z$, and $p \le 1/2$. QED
Claims 2 and 3 imply $AB \ge \exp(-\epsilon^2 pk)\exp(-8\epsilon^2 pk) = \exp(-9\epsilon^2 pk)$, which completes the proof of Part (i). QED
Proof of Lemma 1 Part (ii).
Without loss of generality assume each random variable is 1 with probability exactly $p$.
Note $\Pr[X \ge (1+\epsilon)p] = \sum_{i=\lceil(1+\epsilon)pk\rceil}^{k} \Pr[X = i/k]$.
The first $\epsilon pk - 2 \ge \frac{2}{3}\epsilon pk$ terms of this sum, namely those with index $i \le \ell' = \lceil(1+2\epsilon)pk\rceil - 1$, each have value at least $\Pr[X = \ell'/k]$ (the terms are decreasing, since every index is above the mean $pk$). Repeating the calculations of Claims 2 and 3 with $\ell'$ in place of $\ell$, and with $\delta \in [0,2\epsilon]$ defined by $\ell' = (1+\delta)pk$, again gives $\Pr[X \ge (1+\epsilon)p] \ge \exp(-9\epsilon^2 pk)$. QED
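The lemma is easy to check numerically. The sketch below (my addition, not part of the proof) computes the exact lower-tail probability of a sum of $k$ independent Bernoulli($p$) variables and compares it with $\exp(-9\epsilon^2 pk)$; the parameter triples are arbitrary choices satisfying the hypotheses $\epsilon, p \le 1/2$ and $\epsilon^2 pk \ge 3$.

```python
# Sanity check of Lemma 1(i): Pr[X <= (1-eps)p] >= exp(-9 eps^2 p k)
# for X the average of k i.i.d. Bernoulli(p) variables (exact binomial tail).
from math import comb, exp, floor

def lower_tail(k, p, eps):
    """Exact Pr[Bin(k, p) <= floor((1 - eps) * p * k)]."""
    cutoff = floor((1 - eps) * p * k)
    return sum(comb(k, i) * p**i * (1 - p)**(k - i) for i in range(cutoff + 1))

for k, p, eps in [(200, 0.5, 0.25), (400, 0.25, 0.3), (1000, 0.1, 0.25)]:
    assert eps**2 * p * k >= 3                     # hypothesis of the lemma
    lhs = lower_tail(k, p, eps)
    rhs = exp(-9 * eps**2 * p * k)
    print(f"k={k}, p={p}, eps={eps}:  Pr = {lhs:.3e}  >=  exp(-9eps^2pk) = {rhs:.3e}")
```

Unsurprisingly, the exact tail is much larger than $\exp(-9\epsilon^2 pk)$ at these settings; the lemma only claims tightness of the Chernoff exponent up to constant factors.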
The Berry-Esseen theorem can give tail probability lower bounds, as long as they are higher than $n^{-1/2}$.
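For concreteness, here is a rough sketch (my addition) of what such a lower bound looks like for a sum of i.i.d. Bernoulli($p$) variables: Berry-Esseen gives $\Pr[S_n \ge t] \ge 1-\Phi\big(\frac{t-np}{\sigma\sqrt n}\big) - \frac{C\rho}{\sigma^3\sqrt n}$, which is useful exactly when the first term is well above the $n^{-1/2}$ error term. The constant $C = 0.4748$ is one published value for the i.i.d. case; the parameters below are arbitrary.

```python
# Berry-Esseen style tail lower bound for a sum S_n of i.i.d. Bernoulli(p):
#   Pr[S_n >= t] >= 1 - Phi((t - n*p) / (sigma*sqrt(n))) - C*rho / (sigma^3 * sqrt(n)).
from math import erf, sqrt

def Phi(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

def be_tail_lower_bound(n, p, t, C=0.4748):        # C: a published i.i.d. Berry-Esseen constant
    sigma = sqrt(p * (1 - p))
    rho = p * (1 - p) * ((1 - p)**2 + p**2)        # E|X_1 - p|^3 for Bernoulli(p)
    z = (t - n * p) / (sigma * sqrt(n))
    return (1 - Phi(z)) - C * rho / (sigma**3 * sqrt(n))

# One-standard-deviation deviation for n = 10^4, p = 1/2:
n, p = 10_000, 0.5
t = n * p + sqrt(n * p * (1 - p))
print(be_tail_lower_bound(n, p, t))                # ~0.154; the true tail is ~0.16
```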
Another tool you can use is the Paley-Zygmund inequality. It implies that for any even integer $k$ and any real-valued random variable $X$,
$$\Pr\Big[\,|X| \ge \tfrac{1}{2}\big(\mathbb{E}[X^k]\big)^{1/k}\Big] \ \ge\ \frac{\mathbb{E}[X^k]^2}{4\,\mathbb{E}[X^{2k}]}.$$
Together with the multinomial theorem, which lets you compute (or lower-bound) the moments $\mathbb{E}[X^k]$ and $\mathbb{E}[X^{2k}]$ when $X$ is a sum of independent random variables, this yields explicit tail lower bounds.
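Here is a small self-contained illustration (my addition) for $X$ a sum of $n$ independent $\pm 1$ signs, with the moments computed exactly from the binomial distribution rather than via the multinomial theorem; the choices $n = 100$ and $k = 4$ are arbitrary.

```python
# Paley-Zygmund tail lower bound
#   Pr[|X| >= (1/2) * E[X^k]^(1/k)]  >=  E[X^k]^2 / (4 * E[X^{2k}])
# for X = sum of n i.i.d. +-1 signs, i.e. X = 2*Bin(n, 1/2) - n.
from math import comb

def moment(n, k):
    """Exact E[X^k] for X = 2*Bin(n, 1/2) - n."""
    return sum(comb(n, j) * (2 * j - n)**k for j in range(n + 1)) / 2**n

def tail(n, t):
    """Exact Pr[|X| >= t]."""
    return sum(comb(n, j) for j in range(n + 1) if abs(2 * j - n) >= t) / 2**n

n, k = 100, 4                                      # k must be even so that X^k >= 0
threshold = 0.5 * moment(n, k)**(1 / k)
pz_bound = moment(n, k)**2 / (4 * moment(n, 2 * k))
print(f"threshold = {threshold:.2f},  exact tail = {tail(n, threshold):.4f},  PZ bound = {pz_bound:.4f}")
```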
If you are indeed okay with bounding sums of Bernoulli trials (and not, say, bounded random variables), the following is pretty tight.
Slud's Inequality*. Let $\{X_i\}_{i=1}^n$ be i.i.d. draws from a Bernoulli r.v. with $\mathbb{E}[X_1] = p$, and let an integer $k \le n$ be given. If either (a) $p \le 1/4$ and $np \le k$, or (b) $np \le k \le n(1-p)$, then
$$\Pr\Big[\sum_i X_i \ge k\Big] \ \ge\ 1 - \Phi\!\left(\frac{k - np}{\sqrt{np(1-p)}}\right),$$
where $\Phi$ is the cdf of a standard normal.
(The argument to $\Phi$ is just the deviation of $k$ from the mean $np$, measured in standard deviations.) From here, you can use standard bounds on the Gaussian tail $1-\Phi(\cdot)$ to get an explicit lower bound.
Other than that, and what other people have said, you can also try using the Binomial directly, perhaps with some Stirling.
(*) Some newer statements of Slud's inequality leave out some of these conditions; I've reproduced the one in Slud's paper.
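A quick numerical comparison (my addition) of Slud's bound with the exact binomial upper tail, under condition (b); the parameters $n = 500$, $p = 0.3$ are arbitrary (so $np = 150$ and $n(1-p) = 350$).

```python
# Slud's inequality:  Pr[sum_i X_i >= k]  >=  1 - Phi((k - np) / sqrt(np(1-p)))
# compared against the exact binomial tail, for np <= k <= n(1-p).
from math import comb, erf, sqrt

def Phi(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

def binom_upper_tail(n, p, k):
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n, p = 500, 0.3
for k in (150, 170, 190, 210):                     # all satisfy np <= k <= n(1-p)
    exact = binom_upper_tail(n, p, k)
    slud = 1 - Phi((k - n * p) / sqrt(n * p * (1 - p)))
    print(f"k={k}:  exact = {exact:.4e}  >=  Slud bound = {slud:.4e}")
```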
The de Moivre-Laplace Theorem shows that variables like $|T \cap S_1|$ are asymptotically normally distributed, which already gives constant-probability lower bounds for deviations on the order of one standard deviation. For lower bounds like $n^{-c}$ you need deviations of order $\sqrt{\log n}$ standard deviations, which is still within the range where the normal approximation is accurate.
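A small illustration (my addition, with arbitrary parameters $n = 4000$, $p = 1/2$) of the normal approximation at the $\sqrt{\log n}$-standard-deviation scale relevant for $n^{-c}$-type lower bounds:

```python
# de Moivre-Laplace in action: the normal approximation to a binomial upper tail
# compared with the exact value at deviations of about sqrt(2c*log n) standard
# deviations, i.e. where the tail probability is roughly n^{-c} (up to lower-order factors).
from math import comb, erf, sqrt, log

def Phi(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

n, p = 4000, 0.5
sigma = sqrt(n * p * (1 - p))
for c in (1, 2, 3):
    t = int(n * p + sqrt(2 * c * log(n)) * sigma)
    exact = sum(comb(n, i) for i in range(t, n + 1)) / 2**n
    approx = 1 - Phi((t - n * p) / sigma)
    print(f"c={c}:  exact = {exact:.3e},  normal approximation = {approx:.3e}")
```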
The Generalized Littlewood-Offord Theorem isn't exactly what you want, but it gives what I think of as a "reverse Chernoff" bound by showing that the sum of random variables is unlikely to fall within a small range around any particular value (including the expectation). Perhaps it will be useful.
Formally, the theorem is as follows.
Generalized Littlewood-Offord Theorem: Let $a_1, \ldots, a_n$ be real numbers with $|a_i| \ge 1$ for all $i$, and let $\epsilon_1, \ldots, \epsilon_n$ be independent uniformly random $\pm 1$ signs. Then for every real $x$,
$$\Pr\Big[\sum_i \epsilon_i a_i \in (x-1,\, x+1)\Big] \ \le\ \binom{n}{\lfloor n/2\rfloor}\Big/ 2^{\,n} \ =\ O\big(n^{-1/2}\big).$$
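For intuition, here is an exhaustive check (my addition) of the bound for one arbitrary choice of coefficients with $n = 16$; it enumerates all $2^n$ sign patterns, so it is only meant for small $n$.

```python
# Exhaustive check of the Littlewood-Offord anti-concentration bound:
# for |a_i| >= 1 and uniform +-1 signs, no open interval of length 2 captures
# more than C(n, n/2) / 2^n of the probability mass of sum_i a_i * e_i.
from bisect import bisect_left
from itertools import product
from math import comb

n = 16
a = [1.0 + 0.3 * i for i in range(n)]              # arbitrary coefficients, all >= 1
sums = sorted(sum(ai * ei for ai, ei in zip(a, signs))
              for signs in product((-1, 1), repeat=n))
bound = comb(n, n // 2) / 2**n

# The densest open interval of length 2 can be taken to start at an achieved sum,
# so scan intervals [s, s + 2); a tiny slack guards against float boundary effects.
worst = max(bisect_left(sums, s + 2 - 1e-9) - i for i, s in enumerate(sums)) / 2**n
print(f"max interval mass = {worst:.4f}  <=  C(n, n/2)/2^n = {bound:.4f}")
```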
The exponent in the standard Chernoff bound as it is stated on Wikipedia is tight for 0/1-valued random variables. Let $0 < p < q < 1$ and let $X_1, \ldots, X_n$ be independent 0/1 random variables, each with $\Pr[X_i = 1] = p$. Then (taking $qn$ to be an integer for simplicity)
$$\frac{2^{-D(q\|p)\,n}}{n+1} \ \le\ \Pr\Big[\sum_{i=1}^n X_i \ge qn\Big] \ \le\ 2^{-D(q\|p)\,n}.$$
Here, $D(x\|y) = x\log_2(x/y) + (1-x)\log_2\big((1-x)/(1-y)\big)$ is the Kullback-Leibler divergence (in bits) between Bernoulli distributions with parameters $x$ and $y$.
As mentioned, the upper bound in the inequality above is proved on Wikipedia (https://en.wikipedia.org/wiki/Chernoff_bound) under the name "Chernoff-Hoeffding Theorem, additive form". The lower bound can be proved using e.g. the "method of types". See Lemma II.2 in [1]. Also, this is covered in the classic textbook on information theory by Cover and Thomas.
[1] Imre Csiszár: The Method of Types. IEEE Transactions on Information Theory (1998). http://dx.doi.org/10.1109/18.720546
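As a sanity check of the two-sided estimate above (my addition), the following compares the exact binomial tail with $2^{-D(q\|p)n}$ and $2^{-D(q\|p)n}/(n+1)$ for one arbitrary parameter setting in which $qn$ is an integer.

```python
# Check of  2^{-n D(q||p)} / (n+1)  <=  Pr[sum_i X_i >= qn]  <=  2^{-n D(q||p)}
# for i.i.d. Bernoulli(p) variables, with D measured in bits.
from math import comb, log2

def D(x, y):
    return x * log2(x / y) + (1 - x) * log2((1 - x) / (1 - y))

def upper_tail(n, p, k):
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n, p, q = 200, 0.3, 0.45
k = round(q * n)                                   # qn = 90 is an integer here
tail = upper_tail(n, p, k)
ub = 2 ** (-n * D(q, p))
lb = ub / (n + 1)
print(f"{lb:.3e}  <=  {tail:.3e}  <=  {ub:.3e}")
```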