Basic Concept of Probability Distributions 1: Binomial Distribution
PMF
If the random variable $X$ follows the binomial distribution with parameters $n$ and $p$, we write $X \sim B(n, p)$. The probability of getting exactly $x$ successes in $n$ trials is given by the probability mass function: $$f(x; n, p) = \Pr(X=x) = {n\choose x}p^{x}(1-p)^{n-x}$$ for $x=0, 1, 2, \ldots, n$, where ${n\choose x} = {n!\over(n-x)!x!}$.
Proof:
$$ \begin{align*} \sum_{x=0}^{n}f(x; n, p) &= \sum_{x=0}^{n}{n\choose x}p^{x}(1-p)^{n-x}\\ &= [p + (1-p)]^{n}\;\;\quad\quad \mbox{(binomial theorem)}\\ &= 1 \end{align*} $$
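As a quick numerical check (a minimal sketch; the values $n=10$ and $p=0.3$ are arbitrary), R's `dbinom` implements this PMF, and the probabilities indeed sum to 1. R code:
n <- 10; p <- 0.3                      # arbitrary example parameters
x <- 0:n
pmf <- choose(n, x) * p^x * (1 - p)^(n - x)
all.equal(pmf, dbinom(x, n, p))        # closed form agrees with dbinom
# [1] TRUE
sum(pmf)                               # total probability, by the binomial theorem
# [1] 1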
Mean
The expected value is $$\mu = E[X] = np$$
Proof:
$$ \begin{align*} E\left[X^k\right] &= \sum_{x=0}^{n}x^{k}{n\choose x}p^{x}(1-p)^{n-x}\\ &= \sum_{x=1}^{n}x^{k}{n\choose x}p^{x}(1-p)^{n-x}\\ &= np\sum_{x=1}^{n}x^{k-1}{n-1\choose x-1}p^{x-1}(1-p)^{n-x}\quad\quad\quad (\mbox{identity}\ x{n\choose x} = n{n-1\choose x-1})\\ &= np\sum_{y=0}^{n-1}(y+1)^{k-1}{n-1\choose y}p^{y}(1-p)^{n-1-y}\quad(\mbox{substituting}\ y=x-1)\\ &= npE\left[(Y + 1)^{k-1}\right] \quad\quad\quad \quad\quad\quad \quad\quad\quad\quad\quad (Y\sim B(n-1, p)) \\ \end{align*} $$ The identity used in the third line follows from $$ \begin{align*} x{n\choose x} &= {x\cdot n!\over(n-x)!x!}\\ & = {n!\over(n-x)!(x-1)!}\\ &= n{(n-1)!\over[(n-1)-(x-1)]!(x-1)!}\\ &= n{n-1\choose x-1} \end{align*} $$ Hence, setting $k=1$, we have $$E[X] = np$$
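A direct numerical check of $E[X]=np$ from the PMF (same arbitrary parameters as the sketch above). R code:
n <- 10; p <- 0.3
x <- 0:n
sum(x * dbinom(x, n, p))               # E[X] computed from the PMF
# [1] 3
n * p
# [1] 3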
Variance
The variance is $$\sigma^2 = \mbox{Var}(X) = np(1-p)$$
Proof:
$$ \begin{align*} \mbox{Var}(X) &= E\left[X^2\right] - E[X]^2\\ &= npE[Y+1] - n^2p^2\quad\quad\quad\quad\;\; (\mbox{moment formula above with}\ k=2)\\ & = np\left(E[Y] + 1\right) - n^2p^2\\ & = np[(n-1)p + 1] - n^2p^2\quad\quad (Y\sim B(n-1, p))\\ &= np(1-p) \end{align*} $$
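And the corresponding check of $\mbox{Var}(X)=np(1-p)$ (same arbitrary parameters). R code:
n <- 10; p <- 0.3
x <- 0:n
EX  <- sum(x   * dbinom(x, n, p))      # E[X]
EX2 <- sum(x^2 * dbinom(x, n, p))      # E[X^2]
EX2 - EX^2                             # variance from the PMF
# [1] 2.1
n * p * (1 - p)
# [1] 2.1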
Examples
1. Let $X$ be binomially distributed with parameters $n=10$ and $p={1\over2}$. Determine the expected value $\mu$, the standard deviation $\sigma$, and the probability $P\left(|X-\mu| \geq 2\sigma\right)$. Compare with Chebyshev's Inequality.
Solution:
The binomial mass function is $$f(x) ={n\choose x} p^x \cdot q^{n-x},\ x=0, 1, 2, \ldots, n$$ where $q=1-p$. The expected value and the standard deviation are $$E[X] = np=5,\ \sigma = \sqrt{npq} = 1.581139$$ The probability that $X$ deviates from $\mu$ by at least two standard deviations is $$ \begin{align*} P\left(|X-\mu| \geq 2\sigma\right) &= P\left(|X-5| \geq 3.162278\right)\\ &= P(X\leq 1) + P(X \geq9)\\ &= \sum_{x=0}^{1}{10\choose x}p^{x}(1-p)^{10-x} + \sum_{x=9}^{10}{10\choose x}p^{x}(1-p)^{10-x}\\ & = 0.02148437 \end{align*} $$ R code:
sum(dbinom(c(0, 1), 10, 0.5)) + 1 - sum(dbinom(c(0:8), 10, 0.5))
# [1] 0.02148437
pbinom(1, 10, 0.5) + 1 - pbinom(8, 10, 0.5)
# [1] 0.02148438
Chebyshev's Inequality gives the much weaker bound $$P\left(|X - \mu| \geq 2\sigma\right) \leq {1\over2^2} = 0.25$$
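The same comparison can be written directly in R (a small sketch; `floor` and `ceiling` pick out the integer cut-offs $X\leq 1$ and $X\geq 9$). R code:
n <- 10; p <- 0.5
mu <- n * p; sigma <- sqrt(n * p * (1 - p))
lo <- floor(mu - 2 * sigma)            # 1
hi <- ceiling(mu + 2 * sigma)          # 9
pbinom(lo, n, p) + pbinom(hi - 1, n, p, lower.tail = FALSE)   # exact probability
# [1] 0.02148438
1 / 2^2                                # Chebyshev upper bound
# [1] 0.25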
2. What is the probability $P_1$ of having at least six heads when tossing a coin ten times?
Solution:
$$ \begin{align*} P(X \geq 6) &= \sum_{x=6}^{10}{10\choose x}0.5^{x}0.5^{10-x}\\ &= 0.3769531 \end{align*} $$ R code:
1 - pbinom(5, 10, 0.5)
# [1] 0.3769531
sum(dbinom(c(6:10), 10, 0.5))
# [1] 0.3769531
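A Monte Carlo check of the same probability (a sketch; the seed and the number of replications are arbitrary). R code:
set.seed(1)
mean(rbinom(1e6, 10, 0.5) >= 6)        # proportion of simulated samples with at least 6 heads
# roughly 0.377, close to the exact value above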
3. What is the probability $P_2$ of having at least 60 heads when tossing a coin 100 times?
Solution:
$$ \begin{align*} P(X \geq 60) &= \sum_{x=60}^{100}{100\choose x}0.5^{x}0.5^{100-x}\\ &= 0.02844397 \end{align*} $$ R code:
1 - pbinom(59, 100, 0.5)
# [1] 0.02844397
sum(dbinom(c(60:100), 100, 0.5))
# [1] 0.02844397
Alternatively, we can use the normal approximation (generally acceptable when $np > 5$ and $n(1-p) > 5$), applying the continuity correction ($59.5$ rather than $60$). Here $\mu = np=50$ and $\sigma = \sqrt{np(1-p)} = \sqrt{25} = 5$. $$ \begin{align*} P(X \geq 60) &= 1 - P(X \leq 59)\\ &= 1- \Phi\left({59.5-50\over \sqrt{25}}\right)\\ &= 1-\Phi(1.9)\\ &= 0.02871656 \end{align*} $$ R code:
1 - pnorm(1.9)
# [1] 0.02871656
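For comparison, a small sketch showing the effect of the continuity correction (the uncorrected version simply standardizes $60$ directly). R code:
n <- 100; p <- 0.5
mu <- n * p; sigma <- sqrt(n * p * (1 - p))
pbinom(59, n, p, lower.tail = FALSE)   # exact
# [1] 0.02844397
1 - pnorm((59.5 - mu) / sigma)         # normal approximation with continuity correction
# [1] 0.02871656
1 - pnorm((60 - mu) / sigma)           # without the correction
# [1] 0.02275013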
4. What is the probability $P_3$ of having at least 600 heads when tossing a coin 1000 times?
Solution: $$ \begin{align*} P(X \geq 600) &= \sum_{x=600}^{1000}{1000\choose x} 0.5^{x} 0.5^{1000-x}\\ &= 1.364232\times10^{-10} \end{align*} $$ R code:
sum(dbinom(c(600:1000), 1000, 0.5))
# [1] 1.364232e-10
Alternatively, we can use the normal approximation, again with the continuity correction. Here $\mu = np=500$ and $\sigma = \sqrt{np(1-p)} = \sqrt{250}$. $$ \begin{align*} P(X \geq 600) &= 1 - P(X \leq 599)\\ &= 1- \Phi\left({599.5-500\over \sqrt{250}}\right)\\ &= 1.557618 \times 10^{-10} \end{align*} $$ R code:
1 - pnorm(99.5/sqrt(250))
# [1] 1.557618e-10
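Deep in the tail the normal approximation is noticeably off in relative terms, which is easy to quantify (a sketch). R code:
exact  <- pbinom(599, 1000, 0.5, lower.tail = FALSE)
approx <- pnorm((599.5 - 500) / sqrt(250), lower.tail = FALSE)
(approx - exact) / exact               # relative error of the approximation
# roughly 0.14, i.e. the approximation is about 14% too large here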
References
- Ross, S. (2010). A First Course in Probability (8th Edition). Chapter 4. Pearson. ISBN: 978-0-13-603313-4.
- Brink, D. (2010). Essentials of Statistics: Exercises. Chapter 5 & 8. ISBN: 978-87-7681-409-0.