Term
Sample Mean & Sample Standard Deviation |
|
Definition
μ = (1/n) Σ[i=1,n] x_i; σ = √( (1/(n-1)) Σ[i=1,n] (x_i - x_bar)^2 ) |
|
|
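These two formulas can be sanity-checked in Python; this is a minimal sketch with an arbitrary example dataset, using the n-1 (sample) divisor the card specifies:

```python
from math import sqrt
from statistics import mean, stdev

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # arbitrary example values
n = len(data)

mu = sum(data) / n                                    # μ = (1/n) Σ x_i
s = sqrt(sum((x - mu) ** 2 for x in data) / (n - 1))  # √((1/(n-1)) Σ (x_i - x_bar)^2)
```

Both agree with the standard library's statistics.mean and statistics.stdev (which also uses the n-1 divisor).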
Term
De Morgan's Laws |
|
Definition
1) Complement(A∩B) = C(A)∪C(B) 2) Complement(A∪B) = C(A)∩C(B) |
|
|
Term
Distributive Laws |
|
Definition
3) A∩(B∪C) = (A∩B)∪(A∩C) 4) A∪(B∩C) = (A∪B)∩(A∪C) |
|
|
Term
Axioms of Probability |
|
Definition
1) P(A) ≥ 0 2) P(S) = 1 3) For pairwise mutually exclusive events A1, A2, …, An: P(A1∪A2∪…∪An) = Σ[i=1,n] P(Ai) |
|
|
Term
Permutations |
|
Definition
# of ways of ordering n distinct objects taken r at a time: P[n,r] = n!/(n-r)! |
|
|
Term
Combinations |
|
Definition
# of combos of n objects selected r at a time: C[n,r] = (n choose r) = n!/(r!(n-r)!) |
|
|
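Python's math module provides both counts directly (math.perm and math.comb, Python 3.8+); a quick check of the factorial formulas against them, with n = 5, r = 3 chosen arbitrarily:

```python
from math import comb, factorial, perm

n, r = 5, 3
p_nr = factorial(n) // factorial(n - r)                   # P[n,r] = n!/(n-r)!
c_nr = factorial(n) // (factorial(r) * factorial(n - r))  # C[n,r] = n!/(r!(n-r)!)
```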
Term
Multinomial Coefficients (Partitioning) |
|
Definition
The # of ways of partitioning n distinct objects into k distinct groups containing n1, n2, …, nk objects, respectively, where each object appears in exactly one group and Σ[i=1,k] ni = n, is N = (n choose n1, n2, …, nk) = n!/(n1! n2! … nk!) |
|
|
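A sketch of the partition count as a small helper (the function name multinomial is my own for illustration, not from any library):

```python
from math import factorial

def multinomial(n, group_sizes):
    """N = n! / (n1! n2! ... nk!); the group sizes must sum to n."""
    if sum(group_sizes) != n:
        raise ValueError("group sizes must sum to n")
    result = factorial(n)
    for size in group_sizes:
        result //= factorial(size)
    return result
```

For example, 10 distinct objects split into groups of 5, 3, and 2 can be arranged in 10!/(5!·3!·2!) = 2520 ways.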
Term
Multiplicative and Additive Laws |
|
Definition
P(A∩B) = P(A)P(B|A) [if A, B are independent, = P(A)P(B)]; P(A∪B) = P(A) + P(B) - P(A∩B) [if A, B are mutually exclusive, = P(A) + P(B)] |
|
|
Term
Law of Total Probability |
|
Definition
For a partition B1, …, Bk of the sample space with P(Bi) > 0: P(A) = Σ[i=1,k] P(A|Bi)P(Bi) |
|
|
Term
Bayes' Rule |
|
Definition
P(Bj|A) = P(A|Bj)P(Bj) / (Σ[i=1,k] P(A|Bi)P(Bi)) |
|
|
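The two cards above combine naturally: the total-probability sum is exactly the denominator in Bayes' rule. A sketch with hypothetical numbers (a two-event partition with priors 0.6/0.4 and likelihoods 0.2/0.5, chosen only for illustration):

```python
priors = [0.6, 0.4]       # P(B1), P(B2): a partition of S (hypothetical values)
likelihoods = [0.2, 0.5]  # P(A|B1), P(A|B2) (hypothetical values)

# Law of total probability: P(A) = Σ P(A|Bi) P(Bi)
p_a = sum(l * b for l, b in zip(likelihoods, priors))

# Bayes' rule: P(B1|A) = P(A|B1) P(B1) / P(A)
posterior_b1 = likelihoods[0] * priors[0] / p_a
```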
Term
Expected Value & Variance (discrete) |
|
Definition
E(Y) = μ = Σ[y] y p(y); V(Y) = σ^2 = E[(Y-μ)^2] = E[Y^2] - μ^2 |
|
|
Term
Moments |
|
Definition
Moments: a set of numerical measures that describe, and under certain conditions uniquely determine, p(y). The kth moment of the RV Y taken around the origin is E(Y^k) and is written μ'_k |
|
|
Term
Skewness (& Kurtosis) |
|
Definition
Skewness: γ_1 = E[(Y-μ)^3]/σ^3 (γ_1 can be positive, negative, or 0 for a symmetric distribution); kurtosis replaces the exponent 3 with 4 |
|
|
Term
Binomial Experiment (conditions) |
|
Definition
1) n identical, fixed trials 2) each trial results in 1 of 2 possible outcomes 3) probability of success is p, of failure 1-p (aka q), constant across trials 4) trials are independent 5) the R.V. Y is the # of successes |
|
|
Term
Binomial Distribution |
|
Definition
P(Y=y) = (n choose y) p^y q^(n-y), y = 0, 1, …, n; E(Y) = np; V(Y) = npq |
|
|
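A sketch of the binomial pmf using math.comb, checking that the probabilities sum to 1 and that the mean and variance match np and npq (n = 10, p = 0.3 are arbitrary example values):

```python
from math import comb

def binom_pmf(y, n, p):
    # P(Y=y) = (n choose y) p^y (1-p)^(n-y)
    return comb(n, y) * p**y * (1 - p)**(n - y)

n, p = 10, 0.3
total = sum(binom_pmf(y, n, p) for y in range(n + 1))
mean_y = sum(y * binom_pmf(y, n, p) for y in range(n + 1))
var_y = sum(y**2 * binom_pmf(y, n, p) for y in range(n + 1)) - mean_y**2
```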
Term
Properties of a Distribution Function F(y) |
|
Definition
1) F(-∞) = 0 2) F(∞) = 1 3) F(·) is non-decreasing in y, such that if y1 < y2 then F(y1) ≤ F(y2) |
|
|
Term
probability density function |
|
Definition
f(y) = dF(y)/dy = F'(y); P(a ≤ Y ≤ b) = integral[a,b] f(y)dy = F(b) - F(a) |
|
|
Term
E(Y), E[g(y)] (area under a curve) |
|
Definition
integral [-∞,∞] yf(y)dy, integral [-∞,∞] g(y)f(y)dy |
|
|
Term
Uniform continuous distribution function, also mean & variance |
|
Definition
f(y) = { 1/(θ2-θ1) for θ1≤y≤θ2; 0 otherwise} E(Y)=(θ1+θ2)/2; V(Y)=σ^2=(θ2-θ1)^2/12 |
|
|
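As a quick numeric sanity check of the uniform mean and variance formulas (a sketch using midpoint-rule integration, with θ1 = 2, θ2 = 6 as arbitrary example endpoints):

```python
# Midpoint-rule check of E(Y) = (t1+t2)/2 and V(Y) = (t2-t1)^2/12
# for Y ~ Uniform(t1, t2); t1, t2 are arbitrary example values.
t1, t2 = 2.0, 6.0
f = 1.0 / (t2 - t1)       # uniform density on [t1, t2]
N = 100_000
dy = (t2 - t1) / N

ey = sum((t1 + (i + 0.5) * dy) * f * dy for i in range(N))
vy = sum(((t1 + (i + 0.5) * dy) - ey) ** 2 * f * dy for i in range(N))
```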
Term
|
Definition
|
|
Term
Conditional Probability / Density Functions |
|
Definition
p(y1|y2) = p(y1,y2)/p2(y2), provided p2(y2) > 0; f(y1|y2) = f(y1,y2)/f2(y2), provided f2(y2) > 0 |
|
|
Term
Independence for discrete and continuous random variables |
|
Definition
Discrete: p(y1,y2) = p1(y1)p2(y2); Continuous: f(y1,y2) = f1(y1)f2(y2) (for all y1, y2) |
|
|
Term
COV(Y1,Y2), ρ (correlation coefficient) |
|
Definition
COV(Y1,Y2) = E[(Y1-μ1)(Y2-μ2)] = E[Y1Y2] - μ1μ2, where E[Y1Y2] = Σ[all y1] Σ[all y2] y1y2 p(y1,y2); ρ = COV(Y1,Y2)/(σ1σ2). If Y1, Y2 are independent then COV(Y1,Y2) = 0 (BUT the converse is not true) |
|
|
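A numeric sketch using a hypothetical 2×2 joint pmf (the table values are made up for illustration); this pair is dependent and shows a positive covariance and correlation:

```python
# Hypothetical joint pmf p(y1, y2) on {0,1} x {0,1}
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

mu1 = sum(y1 * p for (y1, y2), p in joint.items())
mu2 = sum(y2 * p for (y1, y2), p in joint.items())
e_y1y2 = sum(y1 * y2 * p for (y1, y2), p in joint.items())

cov = e_y1y2 - mu1 * mu2  # E[Y1Y2] - μ1μ2
sd1 = (sum(y1**2 * p for (y1, y2), p in joint.items()) - mu1**2) ** 0.5
sd2 = (sum(y2**2 * p for (y1, y2), p in joint.items()) - mu2**2) ** 0.5
rho = cov / (sd1 * sd2)   # correlation coefficient
```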
Term
Conditional expectation of g(Y1)|Y2=y2 |
|
Definition
For jointly continuous Y1, Y2: E(g(Y1)|Y2=y2) = integral[-∞,∞] g(y1)f(y1|y2)dy1; for jointly discrete Y1, Y2: E(g(Y1)|Y2=y2) = Σ[all y1] g(y1)p(y1|y2) |
|
|
Term
Statistic |
|
Definition
A function of the observable random variables in a sample and known constants |
|
|
Term
iid |
|
Definition
identically and independently distributed |
|
|
Term
Central Limit Theorem |
|
Definition
Central Limit Theorem: Let Y1, …, Yn be a random (iid) sample from any distribution with population mean μ and variance σ^2. Then E(y-bar) = μ, V(y-bar) = σ^2/n, and y-bar will have approximately a normal distribution as long as the sample size n is sufficiently large, aka y-bar ~ N(μ, σ^2/n) |
|
|
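A small simulation sketch of the CLT (seeded for reproducibility; the sample size and replication count are arbitrary choices). For Uniform(0,1), μ = 1/2 and σ^2 = 1/12, so y-bar should center at 1/2 with variance roughly (1/12)/n:

```python
import random

random.seed(0)
n, reps = 50, 2000

# 2000 sample means, each from an iid Uniform(0,1) sample of size 50
means = [sum(random.random() for _ in range(n)) / n for _ in range(reps)]

grand_mean = sum(means) / reps
var_of_means = sum((m - grand_mean) ** 2 for m in means) / (reps - 1)
```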
Term
Distribution Function (CDF) |
|
Definition
F(y)=integral[-∞,y]f(t)dt |
|
|
Term
Marginal Density Function |
|
Definition
f1(y1)=integral[-∞,∞]f(y1,y2)dy2 |
|
|