# Discrete Random Variables


## Definitions of Discrete Random Variables

Random Variable X - A variable that represents the value obtained when we take a measurement from a real-world experiment

Discrete Random Variable - A variable that changes in steps, taking only specified values in a given interval

Probability Function - A function that describes how the probabilities are assigned to the values of X

Probability Distribution - The set of all values of a random variable together with their assigned probabilities

Expected Value of X = E(X) = SUM x × P(X=x) (the mean of the distribution)

Variance of X = Var(X) = E(X^2) - [E(X)]^2 (a measure of spread about the mean)
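The two definitions above can be checked numerically. This is a minimal Python sketch using an illustrative distribution (a fair four-sided die, which is an assumption, not an example from the source):

```python
# Hypothetical distribution: a fair four-sided die
xs = [1, 2, 3, 4]
ps = [0.25, 0.25, 0.25, 0.25]

# E(X) = SUM of x * P(X = x)
mean = sum(x * p for x, p in zip(xs, ps))

# Var(X) = E(X^2) - [E(X)]^2
var = sum(x**2 * p for x, p in zip(xs, ps)) - mean**2

print(mean, var)
```

For this die the mean works out to 2.5 and the variance to 1.25.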


## Expectations/Variance/Conditions

Expectation of a function of X = E[g(X)] = SUM g(x) × P(X=x)
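The formula above weights each value g(x) by its probability. A minimal Python sketch, using an illustrative distribution and g(x) = x^2 (both are assumptions chosen for the example):

```python
# Hypothetical distribution for illustration
xs = [1, 2, 3]
ps = [0.5, 0.3, 0.2]

# E[g(X)] with g(x) = x**2: weight each g(x) by P(X = x)
e_g = sum(x**2 * p for x, p in zip(xs, ps))
```

Here E[g(X)] = 1(0.5) + 4(0.3) + 9(0.2) = 3.5.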

Expectation of a linear function of X = E(aX + b) = aE(X) + b

Variance of a linear function of X = Var(aX + b) = a^2 × Var(X)
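Both linearity rules can be verified by transforming a distribution directly. A sketch with hypothetical values (the distribution and the constants a, b are assumptions for illustration):

```python
# Hypothetical distribution and constants
xs = [0, 1, 2]
ps = [0.2, 0.5, 0.3]
a, b = 3, 2

# E(X) and Var(X) of the original variable
ex = sum(x * p for x, p in zip(xs, ps))
var = sum((x - ex)**2 * p for x, p in zip(xs, ps))

# Compute E(Y) and Var(Y) directly for Y = aX + b
ey = sum((a * x + b) * p for x, p in zip(xs, ps))
vary = sum((a * x + b - ey)**2 * p for x, p in zip(xs, ps))

# The rules predict: ey == a*ex + b  and  vary == a**2 * var
# (equal up to floating-point rounding)
```

The direct computation and the rules agree: shifting by b moves the mean but leaves the spread unchanged, while scaling by a scales the variance by a^2.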

Discrete uniform distribution (if X has a discrete uniform distribution) = P(X = x_r) = 1/n for r = 1, 2, ..., n

Conditions for a discrete uniform distribution - the variable X is defined over a set of n distinct values, and each value is equally likely, so each has probability 1/n
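A fair six-sided die is the classic discrete uniform example, and it satisfies both conditions above. A minimal Python sketch (the die is an assumed example, not from the source):

```python
# Fair six-sided die: n distinct, equally likely values
n = 6
values = range(1, n + 1)

# P(X = x_r) = 1/n for every value
p = 1 / n

# E(X) = SUM x * (1/n)
mean = sum(x * p for x in values)
```

The n equal probabilities sum to 1, as any probability distribution must, and the mean is (1 + 2 + ... + 6)/6 = 3.5.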


