### Probability


## Joint Distributions & Expectations

## Discrete case

A *discrete bivariate* probability mass function (pmf) is a function $f_{X,Y}:\mathbb{Z}\times\mathbb{Z}\rightarrow[0,1]$ such that:

$f_{X,Y}(x,y)\geq0 \ \forall \ (x,y)\in\mathbb{Z}\times\mathbb{Z}$

$\sum_{x\in\mathbb{Z}}\sum_{y\in\mathbb{Z}}f_{X,Y}(x,y)=1$
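As a concrete sketch of these two conditions (my own example, not taken from the text): two independent fair dice have the joint pmf $f_{X,Y}(x,y)=1/36$ on $\{1,\dots,6\}^2$, and both defining properties are easy to check numerically.

```python
from fractions import Fraction

# Joint pmf of two independent fair dice: f(x, y) = 1/36 on {1,...,6}^2
# (a hypothetical illustration; exact arithmetic via Fraction avoids
# floating-point error in the normalization check)
f = {(x, y): Fraction(1, 36) for x in range(1, 7) for y in range(1, 7)}

# Condition 1: f(x, y) >= 0 everywhere on the support
assert all(p >= 0 for p in f.values())

# Condition 2: the double sum over all (x, y) equals 1
assert sum(f.values()) == 1
```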

Joint pmfs relate to random variables in the same way univariate pmfs do:

$\mathbb{P}(X=x,Y=y)=f_{X,Y}(x,y)$
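In practice this means evaluating $\mathbb{P}(X=x,Y=y)$ is a table lookup in the joint pmf; summing over one variable recovers a marginal. A small sketch, using a made-up joint pmf on $\{0,1\}\times\{0,1\}$ (not from the text):

```python
from fractions import Fraction

# A hypothetical joint pmf on {0,1} x {0,1}, with dependence between X and Y
f_xy = {
    (0, 0): Fraction(3, 8),
    (0, 1): Fraction(1, 8),
    (1, 0): Fraction(1, 8),
    (1, 1): Fraction(3, 8),
}

def prob(x, y):
    """P(X = x, Y = y): look up the joint pmf, 0 off the support."""
    return f_xy.get((x, y), Fraction(0))

# Marginal P(X = 0): sum the joint pmf over all values of y
p_x0 = sum(p for (x, _), p in f_xy.items() if x == 0)
```

Here `prob(0, 0)` returns $3/8$ and `p_x0` is $1/2$, consistent with the pmf summing to 1 over the whole support.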

We can generalize this to more than two random variables by considering an $n$-dimensional vector of random variables (a “random vector”, if you will), say $\Xi=[X_1,X_2,\dots,X_n]^T$. If $\mathbf{x}=[x_1,x_2,\dots,x_n]^T\in\mathbb{Z}^n$ then