
Joint Distributions & Expectations

Discrete case

A discrete bivariate probability mass function (pmf) is a function $f_{X,Y}(x,y):\mathbb{Z}\times\mathbb{Z}\rightarrow[0,1]$ such that:

  1. $f_{X,Y}(x,y)\geq 0 \ \forall \ (x,y)\in\mathbb{Z}\times\mathbb{Z}$

  2. $\sum_{x\in\mathbb{Z}}\sum_{y\in\mathbb{Z}}f_{X,Y}(x,y)=1$
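As a quick sanity check, here is a minimal Python sketch with a small, made-up finite support (any pair outside it is assigned probability 0) verifying both conditions for a hypothetical joint pmf:

```python
# A hypothetical joint pmf f_{X,Y} with finite support, stored as a dict.
# Every (x, y) pair not listed is assigned probability 0.
f_xy = {
    (0, 0): 0.10,
    (0, 1): 0.30,
    (1, 0): 0.25,
    (1, 1): 0.35,
}

# Condition 1: f_{X,Y}(x, y) >= 0 for every (x, y).
assert all(p >= 0 for p in f_xy.values())

# Condition 2: the values sum to 1 over the support (floating-point tolerance).
assert abs(sum(f_xy.values()) - 1.0) < 1e-12
```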

Joint pmfs have the same meaning in terms of random variables as univariate pmfs:

$$\mathbb{P}(X=x,Y=y)=f_{X,Y}(x,y)$$
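Concretely, computing $\mathbb{P}(X=x,Y=y)$ from a tabulated pmf is just an evaluation of the function; a short sketch (again with a made-up table) might look like:

```python
# Hypothetical joint pmf for illustration; unlisted pairs have probability 0.
f_xy = {
    (0, 0): 0.10,
    (0, 1): 0.30,
    (1, 0): 0.25,
    (1, 1): 0.35,
}

def joint_prob(x: int, y: int) -> float:
    """Return P(X = x, Y = y) = f_{X,Y}(x, y)."""
    return f_xy.get((x, y), 0.0)

print(joint_prob(0, 1))   # 0.3
print(joint_prob(5, -2))  # 0.0, since (5, -2) is outside the support
```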

We can generalize this to more than two random variables by considering an $n$-dimensional vector of random variables (a "random vector", if you will), say $\Xi=[X_1,X_2,\dots,X_n]^T$. If $\mathbf{x}=[x_1,x_2,\dots,x_n]^T\in\mathbb{Z}^n$, then

$$\mathbb{P}(\Xi=\mathbf{x})=\mathbb{P}(X_1=x_1,X_2=x_2,\dots,X_n=x_n)=f_{\Xi}(\mathbf{x})$$
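A sketch of the same idea for a hypothetical 3-dimensional random vector, storing the joint pmf as a dict keyed by integer tuples (all probabilities here are made up for illustration):

```python
# Hypothetical joint pmf of a 3-dimensional random vector Xi = [X1, X2, X3]^T,
# keyed by integer tuples x = (x1, x2, x3); unlisted tuples have probability 0.
f_xi = {
    (0, 0, 0): 0.20,
    (0, 1, 1): 0.15,
    (1, 0, 2): 0.40,
    (1, 1, 0): 0.25,
}

def joint_prob(x: tuple) -> float:
    """Return P(Xi = x) = P(X1 = x1, ..., Xn = xn)."""
    return f_xi.get(x, 0.0)

# The two defining conditions carry over unchanged to n dimensions.
assert all(p >= 0 for p in f_xi.values())
assert abs(sum(f_xi.values()) - 1.0) < 1e-12

print(joint_prob((1, 0, 2)))  # 0.4
```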
