Let X_{1} and X_{2} be independent random variables. X_{1} has mean 0 and variance 1, while X_{2} has mean 1 and variance 4. The mutual information I(X_{1}; X_{2}) between X_{1} and X_{2}, in bits, is _______.

This question was previously asked in

GATE EC 2017 Official Paper: Shift 1


__Concept:__

Mutual information measures how much knowing one random variable reduces the uncertainty about the other.

It is mathematically defined as:

I(X_{1}; X_{2}) = H(X_{1}) – H(X_{1}|X_{2})

where H(X_{1}) is the entropy of X_{1} and H(X_{1}|X_{2}) is the conditional entropy of X_{1} given X_{2}.

__Application:__

Since X_{1} and X_{2} are independent, observing X_{2} tells us nothing about X_{1}, so the conditional entropy equals the unconditional entropy:

H(X_{1}|X_{2}) = H(X_{1})

Therefore:

I(X_{1}; X_{2}) = H(X_{1}) – H(X_{1}) = 0 bits
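The result can be sanity-checked numerically. A minimal sketch, assuming the two variables are jointly Gaussian (for Gaussian pairs, I(X_{1}; X_{2}) = –½ log₂(1 – ρ²), where ρ is the correlation coefficient; this closed form and the sample sizes below are illustrative assumptions, not part of the original question):

```python
import numpy as np

# Draw independent samples with the stated moments.
rng = np.random.default_rng(0)
n = 1_000_000
x1 = rng.normal(loc=0.0, scale=1.0, size=n)  # mean 0, variance 1
x2 = rng.normal(loc=1.0, scale=2.0, size=n)  # mean 1, variance 4

# Sample correlation is ~0 because the samples are independent.
rho = np.corrcoef(x1, x2)[0, 1]

# Gaussian mutual information in bits: -1/2 * log2(1 - rho^2).
mi_bits = -0.5 * np.log2(1.0 - rho**2)
print(mi_bits)  # vanishingly small, consistent with I = 0
```

The estimate is not exactly zero only because the sample correlation of a finite sample is not exactly zero; it shrinks toward 0 as n grows.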
