# Is mutual information a measure of correlation?

## Is mutual information a measure of correlation?

Correlation analysis provides a quantitative means of measuring the strength of a linear relationship between two vectors of data. Mutual information is essentially a measure of how much “knowledge” one gains about one variable by knowing the value of another.

## Is mutual information better than correlation?

Mutual information, like entropy, is measured in bits (when base-2 logarithms are used). It is more general than correlation: it captures nonlinear dependencies and applies directly to discrete random variables.

## How is mutual information calculated?

The mutual information between two random variables X and Y can be stated formally as I(X; Y) = H(X) − H(X | Y), where H(X) is the entropy of X and H(X | Y) is the conditional entropy of X given Y.
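The formula above can be sketched directly from a joint distribution table. The probabilities below are invented purely for illustration; any table summing to 1 works.

```python
from math import log2

# Hypothetical joint distribution p(x, y) over two binary variables
# (values made up for illustration).
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

def marginal(joint, axis):
    """Sum the joint table over the other variable."""
    out = {}
    for xy, p in joint.items():
        k = xy[axis]
        out[k] = out.get(k, 0.0) + p
    return out

def entropy(dist):
    """Shannon entropy in bits: H = -sum p * log2(p)."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

p_x = marginal(joint, 0)
p_y = marginal(joint, 1)

# H(X | Y) = sum over y of p(y) * H(X | Y = y)
h_x_given_y = 0.0
for y, py in p_y.items():
    cond = {x: p / py for (x, yy), p in joint.items() if yy == y}
    h_x_given_y += py * entropy(cond)

mi = entropy(p_x) - h_x_given_y  # I(X; Y) = H(X) - H(X | Y)
print(round(mi, 4))  # about 0.278 bits for this table
```

Knowing Y here removes roughly 0.28 of the 1 bit of uncertainty in X, which matches the "reduction in uncertainty" reading discussed below.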

## How do you read mutual information values?

High mutual information indicates a large reduction in uncertainty; low mutual information indicates a small reduction; and zero mutual information between two random variables means the variables are independent.

### What is NMI score?

Normalized Mutual Information (NMI) is a normalization of the Mutual Information (MI) score to scale the results between 0 (no mutual information) and 1 (perfect correlation).

### What is a good mutual information score?

Here the (normalized) MI score falls in the range from 0 to 1. The higher the value, the stronger the connection between the feature and the target, which suggests we should keep that feature in the training dataset. An MI score of 0, or a very low score such as 0.01, suggests a weak connection between the feature and the target.
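A toy feature-ranking sketch along these lines, using empirical MI. The dataset is entirely invented: one feature mostly tracks the target, the other alternates independently of it.

```python
from math import log2
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def mi(xs, ys):
    # I(X; Y) = H(X) + H(Y) - H(X, Y)
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Invented toy data: feature_a tracks the target, feature_b does not.
target    = [0, 0, 0, 0, 1, 1, 1, 1]
feature_a = [0, 0, 0, 1, 1, 1, 1, 1]   # mostly agrees with the target
feature_b = [0, 1, 0, 1, 0, 1, 0, 1]   # alternates regardless of the target

scores = {"feature_a": mi(feature_a, target),
          "feature_b": mi(feature_b, target)}
print(scores)  # feature_a scores well above feature_b
```

A library implementation (e.g. scikit-learn's `mutual_info_classif`) would be the practical choice; the point here is only that the ranking follows the MI scores.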

## What does a negative PMI mean?

PMI(x, y) = 0 means that the particular values of x and y are statistically independent; positive PMI means they co-occur more frequently than would be expected under an independence assumption, and negative PMI means they co-occur less frequently than would be expected.
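The three cases can be checked numerically. The probabilities below are made up for illustration; with p(x) = p(y) = 0.1, independence predicts a co-occurrence probability of 0.01.

```python
from math import log2

def pmi(p_xy, p_x, p_y):
    """Pointwise mutual information: log2(p(x,y) / (p(x) * p(y)))."""
    return log2(p_xy / (p_x * p_y))

# Co-occurring exactly as often as chance predicts: PMI = 0.
print(pmi(0.01, 0.1, 0.1))   # 0.0
# More often than chance: positive PMI.
print(pmi(0.05, 0.1, 0.1))   # ~2.32
# Less often than chance: negative PMI.
print(pmi(0.002, 0.1, 0.1))  # ~-2.32
```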


## What is the range of mutual information?

Mutual information I(X, Y) yields values from 0 (no mutual information; variables X and Y are independent) to +∞. The higher I(X, Y), the more information is shared between X and Y. However, high values of mutual information can be unintuitive and hard to interpret because of its unbounded range, I(X, Y) ∈ [0, +∞).
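A quick check that MI is not capped at 1 bit: a variable with four equiprobable values shares its full 2 bits of entropy with itself. The data below is invented for illustration.

```python
from math import log2
from collections import Counter

def entropy(labels):
    """Empirical Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def mutual_info(xs, ys):
    # I(X; Y) = H(X) + H(Y) - H(X, Y)
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Four equiprobable values: H(X) = 2 bits, and I(X; X) = H(X).
x = [0, 1, 2, 3] * 25
print(mutual_info(x, x))  # 2.0
```

This unboundedness is exactly why normalized variants such as NMI are often preferred for comparison across datasets.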

## What does mutual information score mean?

The mutual information score expresses the extent to which the observed frequency of co-occurrence differs from what we would expect statistically. In statistical terms, it is a measure of the strength of association between words x and y.