Information Gain Calculator

Input Values

Enter the entropy before the split (\( E_{before} \)) and the entropy after the split (\( E_{after} \)).

Formula

To calculate the Information Gain:

\[ IG = E_{before} - E_{after} \]

Where \( E_{before} \) is the entropy of the dataset before the split and \( E_{after} \) is the (weighted average) entropy of the resulting subsets after the split.

What is Information Gain?

Information gain is a metric that quantifies the reduction in entropy (impurity) of a dataset produced by splitting it on a feature or rule. It is a key concept in machine learning, particularly in the construction of decision trees, where it is used to select the feature whose split yields the largest gain (equivalently, the largest reduction in uncertainty about the target variable).
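As a minimal sketch of the idea (the function names `entropy` and `information_gain` are illustrative, not part of the calculator), the entropy of a labelled dataset and the gain of a candidate split can be computed like this:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (base 2) of a list of class labels."""
    total = len(labels)
    return -sum((count / total) * math.log2(count / total)
                for count in Counter(labels).values())

def information_gain(parent, children):
    """IG = E_before - E_after, where E_after is the weighted
    average entropy of the child subsets produced by the split."""
    total = len(parent)
    e_after = sum(len(child) / total * entropy(child) for child in children)
    return entropy(parent) - e_after

# A perfectly separating split of a balanced binary dataset
# removes all uncertainty, so the gain equals the parent entropy (1 bit).
print(information_gain([1, 1, 1, 0, 0, 0], [[1, 1, 1], [0, 0, 0]]))  # 1.0
```

Note that \( E_{after} \) is a weighted average: larger child subsets contribute proportionally more to the remaining entropy.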

Example Calculations

Example 1:

Assume the entropy before the split is 0.8 and the entropy after the split is 0.5. Using the formula:

\[ IG = 0.8 - 0.5 = 0.3 \]

The Information Gain is 0.3, indicating that the split reduced the entropy by 0.3.

Example 2:

Assume the entropy before the split is 1.2 and the entropy after the split is 0.7. Using the formula:

\[ IG = 1.2 - 0.7 = 0.5 \]

The Information Gain is 0.5, indicating that the split reduced the entropy by 0.5.
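The two examples above reduce to a single subtraction, so they can be checked directly (rounding is used only to suppress floating-point noise in the printed result):

```python
def information_gain(e_before, e_after):
    """IG = E_before - E_after."""
    return e_before - e_after

# Example 1: entropy drops from 0.8 to 0.5
print(round(information_gain(0.8, 0.5), 6))  # 0.3
# Example 2: entropy drops from 1.2 to 0.7
print(round(information_gain(1.2, 0.7), 6))  # 0.5
```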