Information Gain and Entropy
Conditional Entropy
Measure of the remaining uncertainty about a random variable Y when the value of another variable X is known; essential for calculating information gain. It is the average of the entropies of the conditional distributions of Y given each value of X, weighted by the probability of that value: H(Y|X) = Σ_x p(x) H(Y|X=x). Information gain then follows as IG(Y, X) = H(Y) − H(Y|X).
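As a small sketch of this definition, the following Python snippet estimates H(Y), H(Y|X), and the resulting information gain from observed samples; the helper names and toy weather data are illustrative, not from the original card.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of the empirical distribution of labels."""
    total = len(labels)
    counts = Counter(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def conditional_entropy(xs, ys):
    """H(Y|X): entropy of Y within each group of X, weighted by p(x)."""
    total = len(xs)
    groups = {}
    for x, y in zip(xs, ys):
        groups.setdefault(x, []).append(y)
    return sum((len(g) / total) * entropy(g) for g in groups.values())

# Toy example: X = weather, Y = play decision
X = ["sunny", "sunny", "rain", "rain"]
Y = ["yes", "no", "no", "no"]

h_y = entropy(Y)                       # H(Y) ≈ 0.811 bits
h_y_given_x = conditional_entropy(X, Y)  # H(Y|X) = 0.5 bits
gain = h_y - h_y_given_x               # information gain ≈ 0.311 bits
print(h_y, h_y_given_x, gain)
```

Knowing X here removes part of the uncertainty about Y: within the "rain" group Y is fully determined (entropy 0), so the weighted average H(Y|X) drops below H(Y).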