Joint entropy
In information theory, joint entropy is a measure of the uncertainty associated with a set of random variables.
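For example, for two discrete random variables $X$ and $Y$ with joint probability mass function $P(x, y)$, the joint Shannon entropy (in bits) can be written as

$$
\mathrm{H}(X, Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} P(x, y) \log_2 P(x, y),
$$

where terms with $P(x, y) = 0$ are taken to contribute zero to the sum.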