We found 1 definition of information entropy in English.
Noun
| information entropy - A measure of the uncertainty associated with a random variable; a measure of the average information content one is missing when one does not know the value of the random variable, usually in units such as bits; the amount of information, measured in, say, bits, contained per average instance of a character in a stream of characters. |
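The "average information content per character" sense above is Shannon's formula H = -Σ p(x) log₂ p(x), where p(x) is the frequency of each character. A minimal sketch in Python (the function name `shannon_entropy` is illustrative, not part of any standard library):

```python
from collections import Counter
import math

def shannon_entropy(text: str) -> float:
    """Average information content per character, in bits:
    H = -sum(p * log2(p)) over the character frequencies p."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * math.log2(n / total)
                for n in counts.values())

# A uniform two-symbol stream carries exactly 1 bit per character,
# while a constant stream carries 0 bits (no uncertainty).
print(shannon_entropy("abababab"))  # → 1.0
print(shannon_entropy("aaaaaaaa"))  # → 0.0 (printed as -0.0 or 0.0)
```

Note that entropy depends only on the frequency distribution, not on the order of the characters.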