Information theory studies the transmission, processing, extraction, and utilization of information. Abstractly, information can be thought of as the resolution of uncertainty.

It took many years to find practical coding methods whose existence Shannon's work proved possible. The fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point. Information theory often concerns itself with measures of information of the distributions associated with random variables. These measures are usually expressed in bits, using logarithms to base 2; other bases, such as the natural logarithm (giving units of nats), are also possible but less commonly used.
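As an illustrative sketch of how the choice of logarithm base sets the unit of measure, the following Python snippet computes the Shannon entropy of a discrete distribution in bits and in nats (the function name and variable names are my own, not from the source):

```python
import math

def entropy(probs, base=2):
    """Shannon entropy of a discrete distribution.

    The log base sets the unit: base 2 gives bits, base e gives nats.
    Terms with zero probability contribute nothing, by convention.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

fair_coin = [0.5, 0.5]
print(entropy(fair_coin, base=2))       # entropy in bits
print(entropy(fair_coin, base=math.e))  # same quantity in nats (= ln 2)
```

A fair coin toss carries exactly 1 bit of entropy, which equals ln 2 ≈ 0.693 nats; the two values differ only by the constant factor log(2).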

The entropy of a binary source is maximized at 1 bit per trial when the two possible outcomes are equally probable, as in an unbiased coin toss, and is zero when one outcome is certain. Between these two extremes, information can be quantified by the binary entropy function. Because entropy can be conditioned either on another random variable or on that random variable taking a particular value, care should be taken not to confuse these two definitions of conditional entropy, the former of which is in more common use. Mutual information is important in communication, where it can be used to maximize the amount of information shared between sent and received signals. The Kullback–Leibler divergence is the average number of additional bits per datum needed when data are compressed using a code optimized for a distribution other than the true one.
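The two quantities above can be sketched in a few lines of Python. This is a minimal illustration (function names are my own): the binary entropy peaks at 1 bit for p = 0.5, and the Kullback–Leibler divergence measures the extra bits per symbol incurred by coding for the wrong distribution.

```python
import math

def binary_entropy(p):
    """Entropy, in bits, of a binary source with outcome probabilities p and 1-p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def kl_divergence(p, q):
    """KL divergence D(p || q) in bits: the average number of extra bits per
    symbol when data from p are encoded with a code optimized for q."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

print(binary_entropy(0.5))   # fair coin: maximal, 1 bit per trial
print(binary_entropy(0.1))   # biased coin: less than 1 bit
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))  # cost of assuming a biased coin
```

Note that D(p || q) is zero only when the two distributions coincide, and it is not symmetric in p and q, so it is not a true distance.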