This document collects formulas and concepts from information theory and coding. It defines key terms such as self-information, entropy, information rate, the units of information (bits, hartleys, and nats), extremal (maximum) entropy, source efficiency, and source redundancy, and gives a formula for calculating each. It also provides formulas for the average information content of symbols emitted from different states. Finally, it lists the basic logarithm identities that are used throughout information-theory calculations.
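The standard definitions behind these terms can be sketched in a few lines of Python. This is an illustrative implementation of the usual textbook formulas (self-information I(x) = -log p(x), entropy H = -Σ p log p, efficiency η = H / H_max with H_max = log M for an M-symbol source, and redundancy 1 - η), not the document's exact notation; the function names and the sample distribution are my own choices.

```python
import math

def self_information(p, base=2):
    # I(x) = -log_b p(x); base 2 gives bits, e gives nats, 10 gives hartleys
    return -math.log(p, base)

def entropy(probs, base=2):
    # Shannon entropy H = -sum(p_i * log_b p_i); zero-probability terms contribute 0
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def source_efficiency(probs, base=2):
    # eta = H / H_max, where H_max = log_b M for an M-symbol alphabet
    h_max = math.log(len(probs), base)
    return entropy(probs, base) / h_max

def source_redundancy(probs, base=2):
    # redundancy = 1 - eta
    return 1 - source_efficiency(probs, base)

# Example: a 4-symbol source with probabilities 1/2, 1/4, 1/8, 1/8
probs = [0.5, 0.25, 0.125, 0.125]
print(entropy(probs))            # 1.75 bits/symbol
print(source_efficiency(probs))  # 0.875  (H_max = log2(4) = 2 bits)
print(source_redundancy(probs))  # 0.125
```

If the source emits r symbols per second, the information rate follows as R = r · H bits per second, which matches the usual definition the summary refers to.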