Information Theory and Coding is primarily concerned with communication systems. It is a mathematical approach to the study of signals, whether analog or digital, and to the coding of information for communication.


Information is an idea or message used to transfer thoughts from one source to another. An information source can produce electrical signals, pictures, video, audio, or speech.

To pass information from the source to the destination, we need an encoder, a transmitter, a channel, a decoder, and a receiver.

Information Theory and Coding - Introduction

In information theory, three conditions can occur while information is being passed: uncertainty, surprise, and information.


If the information has not yet been transmitted by the source, there is a condition of uncertainty: the receiver does not know what information will be passed, so the receiver is in a state of uncertainty.

For example, suppose your father is coming home and you have asked him to bring a gift. You do not know what he will give you, because the event has not yet occurred, so you are in a state of uncertainty.


If the information has just been passed to the receiver, there is a condition of surprise. For example, your father has just given you your favorite game as a gift, so you experience surprise.


If the information was passed to the receiver some time ago, there is a condition of having information. For example, your father gifted you a game some time back; in the current time it is simply information for you and others.

Remember that these three conditions (uncertainty, surprise, and information) occur at different times, and the probability of the event at each moment determines which condition applies.

Information is measured in bits, so the unit of information is the bit.
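As a sketch of how information is measured in bits, the self-information of an event with probability p is -log2(p); the function name below is illustrative, not from the text.

```python
import math

def self_information(p):
    """Information (in bits) carried by an event with probability p."""
    return -math.log2(p)

# A rarer event carries more information than a common one.
print(self_information(0.5))    # 1.0 bit (a fair coin flip)
print(self_information(0.125))  # 3.0 bits (a 1-in-8 event)
```

Note that a certain event (p = 1) carries 0 bits, matching the idea that a known message conveys no new information.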

Mutual information

It is defined as the amount of information transferred over a channel, where X is the transmitted symbol and Y is the received symbol.

Average mutual information

It is defined as the amount of source information gained per received symbol.
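A minimal sketch of computing average mutual information I(X;Y) from a joint probability table; the table values and function name are illustrative assumptions, not from the text.

```python
import math

def mutual_information(joint):
    """Average mutual information I(X;Y) in bits, given joint[x][y]."""
    px = [sum(row) for row in joint]            # marginal p(x)
    py = [sum(col) for col in zip(*joint)]      # marginal p(y)
    info = 0.0
    for x, row in enumerate(joint):
        for y, pxy in enumerate(row):
            if pxy > 0:
                info += pxy * math.log2(pxy / (px[x] * py[y]))
    return info

# Independent X and Y share no information: I(X;Y) = 0.
independent = [[0.25, 0.25], [0.25, 0.25]]
print(mutual_information(independent))  # 0.0

# A noiseless channel (Y = X) conveys the full entropy of X.
identical = [[0.5, 0.0], [0.0, 0.5]]
print(mutual_information(identical))    # 1.0
```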

Properties of Information

  1. Information is always non-negative.
  2. As uncertainty increases, the information content also increases.
  3. If the receiver already knows the message being transmitted, the information is zero.
  4. If a source m1 transmits information I1 and an independent source m2 transmits information I2, then the combined information is (I1 + I2).
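The properties above can be checked numerically with self-information I = -log2(p); the probabilities below are illustrative values.

```python
import math

def info_bits(p):
    """Self-information in bits of an event with probability p."""
    return -math.log2(p)

# Property 1: information is never negative (probabilities are <= 1).
assert info_bits(0.9) >= 0

# Property 3: a certain message (p = 1) carries zero information.
assert info_bits(1.0) == 0

# Property 4: for independent sources, probabilities multiply,
# so their information contents add: I(p1 * p2) = I1 + I2.
p1, p2 = 0.25, 0.5
combined = info_bits(p1 * p2)
print(combined, info_bits(p1) + info_bits(p2))  # 3.0 3.0
```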


Entropy

Entropy can be defined as a measure of the average information content per source symbol. It is also called Shannon's entropy and is denoted by H.

Formula of Entropy

                                          H = -Σ pi logb(pi)


 p = probability of a message

 i = index of the message

 b = base of the logarithm (b = 2 gives entropy in bits)
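A minimal sketch of the entropy formula above; the example distributions are illustrative values.

```python
import math

def entropy(probs, b=2):
    """Shannon entropy H = -sum(p_i * log_b(p_i)) of a distribution."""
    return -sum(p * math.log(p, b) for p in probs if p > 0)

# A fair coin has the maximum entropy for two symbols: 1 bit/symbol.
print(entropy([0.5, 0.5]))  # 1.0
# A biased source is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))  # about 0.469
```

Terms with p = 0 are skipped, since by convention 0 * log(0) = 0.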

Conditional Entropy

The amount of uncertainty remaining about the channel input after observing the channel output is called conditional entropy.

The formula of conditional entropy

                                                       H(X/Y) = Σ p(y) H(X/Y = y)

In simple words, the conditional entropy H(X/Y) is the average uncertainty about X when Y is known.
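A sketch of conditional entropy computed from a joint probability table, using H(X/Y) = -Σ p(x,y) log2 p(x|y); the tables below are illustrative assumptions.

```python
import math

def conditional_entropy(joint):
    """Conditional entropy H(X/Y) in bits, given joint[x][y]."""
    py = [sum(col) for col in zip(*joint)]  # marginal p(y)
    h = 0.0
    for row in joint:
        for y, pxy in enumerate(row):
            if pxy > 0:
                # p(x|y) = p(x, y) / p(y)
                h -= pxy * math.log2(pxy / py[y])
    return h

# If Y determines X exactly, no uncertainty about X remains: H(X/Y) = 0.
print(conditional_entropy([[0.5, 0.0], [0.0, 0.5]]))      # 0.0
# If X and Y are independent, knowing Y removes nothing: H(X/Y) = H(X).
print(conditional_entropy([[0.25, 0.25], [0.25, 0.25]]))  # 1.0
```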

Discrete memoryless source

A source is a discrete memoryless source if it emits symbols from a finite alphabet and the value of each symbol is independent of the previous values.

For example, if we have a source X = {m1, m2, .... mn}, then this is a discrete memoryless source, because the number of messages is countable and each value of m does not depend on the previous values.
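Such a source can be simulated by drawing each symbol independently from fixed probabilities; the alphabet and probabilities below are illustrative values, not from the text.

```python
import random

# A discrete memoryless source: a finite alphabet with fixed symbol
# probabilities, where each symbol is drawn independently of the past.
alphabet = ["m1", "m2", "m3"]
probs = [0.5, 0.3, 0.2]

random.seed(0)  # fixed seed so the run is reproducible
sequence = random.choices(alphabet, weights=probs, k=10)
print(sequence)  # each draw ignores all previous symbols
```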

Source coding

In source coding, when a signal is transmitted it is encoded into a codeword (for example, a Morse code word) that can be decoded by the receiver.

So if a source is a discrete memoryless source of entropy H, then the average codeword length is always greater than or equal to H. That means the average number of code symbols needed per source symbol cannot be less than the source entropy.
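This bound can be checked numerically: the example below compares the entropy of an illustrative source against the average length of an assumed prefix code (m1 -> 0, m2 -> 10, m3 -> 11).

```python
import math

def entropy(probs):
    """Source entropy H in bits/symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def average_length(probs, code_lengths):
    """Average codeword length L = sum(p_i * l_i) in bits/symbol."""
    return sum(p * l for p, l in zip(probs, code_lengths))

# Illustrative source with codewords m1 -> 0, m2 -> 10, m3 -> 11.
probs = [0.5, 0.25, 0.25]
lengths = [1, 2, 2]

H = entropy(probs)
L = average_length(probs, lengths)
print(H, L)  # 1.5 1.5
assert L >= H  # the average codeword length can never fall below H
```

For these dyadic probabilities the code meets the bound exactly; for other distributions L is strictly greater than H.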

Channel coding

Channel coding improves the reliability of a communication system. It maps the data sequence into a channel input sequence and inverse-maps the channel output sequence back into an output data sequence. The main purpose of channel coding is to minimize the effect of channel noise on the signal.
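As a minimal sketch of this idea, a repetition code maps each data bit to several channel bits, and majority voting at the receiver undoes single bit flips; the code below is an illustrative example, not a scheme named in the text.

```python
def encode(bits, n=3):
    """Repetition channel code: repeat each data bit n times."""
    return [b for b in bits for _ in range(n)]

def decode(received, n=3):
    """Majority-vote decoding: recovers each bit despite some flips."""
    out = []
    for i in range(0, len(received), n):
        block = received[i:i + n]
        out.append(1 if sum(block) > n // 2 else 0)
    return out

data = [1, 0, 1]
sent = encode(data)   # [1, 1, 1, 0, 0, 0, 1, 1, 1]
sent[1] = 0           # the channel flips one bit (noise)
print(decode(sent))   # [1, 0, 1] -- the error is corrected
```

The price of this reliability is rate: the code sends n channel bits per data bit, which is the usual trade-off channel coding manages.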