What is the entropy of a communication system that consists of six messages with probabilities 1/8, 1/8, 1/8, 1/8, 1/4, and 1/4 respectively ?
1. 1 bits/message 
2. 2.5 bits/message
3. 3 bits/message
4. 4.5 bits/message

1 Answer

Best answer
Correct Answer - Option 2 : 2.5 bits/message

Concept:

The entropy of a probability distribution is the average amount of information obtained per draw from that distribution.

It is calculated as:

\(H=\sum_{i=1}^{n} p_i \log_2\left(\frac{1}{p_i}\right)\ \text{bits/symbol}\)

where \(p_i\) is the probability of occurrence of the i-th symbol.

Calculation:

The entropy will be:

\(H=4\times\frac{1}{8}log_2(8)+2\times\frac{1}{4}log_2(4)\)

\(H=\frac{4\times3}{8}+\frac{2\times2}{4}\)

H = 2.5 bits/message
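The calculation above can be verified with a short Python sketch (the `entropy` helper is illustrative, not from the original answer):

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = sum of p_i * log2(1 / p_i)."""
    return sum(p * log2(1 / p) for p in probs if p > 0)

# The six message probabilities from the question.
probs = [1/8, 1/8, 1/8, 1/8, 1/4, 1/4]

H = entropy(probs)
print(H)  # 2.5
```

Since all probabilities are negative powers of two, each \(\log_2(1/p_i)\) is an integer and the result is exactly 2.5 bits/message.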

