Andhra Pradesh, India
Om Namah Shivay
About Me
In information theory, the Shannon entropy or information entropy is a measure of the uncertainty associated with a random variable. It quantifies the information contained in a message, usually in bits or bits per symbol, and gives the minimum message length necessary to communicate that information.
This also represents an absolute limit on the best possible lossless compression of any communication: treating a message as a series of symbols, the smallest number of bits needed to transmit the message is the Shannon entropy in bits per symbol multiplied by the number of symbols in the original message.
A fair coin has an entropy of one bit. However, if the coin is not fair, the uncertainty is lower: if asked to bet on the next outcome, we would bet preferentially on the most frequent result, and thus the Shannon entropy is lower. A long string of repeating characters has an entropy of 0, since every character is predictable. The entropy of English text is between 1.0 and 1.5 bits per letter.[1]
Equivalently, the Shannon entropy is a measure of the average information content the recipient is missing when he does not know the value of the random variable.
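The figures quoted above can be checked with a short sketch. The snippet below is an illustrative Python example (the function name shannon_entropy is my own, not taken from this page) that estimates the entropy of a string from its symbol frequencies.

import math
from collections import Counter

def shannon_entropy(message):
    # Entropy in bits per symbol: sum of p * log2(1/p) over the observed symbols.
    counts = Counter(message)
    n = len(message)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

print(shannon_entropy("HTHTHTHT"))  # fair coin: 1.0 bit per symbol
print(shannon_entropy("HHHHHHHT"))  # biased coin: about 0.54 bits per symbol
print(shannon_entropy("AAAAAAAA"))  # repeated character: 0.0 bits per symbol

Multiplying the per-symbol entropy by the number of symbols in the message gives the lossless compression limit described above.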
The concept was introduced by Claude E. Shannon in his 1948 paper A Mathematical Theory of Communication.
My Expertise
Explore & Appreciate my Work
Sivaiah Gudipudi has not added any portfolio
My Project History & Feedbacks
My Endorsements
Sivaiah Gudipudi hasn't been endorsed yet
My Education
Work Experience
Certifications
$17/hr
Total Earnings: $0
Projects Completed: 0
Services Delivered: 0
Buyer worked with: 0
Feedbacks: 0
Followers
Total Refund: 0
Contest Completed: 0
Member since
My Articles
No Articles Posted