Just learning and thinking about the concepts of information theory is guaranteed to strain your brain (which grows bigger and stronger with exercise) and to change the way in which you look at the world and its many curious phenomena. Just reading this article will change the way that you think.

The Free Dictionary offers several terse but compressed (no pun intended) working definitions of information theory:

information theory
n.
The theory of the probability of transmission of messages with specified accuracy when the bits of information constituting the messages are subject, with certain probabilities, to transmission failure, distortion, and accidental additions.


information theory
n
(Mathematics) a collection of mathematical theories, based on statistics, concerned with methods of coding, transmitting, storing, retrieving, and decoding information

information theory
A branch of mathematics that mathematically defines and analyzes the concept of information. Information theory involves statistics and probability theory, and applications include the design of systems that have to do with data transmission, encryption, compression, and other information processing.
The American Heritage® Science Dictionary Copyright © 2005 by Houghton Mifflin Company. Published by Houghton Mifflin Company. All rights reserved.

---------------
 
Wikipedia's drawn-out but slightly more detailed definition of information theory follows:

Information theory is a branch of applied mathematics, electrical engineering, and computer science involving the quantification of information. Information theory was developed by Claude E. Shannon to find fundamental limits on signal processing operations such as compressing data and on reliably storing and communicating data. Since its inception it has broadened to find applications in many other areas, including statistical inference, natural language processing, cryptography generally, networks other than communication networks—as in neurobiology, the evolution and function of molecular codes, model selection in ecology, thermal physics, quantum computing, plagiarism detection and other forms of data analysis.

A key measure of information is known as entropy, which is usually expressed by the average number of bits needed to store or communicate one symbol in a message. Entropy quantifies the uncertainty involved in predicting the value of a random variable. For example, specifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (six equally likely outcomes).

Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s and JPGs), and channel coding (e.g. for Digital Subscriber Line (DSL)). The field is at the intersection of mathematics, statistics, computer science, physics, neurobiology, and electrical engineering. Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones, the development of the Internet, the study of linguistics and of human perception, the understanding of black holes, and numerous other fields.

Important sub-fields of information theory are source coding, channel coding, algorithmic complexity theory, algorithmic information theory, information-theoretic security, and measures of information.
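
To make the entropy example above concrete, here is a minimal Python sketch (my own illustration, not drawn from the definitions quoted above) that computes Shannon entropy, H = -sum of p * log2(p), for the coin and the die:

    import math

    def entropy(probabilities):
        # Shannon entropy in bits: H = -sum(p * log2(p)).
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    coin = [0.5, 0.5]        # a fair coin: two equally likely outcomes
    die = [1.0 / 6] * 6      # a fair die: six equally likely outcomes

    print("Coin entropy: %.3f bits" % entropy(coin))  # 1.000
    print("Die entropy:  %.3f bits" % entropy(die))   # 2.585

More equally likely outcomes means more surprise per outcome, which is why one die roll costs more bits to describe than one coin flip.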

What does this mean to me?

1) In practical terms, Braintenance Colleagues, the transmission of information (without distortion, noise, degradation and the like), without even taking into account the meaning of the message contained in that information, is always the first problem or challenge. For example, for my girlfriend to get my marriage proposal [after I'd been on a horrific bender], she must first receive the letter, regardless of its contents. The likelihood of the package being delivered intact is what information theory is all about. The message must get to its destination before the meaning even becomes relevant.

2) The second challenge is getting the uncorrupted, intact message to its destination as efficiently as possible. This requires a tradeoff between compressing the message (for efficiency) and, in the preceding example, sending it over with some strolling troubadours. If the message (purely as a signal) can be conveyed in a tiny packet instead of a long wave, we are doing something right.

3) If we are sending multiple bits of information, is there a way to compress and arrange them so that they may all "be shipped in one container"? DNA is a wonderful example of the 'largest message sent in the smallest bottle'.
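
As a minimal sketch of point 3 (assuming only Python's standard zlib module), here is the redundancy being squeezed out of a repetitive 1,000-byte message:

    import zlib

    message = b"yes " * 250          # 1,000 bytes of highly repetitive text
    packed = zlib.compress(message)  # lossless compression, as in ZIP files

    print(len(message), "bytes ->", len(packed), "bytes")

The packed result is a tiny fraction of the original, because the compressor refuses to restate anything the receiver could already predict.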

Once again, for those of you who may also be readers of The Sending Signals Blog, bear in mind that when we talk about information theory we are not addressing the meaning or intelligence of the transmittal -- we are merely talking about getting it there cheaply and safely.
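
And here is point 1 in miniature: a toy simulation of my own devising (not a standard reference algorithm) in which a noisy channel flips bits, and naively repeating each bit five times, then taking a majority vote on arrival, buys reliability at the cost of bulk:

    import random

    random.seed(1)  # reproducible illustration

    def noisy_channel(bit, flip_prob=0.2):
        # The channel delivers the bit but flips it 20% of the time.
        return bit ^ (random.random() < flip_prob)

    def send_with_repetition(bit, repeats=5):
        # Naive channel code: send the bit repeatedly, majority-vote on arrival.
        received = [noisy_channel(bit) for _ in range(repeats)]
        return int(sum(received) > repeats // 2)

    trials = 10_000
    bare_errors = sum(noisy_channel(1) != 1 for _ in range(trials))
    coded_errors = sum(send_with_repetition(1) != 1 for _ in range(trials))
    print("bare error rate: ", bare_errors / trials)   # about 0.20
    print("coded error rate:", coded_errors / trials)  # about 0.06

Real channel codes (the kind behind DSL lines and deep-space probes, per the Wikipedia passage above) are far cleverer than brute repetition, but the goal is the same: get the message there intact.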

What Is the Objective of Information Theory?

The Holy Grail of Information Theory is to get the greatest amount of information from sender to receiver in the smallest possible container at the fastest possible speed. And we must remember that this challenge is completely separate from making certain that the message sent conveys precisely to the recipient what we want to say.
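
Shannon put a hard floor under 'the smallest possible container': no lossless code can average fewer bits per symbol than the entropy of the source. A quick sketch, using hypothetical symbol frequencies of my own choosing:

    import math

    # Hypothetical symbol frequencies, invented for illustration only.
    probs = {"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125}

    # Naive fixed-length coding spends 2 bits on every symbol (4 symbols).
    fixed_bits = math.ceil(math.log2(len(probs)))

    # Source coding theorem: entropy is the lower bound on bits per symbol.
    entropy_bits = -sum(p * math.log2(p) for p in probs.values())

    print("Fixed-length code: ", fixed_bits, "bits/symbol")        # 2
    print("Entropy lower bound: %.2f bits/symbol" % entropy_bits)  # 1.75

For this particular distribution, a Huffman code (A=0, B=10, C=110, D=111) meets the 1.75-bit bound exactly; in general you can get arbitrarily close to the entropy, but never below it.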

Think on that, folks!

Douglas E. Castle for The Braintenance Blog, The Sending Signals Blog and The Daily Burst Of Brilliance Blog

Please retweet and repost this article to your social media.

