It is safe to assume that the reader has encountered the term Information Theory by this time in the Digital Age. However, unless one has done one's homework, one probably misconstrues what is meant by Information.
In terms of etymology and usage, information means the significant data necessary to generate a model of an event, process, or object; to form, especially in the abstract intellectual sense. The root, inform, implies communication of that data, which after Kuhn has come to mean communication of theory, apparatus, and reserved words.
In the sense of Information Theory, information means the reduction of uncertainty. While that is the net result of the generation and communication of information, it is not the traditional usage of the word information but rather of the word certainty.
When one uses Information Theory one is actually using Certainty Theory, in usage relevant to the language. Why did Claude Shannon create this definition outside common usage? Reduction of uncertainty is the net result of the information process, and his predecessors, theorizing in the field of artificial intelligence, used the term intelligence, which Shannon dressed down to information. His work was less ambitious than theirs, although it shaped the Digital Age we all live in, but it carries in that radical redefinition, which is the stuff of engineering genius, the philosophical baggage of the field in which he worked.
The first thing one learns in philosophy is that one says this construct in these words; to say a thing in other words is to say something else. That is philosophical communication theory, and Shannon violated it, introducing confusion where there didn't have to be any. When someone says Information Theory, think Certainty Theory, and when someone says the bit is the basic unit of information, think basic unit of certainty, and it all falls into place.
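The reading urged here can be made concrete. In Shannon's formalism the bit measures uncertainty removed, not meaning conveyed, which is exactly the "certainty" sense argued above. A minimal sketch in Python (the function name is illustrative, not from any particular library):

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: the expected amount of uncertainty
    resolved when the outcome of the distribution becomes known."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: learning the outcome removes exactly 1 bit of uncertainty.
print(entropy_bits([0.5, 0.5]))    # 1.0

# A heavily biased coin: the outcome is nearly certain in advance,
# so learning it removes much less than 1 bit.
print(entropy_bits([0.99, 0.01]))
```

Note that the biased coin still communicates a symbol either way; what shrinks is only the uncertainty it resolves, which is why "certainty" tracks the mathematics better than the everyday word "information."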
Do Well and Be Well.