In information theory
information has no meaning.
Instead, it is a set of all possible
meaning narrowed down in neat,
logarithmic function. A pattern
running parallel to us. A pattern
Since when were engineers
so expert in the philosophical?
Information has no meaning.
Wittgenstein himself could
only describe language.
He could not define them.
Shannon has outdone them all.
You are all the information
you are, and what you summon
is the logarithm, the pattern,
the language game. One of many.
You’re an adherent to iteration.
No meaning except in repetition
or else no meaning at all.
When an engineer has a theory
it’s testable. Look around you.
Where is there not information?
When can you not summon it up
pluck it from the wind like fruit
weightless, abundant, and free?
But whence comes meaning
if no longer in information—
if all that I am, all I do, is only
a performing over again and
again for that which teaches me?
Where’s space for meaning at all
in an automatic, tessellating body?
This is part of a series I’ve been working on that uses the metaphor of different stages of civilization to explore characteristics of language.
I wrote this after reading Claude Shannon’s landmark 1948 paper “A Mathematical Theory of Communication,” which is considered the foundation for much of our technology today. Here Shannon introduces us to the concept of “bits,” the basic elements of communication, and a definition of information that is radically different from our classical one. “Frequently,” Shannon says, “messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages.” From here Shannon goes on to develop a framework for transmitting, processing, and extracting information with virtually no loss of integrity to a message, without which we would not have, among other things, CDs, mobile phones, or the Internet.
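Shannon’s measure counts how many possible messages a source could have sent, on a logarithmic scale, with no reference to what any of them mean. As a minimal sketch (the function name and the coin examples are mine, not Shannon’s), his entropy formula can be computed like this:

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over the possible messages."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin: two equally likely messages carry exactly 1 bit per toss.
print(entropy_bits([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each toss carries less information.
print(entropy_bits([0.9, 0.1]))   # ~0.47
```

Note that nothing in the calculation touches the content of the messages, only how many there are and how likely each one is, which is exactly the distinction the poem turns on.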
I find this insight no less profound than if it came from a philosopher trying to understand meaning itself (which is what I believe Wittgenstein was attempting in his investigations into language). While Shannon was interested in solving the practical problem of long-distance communication (the paper was published in the Bell System Technical Journal), he nevertheless draws a distinction between information and meaning that has great philosophical import, and which seems to hold in practice: the boons of the information age wouldn’t be possible without it. It’s therefore at this level of observation that we should begin any discussion about meaning. Such a thing is easy enough to test. The next time you find yourself reading an article online, or watching a video, or scrolling through the comments of a thread, stop and ask yourself where the meaning you perceive yourself to be acquiring comes from. Does it come from the information itself? Does it come from the way you process that information? What does it even mean to make such a distinction?
Thank you for reading!
(The artwork was taken from an art exhibition called Mapping the Information Age, which took place from 2015 to 2016 at the Christopher W. Mount Gallery in West Hollywood. It is a large, reprinted microchip circuitry diagram.)