
H = - Ʃ p(x)logp(x) (Information Theory)

by Bradley E Hoge

Photograph by Mariola Grobelska

H is entropy, as in thermodynamics, but in this context 1s and 0s

 

(or remnants of analog—frequencies and amplitudes,

but mostly 1s and 0s)

 

Nowadays, all information in the universe

can be reduced to 1s and 0s.

 

And the odds of finding Intelligent life

can be calculated from these numbers—

 

Used in the context of information, entropy refers to loss

of information sent from a source to a receiver.

 

billions of planets containing Earth-like candidates—

 

It measures the likelihood that the intended signal

can be extracted from randomness

 

(static)

 

Or, perhaps, the number of monkeys typing.

billions of galaxies per multiple universes.

 

Or, how all living systems (cells, bodies, ecosystems)

increase entropy.

 

Entropy is not random, nor chaotic, but rather the lowest energy state.

 

Surely some harbored seeds. And surely a few evolved

intelligence—technology.

 

The negative of the sum indicates the loss of information.

 

Even if life is fitful, random walk must produce

advanced civilizations.

 

It seems counter-intuitive to calculate the usefulness of information

as the reciprocal of loss,

 

but that’s often how the world works

 

(nowadays).

 

We often know something exists only by the absence

of something else.

 

Or, rather, what’s real is what can’t be determined

without loss.

 

Some sum must have survived atomic age

and solved hurdles of mystical thinking—

tribal id—and instincts for self-preservation.

 

Ʃ is shorthand for add it all up.

 

So, where are they? Why haven’t they been found?

What part of Drake’s equation is unsound?

 

p means probability, p(x) is the probability

of any discrete bit being measured.

 

We’re talking about communication after all,

and the biggest fallacy of communication

is the assumption that it has occurred.

 

Perhaps no world survives atomic age.

Perhaps no species survives AI.

 

Our emergence is in progress—our stage in Act IV—

Stone age—bronze age—space race—

Eye of God in quantum particles.

 

The log is not the recording of a starship Captain’s daily diary.

It’s not a section of tree floating down river.

Not even piece of a candy bar.

 

Log, in this case, is short for logarithm.

 

Horror of expecting one truth yet unearthing another.

 

Stay with me, cuz this is confusing: the logarithm is an exponent

 

(as in i² = -1),

 

that must be calculated for one variable to produce an expected value

for another variable, the base

 

(as in base instincts, as in primitive urges,

as in the bottom of a vase or Homeplate).

 

If we can outlive furor of consilience, perhaps our beginning

of Act V—our Star Trek future—may dawn.

 

It gets worse. A log can be calculated using any base, as in 10 or 2 or e

 

(Euler’s number = 2.71828)

 

 … with the … meaning…

march on to infinity without repeating,

making it irrational like π—

 

Or perhaps our light cone is too small. Perhaps our search is moot.

Equation’s flaw merely too great a distance for our call

to be heard. For neighbors to be seen.

 

irrational and imaginary numbers (i) are integral

 

(not the math term, but needed, or must have)

 

to the existence of the universe—

 

(go figure).

 

Got it?

 

We are simply still primitive beings.

So, for any random variable (x), H(x) is the sum of all possible

values for a single variable.

 

Human cultures have steeped in dark forests

throughout the range of human history.

 

How else can we explain genocide,

lest attributed to such cruel game theory?

 

In other words, information theory does not directly tell us

how to remember telephone numbers, names, passwords,

or the order of words in a poem.

 

How difficult is it to assume dark forest extends beyond our meager realm?

It is simply how reliable the interpretation is of one single

value being transmitted from a source to a receiver.

 

Our burgeoning awareness of such stark reality leading to this theorem?

 

Our history proves such axioms true.

Justifying our fears of alien invasion

as simply logical view of our fate,

once our threat is evident.

 

And so, to Ʃ it all up, entropy of information

follows the same formula as entropy of heat.

 

(kinda)

 

Too late— we have broadcast our signature—

There is no time left to change human nature.

 

As entropy increases for either thermodynamics or information,

the more we are left in the cold.

About the Author

Bradley Earle Hoge’s book, The Drake Equation was published by VRÆYDA Press. Nebular Hypothesis was published by Cawing Crow Press. His poetry appears in numerous anthologies and journals. He has been a teacher, a children’s museum curator, a college professor, and a vagabond.
