D1.1 — Information Definition

⚡ At a Glance

  • Claim: Information is that which reduces uncertainty about a system state.
  • Category: Information Theory / Epistemology
  • Depends On: 001_A1.1_Existence, 002_A1.2_Distinction
  • Enables: 005_D1.2_Bit-Definition, 038_D5.2_Integrated-Information-Phi
  • Dispute Zone: Objective Physics vs. Subjective Interpretation
  • Theology? ❌ No (Foundational definition)
  • Defeat Test: Show a case where information is gained without reducing uncertainty.

🧠 Why This Matters (The Story)

The Resolution of Chaos.

Imagine you are standing in a dark room. You know there is a chair somewhere, but you don’t know where. This is a state of Uncertainty. When you turn on a flashlight and see the chair, your uncertainty vanishes. What changed? You didn’t “add stuff” to the room; you added Information.

D1.1 defines Information not as a “thing” like a rock or a gas, but as a Relationship. It is the measure of how much “Fuzziness” is removed from the world. Without this definition, we can’t talk about entropy, order, or life. Information is the light that turns a cloud of “maybe” into a solid “is.”


🔒 Formal Statement

Information is operationally defined as that which reduces uncertainty about the state of a system. $H(X) = -\sum_{x} p(x) \log_2 p(x)$
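
As an illustration only, here is a minimal Python sketch (not part of the source text) that computes $H(X)$ for a discrete distribution; the function name shannon_entropy and the example probabilities are assumptions chosen for the demo.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) in bits: the uncertainty held in a discrete distribution."""
    # Terms with p(x) = 0 contribute nothing, so they are skipped.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin holds 1 bit of uncertainty; a heavily biased coin holds less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # ≈ 0.47
```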


🟦 Definition Layer

What we mean by the terms.

Information ($I$): [Standard: Shannon] The resolution of uncertainty. The quantity required to specify a single state from a set of possible states.

Uncertainty ($H$): [Standard: Boltzmann/Shannon] The measure of indeterminacy or “possibility space” before a message is received.
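
As a worked illustration (assuming the equiprobable case), specifying one state out of $N$ equally likely possibilities requires

$$ H = \log_2 N \qquad \text{e.g. } N = 8 \;\Rightarrow\; H = \log_2 8 = 3 \text{ bits.} $$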


🧭 Category Context (The Judge)

Orientation for the Debate.

Primary Category: Information Theory
Dispute Zone: Objective Physics vs. Subjective Meaning.

If you object to this axiom, you are likely objecting to:

  • Subjectivism: “Information only exists if a human is there to understand it.”
  • Semanticism: “Information requires meaning; Shannon entropy is just math.”

🔗 Logical Dependency

The Chain of Custody.

Predicated Upon (Assumes):

  • 001_A1.1_Existence
  • 002_A1.2_Distinction

🟨 Logical Structure

The Derivation.

  1. Premise 1: A system can be in one of many possible states.
  2. Premise 2: Not knowing the specific state is the definition of uncertainty.
  3. Observation: Learning the state removes the uncertainty.
  4. Conclusion: The measure of that removal is the measure of Information (see the sketch below).
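
A minimal sketch of this derivation in Python, using the dark-room story above and assuming four equally likely chair positions (the variable names are illustrative):

```python
import math

# Before the flashlight: four equally likely chair positions.
H_before = math.log2(4)   # 2.0 bits of uncertainty

# After the flashlight: exactly one position remains possible.
H_after = math.log2(1)    # 0.0 bits of uncertainty

# Information gained is the uncertainty that was removed.
information_gained = H_before - H_after
print(information_gained)  # 2.0 bits
```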

🟩 Formal Foundations (Physics View)

The Math & Theory.

Scientific Concept: Shannon Entropy. Claude Shannon showed that the only function that consistently measures uncertainty (one that is continuous, grows with the number of equally likely states, and adds across independent choices) is, up to a constant factor, the negative logarithmic sum of probabilities.

Equation / Law: Von Neumann Entropy: $$ S(\rho) = -\mathrm{Tr}(\rho \log \rho) $$ The quantum version of the information definition, which measures how far a quantum state is from being pure.
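
A minimal numpy sketch (an illustration, not part of the source) that evaluates $S(\rho)$ from the eigenvalues of a density matrix; the example states are a pure qubit and a maximally mixed qubit, with entropy reported in bits:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho (in bits)."""
    eigenvalues = np.linalg.eigvalsh(rho)           # rho is Hermitian
    eigenvalues = eigenvalues[eigenvalues > 1e-12]  # zero eigenvalues contribute nothing
    # Clamp tiny negative values caused by floating-point rounding.
    return max(0.0, float(-np.sum(eigenvalues * np.log2(eigenvalues))))

pure  = np.array([[1.0, 0.0], [0.0, 0.0]])   # pure state: S = 0 bits
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])   # maximally mixed qubit: S = 1 bit
print(von_neumann_entropy(pure), von_neumann_entropy(mixed))
```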


🧪 Evidence Layer (Empirical View)

The Verification.

  • Communication Limits: Every cell phone and internet cable in the world obeys this definition. We cannot cram more “certainty” through a wire than the Shannon Limit allows.
  • Landauer’s Bound: Physical experiments confirm that erasing a recorded bit (undoing this “uncertainty reduction”) generates a minimum quantity of heat, proving information is physical (see the worked figure below).
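
As a rough worked figure (assuming room temperature, $T \approx 300\ \mathrm{K}$), Landauer's bound puts the minimum heat released per erased bit at

$$ E_{\min} = k_B T \ln 2 \approx (1.38 \times 10^{-23}\ \mathrm{J/K})(300\ \mathrm{K})(0.693) \approx 2.9 \times 10^{-21}\ \mathrm{J}. $$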

📜 Canonical Sources (Authority View)

The Pedigree.

“Information is the resolution of uncertainty.” — Claude Shannon, A Mathematical Theory of Communication (1948)


🟥 Metaphysical Commitment (Theology View)

The Meaning.

Theological Interpretation: This axiom maps perfectly to the creative act of God in Genesis. Creation begins with Speech (The Logos) entering the “formless and void” (Maximum Uncertainty) and separating light from dark (The first Distinction). To define information this way is to recognize that God’s Word is the primary “Uncertainty-Reducer” of the cosmos.


💥 Defeat Conditions

How to break this link.

To falsify this axiom, you must:

  1. Provide a coherent definition of information that does not reference the narrowing of possibility space.
  2. Demonstrate a case where “Meaning” is conveyed without any underlying reduction in state-uncertainty.