Figure 9 is a picture of icons cut in stone as they appear in the graves of pharaohs and on obelisks of ancient Egypt. The question is whether these pictures represent information or not. So, let us check them against the three necessary conditions (NC) for identifying information (discussed in more detail in paragraph 4.2):
NC 1: A number of symbols are required to establish information. This first condition is satisfied because the inscription contains a variety of different symbols, such as an owl, water waves, a mouth, reeds, etc.
NC 2: The sequence of the symbols must be irregular. This condition is also satisfied, as there are no regularities or periodic patterns.
NC 3: The symbols must be written in some recognizable order, such as drawn, printed, chiseled, or engraved in rows, columns, circles, or spirals. In this example, the symbols appear in columns.
It now seems possible that the given sequence of symbols might comprise information because all three conditions are met, but it could also be possible that the Egyptians simply loved to decorate their monuments. They could have decorated their walls with hieroglyphics,1 just as we often hang carpets on walls. The true nature of these symbols remained a secret for 15 centuries because nobody could assign meanings to them. This situation changed when one of Napoleon’s men discovered a piece of black basalt near the town of Rosetta on the Nile in July 1799. This flat stone, about the size of a tabletop, was exceptional because it contained inscriptions in three scripts: 54 lines of Greek, 32 lines of Demotic, and 14 lines of hieroglyphics. The total of 1,419 hieroglyphic symbols includes 166 different ones, and there are 468 Greek words. This stone, known as the Rosetta Stone (Figure 10), is now in the possession of the British Museum in London. It played a key role in the deciphering of hieroglyphics, the first success of which was the translation of an Egyptian pictorial text in 1822.2
Because the meaning of the entire text was found, it was established that the hieroglyphics really represented information. Today, the meanings of the hieroglyphic symbols are known, and anybody who knows this script is able to translate ancient Egyptian texts. Since the meaning of the codes is known, it is now possible to transcribe English text into hieroglyphics, as is shown in Figure 11, where the corresponding symbols have been produced by means of a computer/plotter system.
This illustrative example has now clarified some basic principles about the nature of information. Further details follow.
When considering a book B, a computer program C, or the human genome (the totality of genes), we first discuss the following questions:
–How many letters, numbers, and words make up the entire text?
–How many single letters does the employed alphabet contain (e.g., a, b, c, . . . , z, or G, C, A, T)?
–How frequently do certain letters and words occur?
To answer these questions, it is immaterial whether we are dealing with actual meaningful text, with pure nonsense, or with random sequences of symbols or words. Such investigations are not concerned with the contents, but only with statistical aspects. These topics all belong to the first and lowest level of information, namely the level of statistics.
As explained fully in appendix A1, Shannon’s theory of information is suitable for describing the statistical aspects of information, e.g., those quantitative properties of languages which depend on frequencies. Nothing can be said at this level about whether a given sequence of symbols is meaningful or not, and the question of grammatical correctness is also completely excluded. Conclusions:
Definition 1: According to Shannon’s theory, any random sequence of symbols is regarded as information, without regard to its origin or whether it is meaningful or not.
Definition 2: The statistical information content of a sequence of symbols is a quantitative concept, measured in bits (binary digits).
According to Shannon’s definition, the information content of a single message (which could be one symbol, one sign, one syllable, or a single word) is a measure of the improbability of that message: the less probable it is, the greater its information content. Since probabilities range from 0 to 1, this logarithmic measure is always nonnegative. The information content of a number of messages (signs, for example) is found by adding their individual information contents, as required by the condition of summability. An important property of information according to Shannon is:
Theorem 4: A message which has been subjected to interference or “noise” generally comprises more information than an error-free message.
This theorem follows from the larger number of possible alternatives in a distorted message, since Shannon’s information content increases with the number of symbols (see equation 6 in appendix A1). It is obvious that the actual information content cannot be described in such terms at all, as the following example should make clear: when somebody uses many words to say practically nothing, this message is accorded a large information content simply because of the large number of letters used, whereas if somebody else, who is really knowledgeable, concisely expresses the essentials, his message is assigned a much lower information content.
Some quotations concerning this aspect of information are as follows: French President Charles de Gaulle (1890–1970) said, “The Ten Commandments are so concise and plainly intelligible because they were compiled without first having a commission of inquiry.” Another philosopher said, “There are about 35 million laws on earth to validate the Ten Commandments.” A certain representative in the American Congress concluded, “The Lord’s Prayer consists of 56 words, and the Ten Commandments contain 297 words. The Declaration of Independence contains 300 words, but the recently published ordinance about the price of coal comprises no fewer than 26,911 words.”
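The purely quantitative character of Shannon’s measure can be made concrete with a short Python sketch. It assumes, for simplicity, an alphabet of q equally probable symbols, so that each symbol contributes log₂(q) bits; the function name and the sample messages are illustrative only.

```python
import math

# Shannon's measure under the simplifying assumption of q equally
# probable symbols: each symbol contributes log2(q) bits, so the
# total grows with message length, regardless of meaning.
def shannon_bits(message, q=26):
    return len(message) * math.log2(q)

verbose = "THISLONGMESSAGESAYSALMOSTNOTHINGATALL"
concise = "ESSENTIALS"

print(f"{shannon_bits(verbose):.1f} bits")  # ~173.9 bits for 37 letters
print(f"{shannon_bits(concise):.1f} bits")  # ~47.0 bits for 10 letters
```

The verbose message scores almost four times higher than the concise one, whatever either of them actually says.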
Theorem 5: Shannon’s definition of information exclusively concerns the statistical properties of sequences of symbols; meaning is completely ignored.
It follows that this concept of information is unsuitable for evaluating the information content of meaningful sequences of symbols. We now realize that an appreciable extension of Shannon’s information theory is required for a meaningful evaluation of information and information processing, in both animate and inanimate systems. The concept of information and the five levels required for a complete description are illustrated in Figure 12. This diagram can be regarded as a nonverbal description of information. In the greatly extended description and definition that follows, Shannon’s theory is useful only for describing the statistical level of real information (see chapter 5).
When considering the book B mentioned earlier, it is obvious that the letters do not appear in random sequences. Combinations like “the,” “car,” “father,” etc. occur frequently, but we do not find other possible combinations like “xcy,” “bkaln,” or “dwust.” In other words:
Only certain combinations of letters are allowed (agreed-upon) English words. Other conceivable combinations do not belong to the language. It is also not a random process when words are arranged in sentences; the rules of grammar must be adhered to.
Both the construction of words and the arrangement of words in sentences to form information-bearing sequences of symbols are subject to quite specific rules based on deliberate conventions for each and every language.3
Definition 3: Syntax is meant to include all structural properties of the process of setting up information. At this second level, we are only concerned with the actual sets of symbols (codes) and the rules governing the way they are assembled into sequences (grammar and vocabulary) independent of any meaning they may or may not have.
Note: It has become clear that this level consists of two parts, namely:
A) Code: Selection of the set of symbols used
B) The syntax proper: inter-relationships among the symbols
A set of symbols is required for the representation of information at the syntax level. Most written languages use letters, but a very wide range of conventions exists: Morse code, hieroglyphics, international flag codes, musical notes, various data processing codes, genetic codes, figures made by gyrating bees, pheromones (scents) released by insects, and hand signs used by deaf-mute persons.
Several questions are relevant: What code should be used? How many symbols are available? What criteria are used for constructing the code? What mode of transmission is suitable? How could we determine whether an unknown system is a code or not?
The number of different symbols q employed by a coding system can vary greatly and depends strongly on the purpose and the application. In computer technology, only two switch positions are recognized, so binary codes comprising only two different symbols were created. Quaternary codes, comprising four different symbols, are involved in all living organisms; the reason why four symbols represent an optimum in this case is discussed in chapter 6. The various alphabet systems used by different languages consist of from 20 to 35 letters, a number sufficient for representing all the sounds of the language concerned. Chinese writing is not based on elementary sounds; instead, pictures are employed, each of which represents a single word, so the number of different symbols is very large. Some examples of coding systems with the required number of symbols are listed below (a short computational illustration follows the list):
–Binary code (q = 2 symbols, all electronic DP codes)
–Ternary code (q = 3, not used)
–Quaternary code (q = 4, e.g., the genetic code consisting of four letters: A, C, G, T)
–Quinary code (q = 5)
–Octal code (q = 8 octal digits: 0, 1, 2, . . . , 7)
–Decimal code (q = 10 decimal digits: 0, 1, 2, . . . , 9)
–Hexadecimal code4 (q = 16 HD digits: 0, 1, 2, . . . , E, F)
–Hebrew alphabet (q = 22 letters)
–Greek alphabet (q = 24 letters)
–Latin alphabet (q = 26 letters: A, B, C, . . . , X, Y, Z)
–Braille (q = 26 letters)
–International flag code (q = 26 different flags)
–Russian alphabet (q = 32 Cyrillic letters)
–Japanese Katakana writing (q = 50 symbols representing different syllables)
–Chinese writing (q > 50,000 symbols)
–Hieroglyphics (in the time of Ptolemy: q = 5,000 to 7,000; Middle Kingdom, 12th Dynasty: q = approximately 800)
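Under Shannon’s measure, a code with q symbols can carry at most log₂(q) bits per symbol, the maximum being attained when all symbols are equally probable. The following Python sketch, offered only as an illustration, computes this bound for a few of the coding systems listed above.

```python
import math

# Maximum bits per symbol, log2(q), assuming equally probable symbols.
codes = {
    "Binary code": 2,
    "Quaternary (genetic) code": 4,
    "Hexadecimal code": 16,
    "Latin alphabet": 26,
    "Russian (Cyrillic) alphabet": 32,
    "Japanese Katakana": 50,
}

for name, q in codes.items():
    print(f"{name:28s} q = {q:2d} -> {math.log2(q):5.2f} bits/symbol")
```

The binary code thus carries at most 1 bit per symbol, the quaternary genetic code 2 bits, and the Latin alphabet about 4.7 bits.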
Coding systems are not created arbitrarily, but they are optimized according to criteria depending on their use, as is shown in the following examples:
The choice of code depends on the mode of communication. If a certain mode of transmission has been adopted for technological reasons, based on some physical or chemical phenomenon, then the code must comply with the corresponding requirements. In addition, the ideas of the sender and the recipient must be in tune with one another to guarantee certainty of transmission and reception (see Figures 14 and 15). The most complex setups of this kind are again found in living systems. Various existing types of special message systems are reviewed below:
Acoustic transmission (conveyed by means of sounds):
–Natural spoken languages used by humans
–Mating and warning calls of animals (e.g., songs of birds and whales)
–Mechanical transducers (e.g., loudspeakers, sirens, and fog horns)
–Musical instruments (e.g., piano and violin)
Optical transmission (carried by light waves):
–Written languages
–Technical drawings (e.g., for constructing machines and buildings, and electrical circuit diagrams)
–Technical flashing signals (e.g., identifying flashes of lighthouses)
–Flashing signals produced by living organisms (e.g., fireflies and luminous fishes)
–Flag signals
–Punched cards, mark sensing
–Universal product code, postal bar codes
–Hand movements, as used by deaf-mute persons, for example
–Body language (e.g., mating dances and aggressive stances of animals)
–Facial expressions and body movements (e.g., mime, gesticulation, and deaf-mute signs)
–Dancing motions (bee gyrations)
Tactile transmission (Latin tactilis = sense of touch; signals: physical contact):
–Braille writing
–Musical rolls, barrel of barrel-organ
Magnetic transmission (carrier: magnetic field):
–Magnetic tape
–Magnetic disk
–Magnetic card
Electrical transmission (carrier: electrical current or electromagnetic waves):
–Telephone
–Radio and TV
Chemical transmission (carrier: chemical compounds):
–Genetic code (DNA, chromosomes)
–Hormonal system
Olfactory transmission (Latin olfacere = smelling, employing the sense of smell; carrier: chemical compounds):
–Scents emitted by gregarious insects (pheromones)
Electro-chemical transmission:
–Nervous system
In the case of an unknown system, it is not always easy to decide whether one is dealing with a real code or not. Having discussed hieroglyphics as an example, we now state and explain the conditions required for a code. The following are necessary conditions (NC), all three of which must be fulfilled simultaneously for a given set of symbols to be a code:
NC 1: A uniquely defined set of symbols is used.
NC 2: The sequence of the individual symbols must be irregular.
Examples (aperiodic sequences):
–.– – –.– * – – * * . – .. –
qrst werb ggtzut
Counterexamples:
– – –...– – –...– – –...– – –... (periodic)
– – – – – – – – – – – – – – (the same symbol constantly repeated)
r r r r r r r r r r r r r r r r r r r
NC 3: The symbols appear in clearly distinguishable structures (e.g., rows, columns, blocks, or spirals).
In most cases a fourth condition is also required:
NC 4: At least some symbols must occur repeatedly.
It is difficult to construct meaningful sentences without using some letters more than once.5 Such sentences are often rather grotesque, for example:
Get nymph; quiz sad brow; fix luck (i, u used twice, j, v omitted).
In a competition held by the Society for the German Language, long single words with no repetitions of letters were submitted. The winner, comprised of 24 letters, was: Heizölrückstoßabdämpfung (Note that a and ä, for example, are regarded as different letters because they represent different sounds.)
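NC 4 can be made tangible with a small Python sketch (an illustration only; the function name is hypothetical): it lists the letters that a text uses more than once, counting ä, ö, ü, and ß as letters in their own right, as noted above.

```python
# Sketch: list the letters that a word or sentence uses more than once.
# Umlauts and ß count as distinct letters, as noted in the text.
def repeated_letters(text):
    letters = [c for c in text.lower() if c.isalpha()]
    return sorted({c for c in letters if letters.count(c) > 1})

print(repeated_letters("Heizölrückstoßabdämpfung"))            # [] -> no repetitions
print(repeated_letters("Get nymph; quiz sad brow; fix luck"))  # ['i', 'u']
```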
There is only one sufficient condition (SC) for establishing whether a given set of symbols is a code:
SC 1: It can be decoded successfully and meaningfully (e.g., hieroglyphics and the genetic code).
There are also sufficient conditions for showing that we are not dealing with a code system. A sequence of symbols cannot be a code if:
–it is known to have been produced by a purely random process (see Example 1), or
–it is strictly regular or periodic, so that NC 2 is violated (see Example 2).
Example 1: Randomly generated characters: AZTIG KFD MAUER DFK KLIXA WIFE TSAA. Although the German word MAUER (wall) and the English word WIFE may be recognized, this is not a code according to our definition, because we know that it is a random sequence.
Example 2: In the Kornberg synthesis (1955), the enzyme DNA polymerase, isolated from E. coli bacteria, was allowed to react in vitro. After a considerable time, two kinds of strands were found:
... TATATATATATATATATATATATAT ...
... ATATATATATATATATATATATATA ...
... GGGGGGGGGGGGGGGGGGGGGG ...
... CCCCCCCCCCCCCCCCCCCCCCCC ...
Although the two types of strands together contained all four symbols employed in the genetic code, they were nevertheless devoid of information, since necessary condition NC 2 is not fulfilled.
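Necessary condition NC 2 lends itself to a mechanical test. The following Python sketch, again only an illustration, flags strictly periodic sequences such as the Kornberg strands by searching for a repeating unit at most half as long as the sequence.

```python
# Test of NC 2: a sequence is strictly periodic if some unit of length
# <= half the sequence repeats throughout (possibly truncated at the end).
def is_periodic(seq):
    n = len(seq)
    for period in range(1, n // 2 + 1):
        if all(seq[i] == seq[i % period] for i in range(n)):
            return True
    return False

print(is_periodic("TATATATATATATATATATATATAT"))     # True:  NC 2 violated
print(is_periodic("GGGGGGGGGGGGGGGGGGGGGG"))        # True:  NC 2 violated
print(is_periodic("AZTIGKFDMAUERDFKKLIXAWIFETSAA")) # False: NC 2 satisfied
```

Note that passing this test is only necessary, not sufficient: the random string of Example 1 passes it and is still not a code.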
The fundamentals of the “code” theme were established by the author in the now out-of-print book bearing the same name as the present one [G5, German title: Am Anfang war die Information]. A code always represents a mental concept and, according to our experience, its assigned meaning always depends on some convention. It is thus possible to determine, already at the code level, whether any given system originated from a creative mental concept or not.
We are now in a position to formulate some fundamental empirical theorems:6
Theorem 6: A code is an essential requirement for establishing information.
Theorem 7: The allocation of meanings to the set of available symbols is a mental process depending on convention.7
Theorem 8: If a code has been defined by a deliberate convention, it must be strictly adhered to afterward.
Theorem 9: If the information is to be understood, the particular code must be known to both the sender and the recipient.
Theorem 10: According to Theorem 6, only structures which are based on a code can represent information. This is a necessary but not sufficient condition for the establishment of information.
Theorem 11: A code system is always the result of a mental process; it requires an intelligent origin or inventor.8
The expression “rejoice” appears in different languages and coding systems in Figure 13. This leads to another important empirical theorem:
Theorem 12: Any given piece of information can be represented by any selected code.
Comment: Theorem 12 does not state that a complete translation is always possible. It is an art to suitably translate and express metaphors, twists of logic, ambiguities, and special figurative styles into the required language.
It is possible to formulate fundamental principles of information even at the relatively low level of codes by means of the above theorems. If, for example, one finds a code underlying any given system, then one can conclude that the system had a mental origin. In the case of the hieroglyphics, nobody suggested that they were caused by a purely physical process like random mechanical effects, wind, or erosion; Theorem 11 is thus validated.
The following is a brief list of some properties common to all coding systems:
–A code is a necessary prerequisite for establishing and storing information.
–Every choice of code must be well thought out beforehand in the conceptual stage.
–Devising a code is a creative mental process.
–Matter can be a carrier of codes, but it cannot generate any codes.
Definition 4: The actual syntax describes the construction of sentences and phrases, as well as the structural media required for their formation. The set of possible sentences of a language is defined by means of a formalized or formalizable assemblage of rules. This comprises the morphology, phonetics, and vocabulary of the language.
The following questions are relevant:
–Which of the possible combinations of symbols are actual defined words of the language (lexicon and notation)?
–How should the words be arranged (construction of the sentences, word placement, and stylistics), linked with one another, and be inflected to form a sentence (grammar)?
–What language should be used for this information?
–Which special modes of expression are used (stylistics, aesthetics, precision of expression, and formalisms)?
–Are the sentences syntactically correct?
–Does the recipient understand the language? (Understanding the contents is not yet relevant.)
The following two sample sentences illustrate the syntax level once again:
A: The bird singed the song.
B: The green freedom prosecutes the thinking house.
Sentence B is perfectly correct syntactically, but it is semantically meaningless. In contrast, the semantics of sentence A is acceptable, but its syntax is erroneous.
By the syntax of a language we mean all the rules which describe how individual language elements can and should be combined. The syntax of natural languages is much more complex (see appendix A2) than that of formal artificial languages. The syntactic rules of an artificial language must be complete and unambiguous because, for example, a compiler program which translates written programs into computer code cannot consult the programmer to clarify what was meant.
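How rigid such formal rules are can be shown with a toy example in Python (illustrative only; the lexicon and function are hypothetical): a miniature grammar that accepts exactly the sentences of the form article–noun–verb, judging word order alone and ignoring meaning entirely.

```python
# Toy sketch of a formal syntax: a sentence must be ARTICLE NOUN VERB.
# The checker judges syntax only; meaning plays no role at this level.
LEXICON = {
    "the": "ARTICLE", "a": "ARTICLE",
    "bird": "NOUN", "stone": "NOUN", "freedom": "NOUN",
    "sings": "VERB", "sleeps": "VERB",
}

def is_syntactic(sentence):
    tags = [LEXICON.get(word) for word in sentence.lower().split()]
    return tags == ["ARTICLE", "NOUN", "VERB"]

print(is_syntactic("The bird sings"))      # True
print(is_syntactic("The freedom sleeps"))  # True: correct syntax, dubious semantics
print(is_syntactic("Bird the sings"))      # False: word order violates the rule
```

Like the compiler mentioned above, the checker accepts the semantically dubious second sentence without complaint; semantics lies on a higher level.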
Knowledge of the conventions applying to the actual encoding as well as to the allocation of meanings is equally essential for both the sender and the recipient. This knowledge is either transferred directly (e.g., by being introduced into a computer system or by being inherited in the case of natural systems), or it must be learned from scratch (e.g., mother tongue or any other natural language).
No person enters this world with the inherited knowledge of some language or some conceptual system. Knowledge of a language is acquired by learning the applicable vocabulary and grammar as they have been established in the conventions of the language concerned.
When we read the previously mentioned book B, we are interested neither in statistics about the letters nor in the grammar as such, but in the meaning of the contents. Symbol sequences and syntactic rules are essential for the representation of information, but the essential characteristic of the conveyed information is neither the selected code, nor the size, number, or form of the letters, nor the method of transmission (in writing, or as optical, acoustic, electrical, tactile, or olfactory signals); it is the message being conveyed, the conclusions, and the meanings (semantics). This central aspect of information plays no role in storage and transmission: the cost of a telegram, for example, does not depend on the importance of the message, but only on the number of letters or words. Both the sender and the recipient, however, are mainly interested in the meaning; it is the meaning that changes a sequence of symbols into information. So now we have arrived at the third level of information, the semantic level (Greek semantikós = characteristic, significance, aspect of meaning).
Typical semantic questions are:
a) Concerning the sender:
–What are the thoughts in the sender’s mind?
–What meaning is contained in the information being formulated?
–What information is implied in addition to the explicit information?
–What means are employed for conveying the information (metaphors, idioms, or parables)?
b) Concerning the recipient:
–Does the recipient understand the information?
–What background information is required for understanding the transmitted information?
–Is the message true or false?
–Is the message meaningful?
Theorem 13: Any piece of information has been transmitted by somebody and is meant for somebody. A sender and a recipient are always involved whenever and wherever information is concerned.
Comment: Many kinds of information are directed to one single recipient (like a letter) and others are aimed at very many recipients (e.g., a book, or newspaper). In exceptional cases, the information never reaches the recipient (e.g., a letter lost in the mail).
It is only at the semantic level that we really have meaningful information; thus, we may establish the following theorem:
Theorem 14: Any entity, to be accepted as information, must entail semantics; it must be meaningful.
Semantics is an essential aspect of information because the meaning is the only invariant property. The statistical and syntactical properties can be altered appreciably when information is represented in another language (e.g., translated into Chinese), but the meaning does not change.
Meanings always represent mental concepts; therefore, we have:
Theorem 15: When its progress along the chain of transmission events is traced backward, every piece of information leads to a mental source, the mind of the sender.
Sequences of letters generated by various kinds of statistical processes are shown in Figure 38 (appendix A1.5). The programs used for this purpose were partially able to reproduce some of the syntactic properties of the language, but in the light of Theorems 16 and 17 these sequences of letters do not represent information. The next theorem enables one to distinguish between information and noninformation:
Theorem 16: If a chain of symbols comprises only a statistical sequence of characters, it does not represent information.
Information is essentially linked to a sender (a mental source of information) according to Theorems 13 and 15. This result is independent of whether the recipient understands the information or not. When researchers studied Egyptian obelisks, the symbols were regarded as information long before they were deciphered, because it was obvious that they could not have resulted from random processes. Before the Rosetta Stone was found in 1799, no contemporary (recipient) could understand the meaning of the hieroglyphics, yet they were regarded as information nonetheless. The same holds for the gyrations of bees, which were only understood by humans after being deciphered by Karl von Frisch. In contrast, the meaning carried by the genetic code is still largely unknown, apart from the code assignments between the triplets and the amino acids.
All suitable ways of expressing meanings (mental substrates, thoughts, or nonmaterial contents of consciousness) are called languages. Information can be transmitted or stored in material media only when a language is available. The information itself is totally invariant with regard to the transmission system (acoustic, optical, or electrical) as well as the system of storage (brain, book, data processing system, or magnetic tape). This invariance is a result of its nonmaterial nature. There are different kinds of languages:
–natural languages (e.g., English, German, Chinese)
–artificial languages (e.g., programming languages, Morse code, flag codes, musical notation)
–languages found in living organisms (e.g., the genetic code, bee gyrations, pheromone signals)
A common property of all languages is that defined sets of symbols are used, and that definite agreed-upon rules and meanings are allocated to the single signs or language elements. Every language consists of units like morphemes, lexemes, expressions, and entire sentences (in natural languages) that serve as carriers of meaning (formatives). Meanings are internally assigned to the formatives of a language, and both the sender and the recipient should be in accord about these meanings. The following can be employed for encoding meanings in natural languages: morphology, syntax (grammar and stylistics), phonetics, intonation, and gesticulation, as well as numerous other supplementary aids like homonyms, homophones, metaphors, synonyms, polysemes, antonyms, paraphrasing, anomalies, metonymy, irony, etc.
Every communication process between sender and recipient consists of formulating and understanding the sememes (Greek sema = sign) in one and the same language. In the formulation process, the information to be transmitted is generated in a suitable language in the mind of the sender. In the comprehension process, the symbol combinations are analyzed by the recipient and converted into the corresponding ideas. In every case, the sender and the recipient are both intelligent beings, or the system concerned must have been created by an intelligent being (Figures 23 and 24, chapter 7).
Let us again consider book B mentioned initially to help us understand the nature of the next level. There is a Russian saying that “The effect of words can last one hour, but a book serves as a perpetual reminder.” Books can have lasting effects. After one has read a software manual, for example, one can use the described system. Many people who read the Bible are moved to act in entirely new ways. In this regard, Blaise Pascal said, “There are enough passages in Scripture to comfort people in all spheres of life, and there are enough passages that can horrify them.” Information always leads to some action, although, for our purposes, it is immaterial whether the recipient acts according to the sender’s wishes, responds negatively, or ignores it. It often happens that even a concise but striking promotional slogan for a washing powder can result in a preference for that brand.
Up to the semantic level, the purpose the sender pursues with the transmitted information has not been considered. Every transmission of information indicates, however, that the sender has some purpose in mind for the recipient. In order to achieve the intended result, the sender describes the actions required of the recipient for implementing the desired purpose. We have now reached an entirely new level of information, called pragmatics (Greek pragmatike = the art of doing the right thing; taking action).
Some examples of pragmatic aspects are:8
–What actions are desired of the recipient?
–Has a specific action been formulated explicitly, or should it be implicit?
–Is the action required by the sender to be taken in only one predetermined way, or is there some degree of freedom?
–To what extent does the received and understood meaning influence the behavior of the recipient?
–What is the actual response of the recipient?
Theorem 17: Information always entails a pragmatic aspect.
The pragmatic aspect could:
–be non-negotiable and unambiguous, without any degree of freedom, e.g., a computer program, activities in a cell, or a military command;
–allow a limited freedom of choice, like instinctive acts of animals;
–allow considerable freedom of action (only in the case of human beings).
Note: Even if there is considerable variation in the pragmatics resulting from the semantics, it does not detract anything from the validity of Theorem 17.
When language is used, it does not simply mean that sentences are jumbled together, but that requests, complaints, questions, instructions, teachings, warnings, threats, and commands are formulated to coerce the recipient to take some action. Information was defined by Werner Strombach [S12] as a structure which achieves some result in a receiving system. He thus referred to the important aspect of taking action.
We can distinguish several types of action:
–programmed actions (e.g., mechanical manufacturing processes, the operation of data processing programs, construction of biological cells, respiration, blood circulation, and the functioning of organs)
–instinctive acts (behavior of animals)
–trained actions (e.g., police dogs, and circus performances involving lions, elephants, horses, bears, tigers, dogs, seals, dolphins, etc.)
–learned activities like social manners and manual skills
–sensible actions (humans)
–intuitive actions (humans)
–intelligent actions based on free will (humans)
All the activities of the recipient can depend on information that has previously been conceptualized by the sender for the intended purpose. On the other hand, intelligent actions that do not derive from a sender are also possible.
A relevant theorem is the following:
Theorem 18: Information is able to cause the recipient to take some action (stimulate, initialize, or implement). This reactive functioning of information is valid both for inanimate systems (e.g., computers or an automatic car wash) and for living organisms (e.g., activities in cells, actions of animals, and activities of human beings).
We consider book B for the last time to illustrate one further level of information. Goethe once said, “Certain books seem to have been written not so much to enable one to learn something, but to show that the author knew something.” This reason for writing a book, which is of course not worth emulating, does, however, express something of fundamental importance: The sender has some purpose for the recipient. The purpose of a promotional slogan is that the manufacturing firm can have a good turnover for the year. In the New Testament, John mentions a completely different purpose for his information: “I write these things to you who believe in the name of the Son of God so that you may know that you have eternal life” (1 John 5:13). We conclude that some purpose is pursued whenever information is involved.
We now realize that any piece of information has a purpose, and have come to the last and highest level of information, namely apobetics (the teleological aspect, the question of the purpose; derived from the Greek apobeinon = result, success, conclusion). The term “apobetics” was introduced by the author in 1981 [G4] to conform to the titles of the other four levels. For every result on the side of the recipient there is a corresponding conceptual purpose, plan, or representation in the mind of the sender. The teleological aspect of information is the most important, because it concerns the premeditated purpose of the sender. Any piece of information involves the question: “Why does the sender communicate this information, and what result does he want to achieve for or in the recipient?” The following examples should elucidate this aspect:
–The male bird calls a mate by means of his song, or he establishes his territory.
–Computer programs are written with a purpose (e.g., solution of a set of equations, inversion of matrices, or to manipulate some system).
–The manufacturer of chocolate A uses a promotional slogan to urge the recipient to buy his brand.
–The Creator gave gregarious insects a pheromonal language for the purpose of communication, for example to identify intruders or indicate the location of a new source of food.
–Man was gifted with a natural language; this can be used for communicating with other people, and to formulate purposes.
–God gives us a purpose in life through the Bible; this is discussed more fully in Part 3 of this book.
Examples of questions concerning apobetics are:
–Has an unambiguous purpose been defined?
–What purpose is intended for the recipient?
–Can this purpose be recognized directly, or could it only be deduced indirectly?
–What purpose is achieved through the actions of the recipient?
–Does the result obtained in the recipient correspond to the purpose which the sender had in mind?
–Did the recipient find a purpose which the sender had not intended (e.g., the evaluation of historical documents could serve a purpose which was never thought of by the author)?
The sender’s intention can be achieved in various ways by the recipient:
–completely (doing exactly what the sender requested)
–partly
–not at all
–doing exactly the opposite
The response to an unambiguously formulated purpose (e.g., a computer program, personally given commands, or promotional material) could be any one of these actions. The purpose might, however, never have been stated, or might not even have been imagined by the sender (e.g., documents with trivial contents surviving from previous centuries can provide researchers with important clues never intended by the original author).
In this case also we can formulate significant empirical theorems:
Theorem 19: Every piece of information is intentional (the teleological aspect).9
Theorem 20: The teleological aspect of information is the most important level, since it comprises the intention of the sender. The four lower levels, taken together, are only a means for attaining this purpose (apobetics).
Note: The teleological aspect may often overlap and coincide with the pragmatic aspect to a large extent, but it is theoretically always possible to distinguish the two.
Theorem 21: The five aspects of information (statistics, syntax, semantics, pragmatics, and apobetics) are valid for both the sender and the recipient. The five levels are involved in a continuous interplay between the two.
Theorem 22: The separate aspects of information are interlinked in such a way that every lower level is a necessary prerequisite for the realization of the next one above it.
Whenever the teleological aspect is minimized or deliberately ignored, we should be aware of the fact that Theorem 19 is violated. Evolutionary doctrine deliberately denies any purposefulness that might be apparent. In the words of G.G. Simpson, an American zoologist, “Man is the result of a materialistic process having no purpose or intent; he represents the highest fortuitous organizational form of matter and energy.”
In this respect, one more theorem is required:
Theorem 23: There is no known natural law through which matter can give rise to information, neither is any physical process or material phenomenon known that can do this.
Synopsis: It should be clear that information is a multi-layered concept. Shannon’s theory embraces only a very small fraction of the real nature of information, as can easily be ascertained from the five levels discussed above. The contradictory statements and erroneous conclusions of many authors result from discussing information without being clear about which level is relevant, or whether that level can support the wide-ranging conclusions being drawn. It is, for example, not possible to answer questions about the origin of biological systems when one considers only the statistical level. Even impressive mathematical formulations will bring no clarification if they are restricted to the level of Shannon’s theory. Well-founded conclusions are only possible when the sender/recipient problem is treated fully at all five information levels.
All of the Theorems 1 to 23 formulated thus far, as well as Theorems 24 to 30, which will follow, are based on empirical reality. They may thus be regarded as natural laws, since they exhibit the characteristics of natural laws as explained in chapter 2, and they have been tested in real situations (compare Theorem N1 in paragraph 2.3). Any natural law can be rejected the moment a single counterexample is found, and this also holds for these information theorems. In the discussions following many talks by the author at colleges and universities, both at home and abroad, no researcher has been able to produce a single counterexample. In one case, somebody suggested that one of these theorems might be negated a few million years in the future, when a counterexample may be found. My answer was that this is possible, as it is for all natural laws; but even if one or more of the theorems could be nullified by a counterexample after a few million years, we would still have to accept them and live with them now.
The seven most important results are repeated once more:
These seven theorems can also be formulated as impossibility theorems, as has been shown in paragraph 2.5 for practically all laws of nature:
We still have to describe a domain of definition for all these theorems; this will be done in the next chapter.
Figure 14 may serve the purpose of ordering the proposed theorems. Three phenomena are represented hierarchically, namely matter, information, and life, with matter at the lowest level. All known natural laws belong here (e.g., conservation of energy, strength of materials, and electric charge). According to Theorem 1, information is not a property of matter, and thus requires a next higher level. All information theorems belong to this level. The highest level is that of life. Natural laws belonging to this level may be called life theorems. A fundamental theorem at this level was formulated by Louis Pasteur (1822–1895), and it has not yet been contradicted by any experiment: “Life can only come from life.” The following statements can be made about the three hierarchical levels shown in Figure 14:
Because of its philosophical bias, the evolutionary view regards both information and life itself as purely material phenomena, and the origin and nature of life are reduced to physical-chemical causes. In the words of Jean B. de Lamarck (1744–1829), “Life is merely a physical phenomenon. All manifestations of life are based on mechanical, physical, and chemical causes, being properties of organic matter” (Philosophie Zoologique, Paris, 1809, Vol. 1, p. 104 f). The German evolutionist Manfred Eigen expressed a similar view [E2, p. 149]: “The logic of life originates in physics and chemistry.” His pupil, Bernd-Olaf Küppers, paved the way for molecular Darwinism, but the present author has already responded to this materialistic view [G14, p. 90–92]. All such ideas have in common that biological facts are interwoven with subjective representations which cannot be justified scientifically. The information theorems formulated in this book should enable the reader to distinguish between truth and folly.
The code systems used for communication in the animal kingdom have not been “invented” by the animals themselves, but were created fully functional, according to Figure 24.