October 25, 2016

The super-informed society


Or “Many paths to nonsense: information theory applied to the living world”.

Will the proliferation of information technology really help us to solve the important issues we face today, or will it simply add to our already mounting problems?

Goldsmith argues (in 1982) that the impending development of the internet and resulting “information revolution” will not be all that it is claimed. He further argues that there is an essential difference between the kind of information that engineers and scientists have conceived of, and design for, and that found in the living world.

Published in The Ecologist Vol. 12 No. 3, May–June 1982.

It is at last clear that because of resource and pollution constraints, material progress has ceased to be a realistic goal for humanity.

Since the idea of ‘progress’ underlies that whole shaky edifice of beliefs with which we have been so deeply imbued since our earliest childhood and which we can refer to as the worldview or Paradigm of Industrialism, psychologically, we are utterly committed to it.

We thus have no alternative but to redefine ‘progress’ which means determining some way in which we can ‘advance’ other than by accumulating material goods.

Since the quasi-religious worldview of industrialism accentuates the quantitative at the expense of the qualitative and largely ignores the notion and implications of organisation, ‘progress’ is conceivable by us only in terms of the accumulation of some quantity and if this cannot be of consumer goods, it must be of something else that is equally quantifiable and hence, equally easily accumulated.

‘Information’ admirably fits the bill. It is in the field of information processing and communications that our scientists and technologists are making the most rapid progress and in which the most spectacular advances are yet to be expected.

What is more, the energy and resources required for constructing the equipment used for processing, emitting and receiving information are supposed to be so modest, and its cost so low, that its commercialisation on a massive scale is not seen as constrained by resource shortages, pollution emission standards, or the present and, it is hoped, short-lived decline in purchasing power.

In these conditions, it is not difficult to persuade ourselves that the growth of the ‘information’ industry is really serving some useful purpose – indeed that it may well provide the next stage in the progress of man in his quest for Paradise. As Hald, one of the high priests of this new Progress, tells us:

“We are moving from a society perceived as resource-constrained to one that is ‘information-rich’. We are entering a new era in which economic growth is derived from the exchange of information and the creation of knowledge rather than from the accelerated consumption of natural resources.” [1]

Hald, and he seems to be echoing a view that is very widely held, believes that the proliferation of low cost computers will “fundamentally transform how we think and perceive reality”:

“Individual computer capability tied into sophisticated satellite-, cable- and broadcast-based telecommunication systems, will permit millions of people to communicate simultaneously in vast interactive networks.” [1]

Among other things, he contends, this will not only make people more aware but will also enable them to think properly, “relating ideas and weaving patterns of understanding, developing a form of thinking that will be highly conceptual.” He goes so far as to suggest that “our children’s children may become the first genius generation”. According to Hald, governments will also be transformed:

“Individuals in open societies will be able to develop consensus networks through which an ongoing process of large-scale, many-to-many interaction could distil meaningful options for the future. Political leaders tied into such consensus networks would, by necessity, feel closer to the voters and more committed to the directions chosen.” [1]

Wishful thinking

That the information-rich society is technically feasible, I have no doubt. That its development might enable us to prolong for a few more decades many of the features of our moribund industrial society is also possible – though very much more doubtful. That it will create a race of supermen and make our government truly democratic, let alone solve any of our real problems, I regard as no more than the most naive wishful thinking.

To show exactly why I believe this to be so would mean covering a lot of ground. In this article, all I propose to do is to take the first logical step in this direction and consider what the term ‘information’ means when it is used in a precise and quantitative way. I refer to the concept of information as developed by Shannon and Weaver. [2] I shall try to show that though their theory may be useful in the field of communications, it has little relevance to the world of living things – contrary to what is generally assumed by many scientists.

This suggests that we should reconsider exactly what ‘information’ is, before we can talk seriously of mass-producing it as a means of solving the problems that our society faces today.


Shannon and Weaver’s concept of information

Shannon and Weaver’s theory of information was developed as long ago as 1948. Since then other theories of information have been proposed, but they seem to constitute little more than minor variations on the original theme. In any case, they do not appear to have earned any general acceptance among scientists. They are listed, together with their most salient features, by Rogers and Kincaid. [3]

Both Shannon and Weaver, when they developed their theory, were working for the Bell Telephone Company. Their chief concern was to determine how to maximise the amount of ‘information’ – not just the number of signs – that could be transmitted via a communications channel with limited capacity. They found it convenient, for their purposes, to define information in such a way that it could be measured in terms of Boltzmann’s mathematical formula for the measurement of entropy. [4]

Information is thereby equated with entropy, the difference being that whereas entropy measures the most probable arrangement of molecules in a particular energy state, information measures the most probable arrangement of signs in a message; in both cases probability is equated with randomness, in accordance with the Second Law of Thermodynamics, or the Entropy Law.
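
To make the parallel explicit (the formulas below are the standard ones; they are not spelled out in the article): Boltzmann writes the entropy of a gas as $S = k \ln W$, where $W$ is the number of equally probable arrangements of its molecules, while Shannon measures the information of a source as

$H = -\sum_i p_i \log_2 p_i$

bits per sign, where $p_i$ is the probability of the $i$-th sign being selected. The two expressions share the same logarithmic form, which is the sense in which ‘information’ is measured in terms of the entropy formula.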


Randomness the “ideal”?

It is difficult to understand the philosophy underlying this notion unless one realises that, for the communications engineer, randomness (and thus the absence of any organisation or constraints on the order in which the signs appear) is equated with the freedom he enjoys in choosing the message he wishes to send and hence the order in which the signs must appear, so as to satisfy his professional requirements. Randomness, or entropy, is thus for him the ‘ideal’, and must thereby, for his purposes, be associated with the highest information.

To quote Weaver,

“Information is highest when the probabilities of the various choices are as nearly equal as circumstances permit – when one has as much freedom as possible when making a choice, being driven as little as possible towards some certain choices which have more than their share of probability.” [2]

On the other hand, when a “situation is highly organised, it is not characterised by a large degree of randomness or of choice” and in these conditions, “the information (or the entropy) is low”.
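
In terms of the formula given above, Weaver’s point is that $H$ reaches its maximum value, $\log_2 N$ bits for a choice among $N$ signs, when all the probabilities $p_i$ are equal, and falls towards zero as one choice becomes a near-certainty.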


Linguistic constraints

The sort of constraints that Shannon and Weaver regard as reducing this freedom of choice (and hence the information content of a message) are linguistic constraints. Each language has a particular structure or organisation. In terms of this structure, one can predict with a measure of confidence that certain words are more or less likely to follow other words. Thus to quote Weaver,

“After the three words ‘in the event’ the probability for ‘that’ as the next word is fairly high, and for ‘elephant’ as the next word is very low.” [2]

These linguistic constraints reduce the information content of a message by forcing the sender to include signs in his message, not because he wants to but because they are imposed on him by the structure of the language in which the message is formulated.

To him such imposed signs are redundant. Different languages are seen as having a different built-in redundancy, that of the English language being about 50 percent. Thus one can say that the higher the organisation and hence the lower the entropy, the greater must be the constraints, the higher must be the redundancy, and the lower must be the information contained.
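
In their scheme this can be put in a simple formula (a restatement, for clarity, of their own definition): redundancy is one minus the ‘relative entropy’, the ratio of the actual entropy of the source to the maximum entropy it would have if every sign were equally free to occur,

redundancy $= 1 - H / H_{\max}$.

The figure of about 50 percent for English is their estimate of this quantity for ordinary written prose.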


Measuring information

The amount of information in a message is calculated in terms of the logarithm (base 2) of the number of choices. The result is formulated in terms of ‘bits’ (the term ‘bit’ was first suggested by John W. Tukey, as an abbreviation for ‘binary digit’). When numbers are expressed in the binary system there are only two digits, zero and one. These may be taken symbolically to represent any two alternate choices. In a situation in which there are only two choices, there is said to be one bit of information.

The greater the number of free unconstrained choices, the greater the amount of information. If there are 16 choices from among which we are equally free to choose, then such a situation is associated with four bits of information.
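
A minimal sketch of this arithmetic, in Python (illustrative only, not part of the original article):

    # Bits of information for a choice among N equally likely alternatives.
    from math import log2

    for n_choices in (2, 16, 1024):
        print(f"{n_choices} equally likely choices -> {log2(n_choices):g} bits")

Two choices yield one bit, sixteen yield four, and so on.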



The greater the freedom enjoyed by the sender in the selection of signs or messages for emission, the greater must be the improbability that a particular sign or message will be sent. To illustrate this, I shall assume that Shannon and Weaver’s ‘information’ takes ‘meaning’ into account. Thus, a message that told us that a horse called Green Crocodile would win a race in which there were 16 contestants of unknown breeding and with no previous form (i.e. all in theory having the same chance of winning) would communicate 4 bits of information.

If we knew something about their breeding and form, and, on this basis, could classify the horses in accordance with what appeared to be their chances of winning the race, the information communicated would be correspondingly reduced. If one horse were backed down to even money, on the theory that it had one chance out of two of winning the race, then a message informing us that it would win would communicate still less information, in fact no more than one bit – the same amount of information as it would communicate, were Green Crocodile to have but a single other contestant to deal with, rather than 15 others.
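
The arithmetic behind this example is the standard ‘surprisal’ measure, which the article does not spell out: a message announcing an outcome of probability $p$ conveys $-\log_2 p$ bits. For one of 16 equally fancied horses, $p = 1/16$ and $-\log_2(1/16) = 4$ bits; for an even-money favourite, $p = 1/2$ and $-\log_2(1/2) = 1$ bit.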

This is clearly a very sensible way of calculating the value of information from the point of view of communications. The greater the number of bits ascribed to a message, the more valuable the information must be. This is certainly so in the case cited, at any rate to both the bookmaker and the punter.



In reality it does not quite work this way since Shannon and Weaver are not concerned with the probability or improbability of a statement being true or false. This is the concern of the epistemologist, not of the communications engineer. The latter is not even preoccupied with the probability or improbability of a particular statement, nor even of a particular word, but only with that of particular signs being emitted – regardless of whether these signs make up intelligible words or whether such words make up intelligible sentences.

In other words, the information content of a message, for them, does not take into account its meaning. This both Shannon and Weaver fully admit. Thus Weaver writes,

“Information must not be confused with meaning.”

Likewise Shannon:

“The semantic aspects of communication are irrelevant to the engineering aspect.” [2]

This means, as again they freely admit, that their use of the term ‘information’ is very different from its normal use in the English language.


Reducing the information content

An essential feature of Shannon and Weaver’s theory is that, during the emission of a message, its information content is reduced. The reason is that, as a message is spelled out along a channel, the probability or improbability of specific signs occurring becomes easier to calculate. Linguistic organisation is seen to build up – as does ‘redundancy’ – which means that ‘entropy’ and ‘information’ are correspondingly reduced.
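
A minimal illustrative sketch of this effect, in Python, estimates the per-sign entropy of a scrap of text with and without taking the preceding character into account; the conditional figure comes out lower, which is the sense in which organisation and redundancy build up, and ‘information’ falls, as the message is spelled out:

    # Illustrative sketch only: toy estimate of how context reduces per-sign entropy.
    from collections import Counter
    from math import log2

    # A toy corpus; any stretch of English will do.
    text = "in the event that the elephant appears in the event of the day"

    def entropy(counts):
        """Shannon entropy, in bits, of a distribution given as raw counts."""
        total = sum(counts.values())
        return -sum((n / total) * log2(n / total) for n in counts.values())

    # Per-character entropy, ignoring context.
    h_plain = entropy(Counter(text))

    # Average per-character entropy given the preceding character.
    pair_counts = Counter(zip(text, text[1:]))
    prev_counts = Counter(text[:-1])
    h_given_prev = sum(
        (total / (len(text) - 1)) *
        entropy(Counter({b: n for (a, b), n in pair_counts.items() if a == prev}))
        for prev, total in prev_counts.items()
    )

    print(f"unconditioned: {h_plain:.2f} bits/char")
    print(f"given the previous char: {h_given_prev:.2f} bits/char")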



Another reason why the amount of information contained in a message must fall as it is spelled out is that communication channels are subject to ‘noise’ or ‘randomness’. Noise, of course, increases uncertainty or improbability. One might think that it would thereby lead to increased (rather than decreased) information. However, Shannon and Weaver distinguish between the undesirable type of uncertainty caused by noise and the desirable type of uncertainty which they identify with “freedom of choice”, and hence with information.

The information content of a message is thereby not equal to uncertainty but to ‘desirable’ uncertainty minus ‘undesirable’ uncertainty or noise.
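
Put in the standard notation of the literature (not used in the article itself), this subtraction is Shannon’s rate of actual transmission, $R = H(x) - H_y(x)$: the entropy of the source minus the ‘equivocation’, i.e. the uncertainty that remains about what was sent once the noisy received signal is known. In modern terms the same quantity is the mutual information between what was sent and what was received.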


The extension of the theory

The fact that the equations used to measure entropy and information are the same is to Weaver highly significant. He points out that for Eddington

“the law that entropy always increases – the second law of thermodynamics – holds, I think, the supreme position among the laws of Nature”. [2]

Thus, Weaver notes, when the engineer

“meets the concept of entropy in communications theory, he has a right to be rather excited – a right to suspect that one has hold of something that may turn out to be basic and important.” [2]

It is undoubtedly this feature of Shannon and Weaver’s concept of ‘information’ (and by the same token its compatibility with the paradigm of science and hence with that of industrialism) which makes it so attractive to the scientist and which, quite wrongly, seems to justify taking it out of its original context – that of communications engineering – and applying it to the world of living things. This, of course, has only served to confuse the issue and, at the same time, to delay the development of a theory that really explains what information is and what principles govern its use in the world of living things.



To begin with, one of the aspects of the Communications Theory of Information that makes it so attractive to the scientist, is its quantifiability. For quantification to be possible, however, as Apter points out, we must know the exact number of possible messages that could be transmitted at any one time. This may well be possible in the field of communications but not in the field of behaviour.

It is for this reason, as Brillouin points out, that

“the modest but, we think, significant applications of information theory to various psychological experiments have occurred in precisely those situations in which the set in question was strictly defined: a list of syllables to be memorised, associations to be formed, responses selected from, etc. There was therefore no difficulty in quantifying the associated ‘amounts of information’ and relating such amounts to certain aspects of performance”. [5]

But, as he points out, such situations are “banal”. We can add to this that they do not normally occur in the living world.


Entropy and disorder

A further consideration is that the concept of entropy itself does not apply to the world of living things, any more than does that of information. [4]

It is supposed to be equated with biospheric disorder but this means looking at the biosphere in purely energetic terms i.e. in terms of but one of its innumerable components – an error I have referred to elsewhere as energy-reductionism. [4]

An increase in entropy really means the homogenisation of temperature – and it is simplistic to equate such a process with the disintegration of a natural system. Even if we accepted that equation, and identified entropy with biospheric disorder, it is easy to show that the Entropy Law has not applied to the world of living things, which over the last 3 billion years, rather than becoming increasingly disorderly, has, on the contrary, moved in precisely the opposite direction – towards ever greater complexity and order. Those who still believe that the entropy law is the supreme law of the Universe try to explain away this embarrassing fact in a number of ways, but none of them is at all convincing. [4]

In reality, disorder in the biosphere and hence, if we like, entropy, rather than being a highly probable state, is, on the contrary, an extremely improbable one, as must be Shannon and Weaver’s ideal source of information.


The sender and the source

In any case, the notion of a passive source of information (whether it displays order or disorder) – from which messages are selected by an external agent – does not correspond to anything that exists in the world of living things.

The natural systems that make up the biosphere are dynamic not static, active not passive and, what is more, they are self-regulating not regulated from the outside (asystemically) by an external agent such as a communications engineer.

The source of information and the sender of the message in the world of living things are, in fact, part and parcel of the same self-regulating system.

If we integrate Shannon and Weaver’s sender of messages and the source of the messages into the same system, it must cease to display entropy. One of the basic features of entropy is randomness and hence non-purposiveness; yet the sender of messages acts purposefully since, as we are told, he selects for emission those messages that display the minimum redundancy, and hence the maximum information content.

What is more, if the system is to achieve its goal efficiently, then the information it contains must be organised in that way which most favours the achievement of that goal. This we can predict with confidence on the basis of our empirical knowledge of the way patterns of information (brains, genes, genomes, gene-pools, etc.) are organised in the world of living things.

