October 23, 2016

Can pollution be controlled?


This is a discussion of the multitude of pollutants, chemical and radiological, that are being pumped into the environment in the name of progress and development, and of the failure of regulators to tackle the growing problem, since to do so would challenge the principle of never-ending economic growth that threatens the entire biosphere.

The article was originally published in The Ecologist Vol. 9 Nos. 8/9, October-December 1979, and later released in 1988 as Chapter 5 of The Great U-Turn.

At the Stockholm Environmental Conference, in answer to environmentalists’ demands for the banning of supersonic aircraft, Lord Zuckerman, ex-chief scientist to the British government, answered that

“If it were an ineluctable conclusion that the use of supersonic civil transport would irrevocably wreck the ozone layer which overlies our atmosphere, can we seriously imagine that we would not find ways of inhibiting the use of such aircraft, as our knowledge of their secondary effects, if any, become more apparent? What are we: ants, lemmings or rational human beings?” [1]

Lord Zuckerman appears to be living in a world of his own. The experience so far has been that very harmful pollutants have often been produced for a very long time before their harmfulness has even been noted, during which time they have entered into so many different manufacturing processes that they have become general environmental contaminants.

Even then, efforts to introduce any controls at all have been so feverishly opposed by industry and government, with the aid of the scientific experts whom they employ, that still more time has elapsed before they have met with any success, however limited. And limited they always have been. Indeed, with respect to the control of pollution we have behaved very much like the legendary lemmings. Let us consider a few examples.

Polychlorinated biphenyls (PCBs) are today regarded as known carcinogens, and ones that appear to be toxic at much lower levels than previously thought. They were first brought into use in 1929, and it was nearly 40 years before their environmental hazards were recognised. In the intervening years, 30,000 tons had been dispersed into the atmosphere, 60,000 tons into water systems, and 300,000 tons had been dumped. [2]

If DDT can be transformed into PCBs, which seems to be the case under the action of ultra-violet rays, then the quantities are much bigger still. What is certain is that PCBs are now a general contaminant of our environment. They are present in water, air, soil and sediment and tend to accumulate in the fatty tissues of animals. [3]

DDT is now also recognised as causing serious damage to biological organisms and is a suspected carcinogen. Since it first entered into use, the total amount produced is somewhere in the region of two million tons and, like PCBs, it is now a general environmental contaminant. Though its effects were revealed over 30 years ago with the publication of Rachel Carson’s Silent Spring, no real action was taken until 1972. Today, even though it is no longer allowed to be sold in the US, exports are unaffected and it continues to be produced at a rate of around 100,000 tons a year.

Another general contaminant of life on this planet is vinyl chloride monomer (VCM), which is now a recognised carcinogen. Prior to the discovery of its hazards, however, an estimated 100 million pounds a year were being lost to the environment during manufacture, and 2 percent of the US output of five billion pounds was being released “through deliberate dispersive use”, though the most hazardous of these uses, as a propellant in aerosol spray cans, was banned in 1974. [4]

The use of this pollutant is now subject to certain controls in many countries; its total production, however, has not been affected. In particular, it is still used as a plasticiser for wrapping materials and is known to leach in small but significant amounts into the foodstuffs they contain.

The carcinogenic effect of asbestos has also been known at least since the early 1930s. The time lag between the first reports of asbestos-related disease and control measures to reduce the risk was about 30 years. During this period, to quote Lawrence McGinty,

“The toll of death from cancers and lung disease caused by asbestos will never be counted. Some are buried with conveniently incorrect death certificates, others died from lung cancers indistinguishable from those caused by smoking cigarettes. Although asbestos is not the most potent toxic substance used industrially, its very pervasiveness means that the number of people exposed to it – workers, consumers, and those living in cities – is enormous. It would be quicker to count those who haven’t been exposed.” [5]

Other carcinogens have entered into an astonishing variety of processed foods. Red dye No. 2, or amaranth, an established carcinogen, is a case in point. In the US each year, about nineteen million dollars’ worth of the dye has been produced and added to $19-25 billion worth of food. According to the Food and Drug Administration (FDA), amaranth was used in

“ice cream, processed cheese, luncheon meat, frankfurters, fish fillets, shell fish, cornflakes, shredded wheat or wheat cereal, rice flakes or puffed rice, rolls (sweets, cinnamon, Bismark etc.), snack items (pretzels, corn chips, crackers etc.), cookies, pie crust, cake-mix, pickles, canned peaches, citrus juice and other canned fruit juices, other canned fruits and fruit cocktails, salad dressings, jelly, pudding mixes, syrup, jam, candy bars, vinegar and cola drinks.” [6]

Americans are said to have ingested about five hundred tons of it a year. The FDA’s attempts to ban it were delayed for 15 years. It has now been banned in the US but is still in general use in other countries.

Hexachlorobenzene (HCB) has been widely used as a fungicide for seed protection. World production is thought to be around four million pounds a year. It was found to be highly toxic as early as 1955, when HCB-treated seed grain in Turkey, intended for sowing, was used instead for bread production. Five thousand people were affected by eating the contaminated loaves, and between 250 and 500 died.

WHO has shown that children under the age of two taking in HCB via their mother’s milk suffered a 90 percent mortality rate. In the US, HCB is a trace contaminant of human milk, and traces are also to be found in other foodstuffs, including butter. So far, efforts to ban this substance have failed. [7]

Hexachlorophane is also a highly toxic chemical. In the summer of 1972, thirty-nine infants in a French hospital died after being rubbed with a baby powder containing 6 percent hexachlorophane. It was banned by the FDA in January 1972, but only after being used for thirty years in a host of non-prescription products, including 400 categories of deodorants, soaps, shampoos, toothpastes, cleansers and cosmetics, involving thousands of brand names and hundreds of millions of dollars in retail sales. [8]

Chloroform was shown to cause liver cancer in small animals over 30 years ago, but it is still used in cough mixtures, mouthwashes and toothpastes. It is also used as a preservative. In fact it is now in such general use that when it was suggested to a UK expert that this substance might be banned he answered, “it would be like trying to get rid of alcohol – there is a little bit everywhere.” [9]

It is important to realise that I have named but a handful of the four million chemicals which, according to the OECD, we have introduced into our environment; 563,000 of these are thought to be in common use and a hundred are produced in excess of 50,000 tonnes a year. [10] It is also important to realise that very few of these chemicals have been properly tested – a question I shall examine in greater detail further on in this chapter.

It should thereby be fairly obvious that we live in a highly contaminated environment. When we consider that possibly 1,000 new chemicals are introduced every year (some say 3,000), and that the quantities of the existing ones generated by our industrial activities continue to increase with the growing world economy, our environment is clearly becoming more highly contaminated every year. This explains the current and growing epidemic of pollution-induced diseases, in particular cancer.

What is more, it is equally obvious that our efforts to control pollution are very ineffective – that we are, in fact, behaving much more like lemmings than like Lord Zuckerman’s ‘rational human beings’. Why should this be so?

The excuses

When the pollution caused by a particular activity or set of activities is pointed out to the polluters or to the authorities and it is suggested that something be done about it, the answer is nearly always the same. We are told that this is not possible until further scientific research is undertaken in order to obtain ‘the hard scientific evidence’ required to determine the exact effects of the pollutants on living things.

Consider, in this respect, the efforts made by Chemie Grünenthal (the firm responsible for making and distributing thalidomide) to avoid the banning of this product. Among other things, a famous expert, Professor Erich Blechschmidt, Director of the Institute of Anatomy at Göttingen University, was prevailed upon to state in a court of law that

“so long as there is no complete certainty about how the thalidomide might execute its effect on any embryo, theories about the drug’s detrimental quality are premature and represent no more than pure speculation. Any binding thesis about a causal link between thalidomide and deformities does not yet exist.” [11]

Consider, too, the efforts made to control the pollution of the Mediterranean. [12] The United Nations Environmental Programme sponsored a series of international meetings to try to reach agreement on actions required to prevent this sea from becoming a lifeless waste but so far these have been in vain. The excuse for inaction is, as usual, ignorance as to the exact nature of the pollutants which are causing all the damage.

“We do not have enough evidence yet,” said Dr. Keckes, the scientist in charge of the ‘scientific assessments’ of the state of this highly polluted sea. “The rate of implementation of the Treaty will depend on how fast we can define the standards scientifically.” [13]

Consider yet another example. The National Cancer Institute (NCI) has demonstrated the carcinogenicity of trichlorethylene, which is used as an industrial solvent, particularly for degreasing machine parts, and in the food industry for decaffeinating coffee; it has also been used by solvent abusers, with fatal results.

In the UK, however, the Health and Safety Executive (HSE) refuses to reduce existing safety levels, which are at present 100 ppm. One reason is that the NCI tests were on animals and not on humans. This is a weak excuse, since substances which are carcinogenic to one form of life tend to be carcinogenic to others – the genetic information in the nucleus of any cell affected by a carcinogen being carried in precisely the same medium, DNA. Tighter controls, they insist, will have to await the results of surveys of workers exposed to this chemical.

What is more, a spokesperson has admitted that the government Employment Medical Advisory Service has no immediate plans for such a survey. [14] Indeed whenever we are told that no scientific evidence has been found to incriminate a particular chemical, the chances are that no one has bothered to look for the evidence. [15]

The question we must ask is how governments can justify allowing several million chemicals, to which organisms and ecosystems have never been exposed during the course of their evolution, to be released into our environment without such tests being carried out. Greater irresponsibility is hard to imagine. But would carrying out such tests serve any real purpose? Would they yield serious information that could really be of use in determining an effective strategy for controlling offending chemicals? As I shall now attempt to show, the answer is undoubtedly no.

The theoretical basis of pollution control is provided by the principle that dangerous chemicals are only dangerous when used at sufficiently high levels, and that there must therefore be a level below which concentrations of the chemical are biologically harmless. If this is so, then it suffices to ensure that these levels are never exceeded, so that no biological damage occurs.

The more we know about the biological effect of chemicals, however, the more it becomes apparent that this is simply not true. Serious efforts have been made to establish safe levels of different pollutants but these studies have always proved to be in vain.
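The assumption at issue here can be put in schematic terms. As an illustrative sketch (the symbols are generic and not drawn from any particular study): a threshold model supposes some dose below which risk is zero, while a no-threshold model, which the evidence on radiation and asbestos supports, assigns some risk to every dose, however small.

```latex
% Threshold model: no risk below some dose d_0
R_{\mathrm{threshold}}(d) =
\begin{cases}
  0               & \text{if } d < d_0 \\
  \beta\,(d - d_0) & \text{if } d \ge d_0
\end{cases}

% Linear no-threshold model: risk proportional to dose,
% hence positive for every dose d > 0 -- no "safe" level exists
R_{\mathrm{linear}}(d) = \beta d
```

On the first model, a permissible level set below the threshold is scientifically defensible; on the second, any permissible level merely fixes the amount of harm to be tolerated.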

This is the case, for instance, with radiation. It is now generally accepted that any increase in radiation above the levels to which we have been exposed during the course of our evolution must be reflected in some biological damage. The same seems to be the case with asbestos. The US National Institute for Occupational Safety and Health (NIOSH) has stated quite explicitly that

“excessive cancer risks have been demonstrated at all fibre concentrations studied to date. Evaluation of all human data available provides no evidence for a threshold or for a “safe” level of asbestos exposure.” [16]

In general this is true for all carcinogens, a principle that is accepted by the Department of Health, Education and Welfare (HEW) in the US. As its former secretary, Arthur Flemming, has said, “Scientifically there is no way to determine a safe level for a substance known to produce cancer in animals.”

It is this principle that provides the rationale for the famous Delaney Clause which makes it illegal in the US to add any chemical to foodstuffs that can be shown to be carcinogenic, even in very small amounts. Since the Delaney Clause was passed no significant progress has been made in our ability to determine a safe level of a cancer-causing chemical. On the contrary, according to Anita Johnson, “recent evidence suggests that estimating a safe dose is more difficult than was previously thought.” [17]

For instance, the FDA conducted a massive ‘megamouse’ study, at a cost of more than $5 million, to determine whether very low doses of a known carcinogen over a long period were in any way safer than high doses, as is generally maintained by the chemical industry. Some 24,000 animals were used in this study, and the results showed that low doses were not safe at all: liver cancers were produced at the lowest dose as well as at the highest.

If this is so, then the acceptable levels fixed for potentially dangerous chemicals released into our environment have no scientific basis whatsoever. In fact, it is very easy to show that they are simply the minimum levels that can be achieved without compromising economic priorities.

Thus the WHO standard of 0.02 to 0.05 ppm mercury in food

“is simply the practical residue limit, the concentration of mercury expected in the diet from natural background and environmental contamination.” [18]

In Sweden, the maximum permissible level was maintained at 1.00 ppm for a very long time. The reason is that if the limit of 0.5 ppm had been adopted in the late 1960s as was then proposed, it would have become necessary to close down more than 45 percent of Sweden’s inland fisheries. [19]

The acceptable level of lead in drinking water is simply set above that which tends to be present today. When WHO recently raised it from 50 microgrammes to 100 microgrammes per litre, this was not the result of a sudden discovery that man was less sensitive to lead poisoning than was previously thought, but of the fact that few water authorities could provide water meeting the lower standard.

Indeed, the Cox report, drawn up with the aid of experts from twenty-four countries, established that toxic effects are noted above fifty microgrammes per litre – while Dr Kehoe who, according to Professor Bryce-Smith, “has probably carried out more studies on lead than almost any other authority living”, argued very strongly that the level should be set as low as twenty microgrammes per litre. [20]

The level fixed by the FDA for aflatoxins, highly carcinogenic substances produced by certain moulds on improperly dried crops such as peanuts and cereals, is 20 ppb. Yet we know that aflatoxins can cause cancer in animals when fed at levels of 45 ppb to fish and 1 ppb to rats, while at 15 ppb, 100 percent of rats get cancer. As Anita Johnson points out, these very high tolerances are an indication of the commercial pressures felt daily at the FDA.

In the same way, the level set by the FDA for PCBs is 5 ppm in fish and poultry in spite of the fact that PCBs are so toxic that they have been shown to suppress reproduction in animals at a dose of 2.5 ppm. To quote Anita Johnson again,

“The FDA appears to have chosen 5 ppm simply because it would permit the vast majority of PCB-contaminated products to be marketed as usual.”

Efforts by various government agencies to reduce permissible levels are consistently being thwarted by commercial pressures. The Department of Labor, for instance, was expected to announce a reduction in the acceptable level for occupational exposure to lead in the air to 50 microgrammes per cubic metre. This would have reduced the existing level fourfold.

The US Regulatory Analysis and Review Group opposed this on the grounds that the cost to industry of maintaining these levels would be in excess of $1 billion. New levels for exposure to benzene proposed by OSHA were also successfully quashed in the Federal Appeal Court in New Orleans on the grounds that the agency had failed to demonstrate “a reasonable relationship” between anticipated benefits and costs. [21]

