November 19, 2017

Dams and disease

Published as Chapter 7 of The Social and Environmental Effects of Large Dams: Volume 1. Overview. Wadebridge Ecological Centre, Worthyvale Manor Camelford, Cornwall PL32 9TT, UK, 1984. By Edward Goldsmith and Nicholas Hildyard.

Introduction

When a river is dammed and a large artificial lake is created, those forms of life which were adapted to the previous riverine ecosystem are likely to disappear. In their place, other species will emerge that are better adapted to the new environment. Some will thrive in the lake; others in the irrigation channels which it feeds; and still others in the new towns and cities spawned by the ‘development’ the dam brings. Not only will animal and bird life be affected, but also plants, fungi, protozoa, bacteria and other micro-organisms.

Many of those species play an integral part in the transmission of infectious diseases. It follows that as the composition of species in the new environment changes, so the pattern of disease will change also. Unfortunately, such change is generally for the worse. Indeed, in most cases, it has led to an upsurge in waterborne and other diseases.

Malaria

In spite of the efforts of the World Health Organisation (WHO), malaria remains one of the most widespread and lethal diseases in the world. At one time, WHO experts were confident that new pesticides would put an end to malaria once and for all by eradicating the mosquitoes which act as its vector. Today that much-vaunted malaria eradication programme is in tatters. Thousands of tons of pesticides have been sprayed – with disastrous ecological consequences – but to little avail. Every year, malaria still kills 1 million people. At any given moment, 160 million people – the equivalent of the entire population of Japan, Malaysia and the Philippines – suffer from the disease. [1]

In man, malaria is caused by four species of parasite, all belonging to the genus Plasmodium: P. malariae, P. vivax, P. ovale and P. falciparum, of which P. vivax and P. falciparum are the most common. The parasite has a complicated life-cycle. Although it reproduces sexually only within mosquitoes of the genus Anopheles, it must also pass through an ‘asexual’ phase within humans. When an infected mosquito bites a human, thousands of Plasmodium parasites are released into the blood. Those parasites then establish themselves in the human liver, where they incubate and eventually release their offspring into the bloodstream, where they invade the red corpuscles.

A mosquito biting a human after the parasite has incubated will inevitably take a meal of infected blood. The parasites then reproduce within the mosquito’s gut, where they form a cyst which finally bursts to release new parasites into the mosquito’s bloodstream. Inevitably, some of those parasites make their way to the mosquito’s salivary glands, and the cycle repeats itself when the mosquito bites another human.

The role played by the Plasmodium parasite and its mosquito ‘vector’ was not discovered until the end of the 19th century. Previously, many theories had been put forward to explain the disease – the most common being that it was caused by the stench and fumes from stagnant water. Indeed, the word malaria is derived from the Latin for ‘bad air’ (mal aria) and there is little question that the Romans associated the disease with swamps and marshlands. That association is entirely accurate: swamps, marshes and stagnant pools are indeed ideal breeding grounds for malaria’s mosquito vectors. [2]

Not surprisingly, such areas were generally avoided – and not just by the Romans. Writing in the mid-19th century, the historian Sir James Emerson Tennent, for example, tells us:

“Compared with Bengal and the Deccan Plains, the climate of Ceylon presents a striking superiority in mildness and exception from all the extremes of atmospheric disturbance: and except in particular localities, all of which are well known and avoided from being liable after the rains from malaria, or infected at particular seasons with agues and fever, a lengthened residence in the island may be completed without the slightest apprehension of prejudicial results.” [3]

In those parts of the world where malarial areas have been inhabited, a whole range of cultural strategies has been developed to keep the incidence of the disease to a minimum. Thus, the Montagnards of Vietnam build their houses on stilts, do not equip them with chimneys and keep their cows underneath. Since mosquitoes prefer to feed on cows rather than people, have an aversion to smoke and do not fly high off the ground, few Montagnards suffer from the disease. By contrast, the incidence of malaria is high amongst those immigrants who have come into the area from the coast and who have been housed by the government in modern, concrete buildings. [4]

Where people have lived for long periods in malarial areas, genetic adaptations to the disease have developed. Many Africans, for example, carry a particular haemoglobin variant, Haemoglobin S. Those who are homozygous for Haemoglobin S suffer from a disease known as sickle-cell anaemia, from which they often die. On the other hand, those who are heterozygous for Haemoglobin S have considerable resistance to malaria.

There are some 300 species of Anopheles mosquito, of which some 40 are thought to be important transmitters of malaria. [5] Each has its own breeding, feeding and day-time resting habits; its own range of flight and dispersal patterns; and its own human biting-rate. There are also many different species of the Plasmodium parasite. Like their mosquito vectors, the Plasmodium protozoa have different lifestyles and are favoured by different environmental factors, the most important of which are climate and temperature.

Thus P. falciparum, the parasite which is responsible for the most lethal form of malaria, requires a summer temperature above 19 degrees C in order to survive. Humidity is also important: if the mosquitoes are to live beyond the period required for the Plasmodium parasite to develop within their bodies – that is, until they reach the infective stage – then the relative humidity during the summer months must be maintained at an average of 52 percent.
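Those two climatic thresholds can be read as a simple predicate. The sketch below is only an illustrative simplification using the figures quoted in the text; the function name and structure are ours, not a standard epidemiological model:

```python
# Toy predicate for the climatic conditions quoted in the text for
# P. falciparum transmission. The thresholds (19 degrees C, 52 percent
# relative humidity) come from the text; the function is illustrative.
def falciparum_can_establish(summer_temp_c: float,
                             mean_summer_humidity_pct: float) -> bool:
    """True if both quoted thresholds are met: a summer temperature
    above 19 C and an average relative humidity of at least 52 percent."""
    return summer_temp_c > 19 and mean_summer_humidity_pct >= 52

print(falciparum_can_establish(25, 60))  # humid tropical lowland: True
print(falciparum_can_establish(17, 70))  # cool temperate summer: False
```

This is why, as the text goes on to note, the lethal form of the disease never gained more than a tenuous hold in temperate regions: the temperature condition alone rules them out.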

Because these conditions are rarely satisfied in temperate areas – where P. vivax is the parasite – malaria never had more than a tenuous hold in such regions and was easily controlled by such measures as marsh drainage. Even in sub-tropical climates, the right conditions for the disease to become established only prevail during the humid summer months. It is then that transmission occurs. Where ‘dry farming’ is practised or where crops are irrigated during the winter – and thus outside the transmission period – the mosquito vectors are presented with few niches in which they can breed. Malaria is thus likely to be absent or, at worst, its incidence will be minimal. [6]

The introduction of modern, perennial irrigation schemes has, however, greatly favoured both the incidence and the lethality of malaria. Not only have such projects created permanent and vastly extended habitats for the mosquito vectors of the disease, they have also created precisely those habitats favoured by the most efficient of the vectors. In Africa, water development projects have led to a proliferation of Anopheles gambiae and Anopheles funestus, the former breeding in flooded rice fields, the latter in drainage and irrigation canals. [7]

A. gambiae has the reputation of being the most efficient of all the malarial vectors. Not only does it bite man in preference to other animals but it also has a high infectivity rate and is capable of transmitting malaria even when its population density is extremely low (about one mosquito per 30 persons). [8] In South Asia, irrigation schemes have favoured the mosquito which acts as the vector for both P. vivax and P. falciparum.

Because perennial agriculture makes possible two crops a year, it correspondingly increases the period during which mosquitoes have habitats in which to breed. Inevitably, too, by increasing the land area under water, irrigation schemes also increase the total mosquito population – and hence the likelihood of infection. Moreover, the introduction of irrigated agriculture appears to bring changes in the biting habits of mosquitoes. As the human population increases and crops take over from livestock, so the mosquitoes switch from biting animals to biting humans. Thus, in those areas of the Kano plains where herding is still the principal means of livelihood, it was found that 70 percent of A. gambiae fed on cattle and only 30 percent on man. The reverse was true in the modern irrigated rice-growing areas.

Nor is the risk of infection reduced by the design of irrigation schemes. It is known, for instance, that mosquitoes rarely fly more than a kilometre or so beyond the canals and rice-fields in which they live and breed. Those living more than a kilometre from the canals will thus be beyond the mosquitoes’ reach. In the majority of large-scale irrigation schemes, however, the tendency has been to build the houses much closer to the irrigated fields. Where irrigation schemes have been built in settled areas, or where land has been flooded in order to build a dam, it has often proved difficult to persuade those whose existing houses are now next to the new irrigation schemes to move. [9]

The problem of malaria has been made still worse by the increased mobility which inevitably accompanies development. For malaria to be transmitted, both the vector and the parasite must be present in the same area at the same time. But whilst the parasite cannot exist without the vector, the opposite is not the case. Thus, many areas which are malaria-free have populations of the vector but not of the parasite. (Ironically, the vector may only have become established as a result of an irrigation scheme).

Small wonder, then, that the movement of people from infested areas (where the parasite is present) to non-infested areas (where the vector is present but not the parasite) has frequently led to the disease becoming established in areas where it was previously unknown, the parasite being brought to the vector in the blood of the migrants.

For those who contract the disease, the prospects of recovery are often diminished by their poor nutritional status. It has long been recognised that malaria is particularly lethal amongst people weakened by malnutrition. Indeed, the French have an old saying: ‘Le traitement du paludisme est dans la marmite’ (the cure for malaria lies in the cooking pot). [10]

That saying is particularly relevant where large-scale water projects are concerned. Inevitably, such projects tend to draw immigrants to the area, attracted by the prospect of jobs. More often than not those immigrants are impoverished and under-nourished – a point eloquently made by Dr. Robert Goodland, a senior ecologist at the World Bank. Describing the extent of malnutrition in the area around Brazil’s Tucurui Dam, for instance, he writes:

“In general, self-sufficient peasants or tribesmen whose varied diets of fresh foods and whose lifestyle as a whole was perfectly adapted to the conditions in which they lived, are replaced with the water-development scheme by an impoverished proletariat whose members are often employed as casual labour, who suffer from seasonal or permanent unemployment, and whose diet and lifestyle are usually unadapted to local conditions. Not surprisingly, the nutritional status of these people is particularly poor and this makes them particularly vulnerable to infectious diseases.” [11]

Sadly, the conditions at Tucurui are by no means exceptional. Perhaps it is not surprising, then, that wherever large-scale irrigation works have brought a rise in the incidence of malaria, the disease has increasingly proved a killer.

Unfortunately, once the conditions for malaria have been established, the disease is virtually impossible to control. The experience of the Middle East provides a case in point. Thus, successive malaria eradication campaigns have failed to stem malaria both in the Delta region of Egypt (particularly in the rice-growing areas) and in those areas of Iraq, Syria and Iran which are under perennial irrigation. [12]

Not least amongst the problems involved is the remarkable ability of the mosquito vectors to develop genetic resistance to the insecticides used to destroy them. In 1981, WHO reported that 51 species had developed resistance to one or more insecticides: 34 were resistant to DDT, 47 to dieldrin and 30 to both DDT and dieldrin. Resistance to organophosphate pesticides had been recorded in 10 species and resistance to carbamates in 4 species.
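Those figures are internally consistent: by inclusion-exclusion, the number of species resistant to DDT or dieldrin (or both) is 34 + 47 − 30 = 51, matching the overall total reported, and suggesting that every resistant species was resistant to at least one of those two compounds. A minimal sketch of the check (the variable names are ours, not WHO’s):

```python
# Inclusion-exclusion check on the 1981 WHO resistance figures
# quoted in the text (variable names are illustrative).
ddt_resistant = 34       # species resistant to DDT
dieldrin_resistant = 47  # species resistant to dieldrin
both = 30                # species resistant to both compounds

# Species resistant to DDT or dieldrin (or both):
either = ddt_resistant + dieldrin_resistant - both
print(either)  # 51 - matching the reported total of resistant species
```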

As a result, there has been a resurgence of malaria in many countries in which it was once thought to be practically eliminated. Discussing the history of WHO’s malaria eradication campaign, Georganne Chapin and Robert Wasserstrom of Columbia University, write:

“Initially, at least, it seemed that WHO’s campaign enjoyed almost unmitigated success. In India, for example, after 10 years of anti-malaria efforts (1961) only 50,000 cases of the disease were uncovered by government officials and a number of states had passed from attack to consolidation or maintenance. Similar triumphs were registered in Pakistan, Sri Lanka, Paraguay, Venezuela, Mexico and Central America, which devoted considerable resources to this task. Moreover, in 10 other countries Plasmodium infection was completely overcome. But within a short time the campaign began to falter. Between 1961 and 1966, disease rates in India increased 3-fold; by 1970, half a million people caught malaria each year – many in areas where health authorities had recently scored impressive victories. Much the same course of events took place in Sri Lanka, which in 1968 experienced an epidemic that left 1.5 million people stricken. Around the globe, in El Salvador, Nicaragua and Honduras (where anti-malaria measures began in the late 1960s) the incidence of disease in 1975 was 3 times higher than it had been a decade earlier before the programme had started.” [13]

With success apparently around the corner and with insecticide resistance necessitating the use of more expensive sprays (sometimes costing 10 times as much as the more common insecticides), many governments abandoned the eradication programme as soon as infection rates dropped. When the mosquitoes returned, they did so with a vengeance. For its part, WHO accepted defeat: “Faced with problems for which they had no solutions, in 1973 WHO officials reluctantly transformed the Malaria Eradication Division into the Division of Malaria and Other Parasitic Diseases”. [14]

Such defeat was predictable. It had always been recognised that for eradication to be successful, malaria would have to be eliminated worldwide. Unless that target was achieved, it would only be a question of time before areas which had been cleared became ‘re-infested’ by people moving in from areas where the disease was still endemic. Quite apart from the problem of resistance – a problem which was continually exacerbated by the use of agricultural chemicals to which the mosquitoes also developed resistance – worldwide eradication of malaria would have entailed the extermination of every single Anopheles mosquito.

The only other option – destroying the plasmodium parasite through drugs – was considered not only too expensive but also almost impossible to achieve. Not least among the problems were the rate at which the plasmodium parasite reproduces within the bloodstream of its victims – and the sheer logistical problems of screening whole populations for the disease.

Quite apart from causing numerous pesticide-related deaths and poisonings, the spraying programmes still being carried out by WHO and other agencies have wrought untold ecological damage. Many pesticides – particularly DDT and other chlorinated hydrocarbons – tend to accumulate in the fatty tissue of those animals which eat them. As one moves up the food chain, so predators accumulate more and more pesticide in their fat.

The process is well illustrated by the wholesale contamination of the aquatic food chain in California’s Clear Lake after it was sprayed with DDD, a close chemical cousin of DDT, in order to control gnats. Levels of the chemical in the water were as low as 0.02 parts per million: plankton and other microscopic organisms feeding in the lake, however, accumulated DDD residues at 4 parts per million; fish eating the plankton concentrated the pesticide still further – to levels as high as 2,000 parts per million; and birds feeding on the fish were found to be contaminated with 80,000 times the level of DDD present in the lake – with the result that thousands died. [15]
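The concentration factors implied by those Clear Lake figures can be worked out directly from the numbers quoted; the calculation below is ours, using only the figures given in the text:

```python
# Biomagnification factors implied by the Clear Lake DDD figures
# quoted in the text; all concentrations in parts per million (ppm).
water_ppm = 0.02

plankton_ppm = 4.0            # plankton and other microscopic organisms
fish_ppm = 2000.0             # upper figure quoted for plankton-eating fish
bird_ppm = 80000 * water_ppm  # birds: "80,000 times the level ... in the lake"

for organism, ppm in [("plankton", plankton_ppm),
                      ("fish", fish_ppm),
                      ("birds", bird_ppm)]:
    factor = ppm / water_ppm
    print(f"{organism}: {ppm} ppm ({factor:,.0f} times the level in the water)")
```

On those figures the birds carried about 1,600 ppm of DDD in their tissues – an 80,000-fold concentration over the water in which the food chain began.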

Such ‘bio-accumulation’ can have unforeseen and tragic consequences. Anne and Paul Ehrlich, two of America’s most famous biologists, tell of the macabre experience of a village in Borneo:

“Some years ago, large quantities of DDT were used by the World Health Organisation in a programme of mosquito control in Borneo. Soon the local people, spared a mosquito plague, began to suffer a plague of caterpillars, which devoured the thatched roofs of their houses, causing them to fall in. The habits of the caterpillars limited their exposure to DDT, but predatory wasps that had formerly controlled the caterpillars were devastated.

Further spraying was done indoors to get rid of houseflies. The local gecko lizards which previously had controlled the flies, continued to gobble their corpses – now full of DDT. As a result, the geckos were poisoned, and the dying geckos were caught and eaten by house cats. The cats received massive doses of DDT, which had been concentrated as it passed from fly to gecko to cat, and the cats died. This led to another plague, now of rats. They not only devoured the people’s food but also threatened them with yet another plague – this time the genuine article, bubonic plague. The government of Borneo became so concerned that cats were parachuted into the area in an attempt to restore the balance.” [16]

Many countries are now exploring the possibilities of using ‘biological control’ to keep malaria in check. There is a tragic irony in that development – for the best practitioners of biological control are precisely those traditional societies which have been most disrupted by the introduction of modern agriculture.

In Sri Lanka, for instance, traditional farmers knew well the value of the fish which lived in their paddy fields – Lula, Kawaiya, Hadaya and Ara being the most common species. Those fish not only ate mosquito larvae but could also survive in the dried-up ponds most favoured by the mosquitoes for breeding. Today, few such fish species survive: many have been killed by agricultural chemicals, whilst others have been eaten by Tilapia, a fish introduced from Africa by the government for the purposes of fish-farming. Unfortunately, Tilapia – whilst it eats mosquito larvae – cannot survive in small puddles during the dry season and does not venture into the paddy fields. [17] As a result, malaria has become a major problem in areas where it was previously under control.

Meanwhile, the introduction of perennial irrigation schemes continues apace throughout the world. Despite the serious malaria hazards associated with such schemes, the mistakes of the past seem doomed to be repeated. Some ascribe those past failures to ‘lack of vision’ on the part of the authorities: others – ourselves included – are less charitable. The truth is that the totally predictable increase in the incidence of malaria which results from introducing irrigation into the tropics and arid sub-tropics is considered an acceptable price to pay for the political and economic gains to be made from such schemes. Foreign exchange must take priority over health.

Schistosomiasis

In 1947, an estimated 114 million people suffered from schistosomiasis. [18] Today, 200 million people are affected, the equivalent of the entire population of the United States of America. [19] The disease is widespread in Africa, Japan, the Philippines, Thailand, Laos and other parts of Asia, the Middle East, the West Indies and parts of South America. In all, 71 countries are affected. [20]

The disease is caused by parasitic flatworms, known as ‘schistosomes’. Three common species infect man: S. haematobium, S. mansoni and S. japonicum. The larvae of the schistosomes – the miracidia – develop within the bodies of freshwater snails, each of the three species of schistosome requiring a different snail host. Within the snail, the miracidium is transformed into a second larval form (the sporocyst), which in turn produces thousands of larvae of a third type (the cercariae).

When people swim or wade in water contaminated by infected snails, the cercariae bore through their skin and enter their bloodstream. From there they move to the liver, where they mature in a few weeks and mate. The adult females of S. mansoni and S. japonicum produce eggs every day throughout their estimated 35-year life. S. haematobium, on the other hand, settles in the vesical vein, where it spends about eight weeks laying eggs. Since its life cycle is very much shorter, it is theoretically much easier to control. [21] The eggs leave the human body via urine or faeces. Once back in water, they hatch into miracidia, which must find and penetrate the body of a new snail host in order to survive. The cycle is then repeated.

The eggs of all three species tend to spread to various organs within the human body: indeed, they have been recovered from the brain, the spinal cord, the lungs, bladder, appendix, rectum, uterus, spleen and liver. The eggs occur in such quantities that they tend to damage neighbouring tissue – a process which is exacerbated by the toxic chemicals produced by the body in order to kill the worms. [22]

When S. haematobium develops in the bladder or genitals, the victim is seriously weakened and blood is passed in the urine. Where the disease is caused by S. mansoni or S. japonicum, it leads to diarrhoea and to enlargement of the lymph glands, the spleen and the liver. Cirrhosis of the liver may also develop. The form of the disease caused by S. japonicum is often fatal. [23]

The dramatic spread of schistosomiasis over the last 35 years is now recognised largely to be a result of large-scale water projects. Such projects provide ideal habitats for both fresh water snails and the schistosome parasite. Indeed, the connection is so well established that Gilbert White has claimed: “The invasion by schistosomiasis of irrigation schemes in arid lands is so common that there is no need to give examples. The non-invasion of schemes in a region where the disease exists is exceptional.” [24]

Asit Biswas is equally emphatic:

“The incidence and extension of schistosomiasis and other waterborne diseases can be directly related to the proliferation of irrigation schemes, the stabilisation of the aquatic biotope and subsequent ecological changes.” By contrast, “when agriculture depended primarily on seasonal rainfall, the relationship between snail host, schistosome parasite and human host was somewhat stabilised, and infection rates were low.” [25]

Under such an agricultural regime, it was principally during the rainy season – when cultivation took place – that the parasite had the opportunity to come into contact with humans. It was then that infections were at their height. During the dry season, on the other hand, little infection took place.

The building of large-scale reservoirs and perennial irrigation schemes changes all that. Not only is the snail’s habitat vastly extended, but also the conditions are created for much longer breeding periods. Thus the Bulinus and Biomphalaria snails which act as intermediary hosts during the schistosomiasis cycle tend to flourish in the habitat provided by those aquatic weeds which, as we have seen, invariably invade large man-made lakes: indeed in Lake Volta, submerged weeds – principally Ceratophyllum – harbour more Bulinus snails than any other plant communities. [26]

More generally, John Waterbury, discussing the rise of schistosomiasis after the building of the Aswan Dam, comments:

“The impact of perennial irrigation is threefold: it eliminates fallows during which the snail host would be killed off as canals dried out; it increases the amount of still or stagnant water and the amount of aquatic weeds to which the snails cling; and it increases the amount of time the fellahin (peasants) spend in the water.” [27]

Inevitably, infection rates rise, as do worm loads per man. Thus, in the case of Egypt, according to one study, the building of the Aswan Low Dam caused the schistosomiasis rate amongst the population in some areas to rise from 21 percent to 75 percent. Once the Aswan High Dam had been completed, the incidence of the disease increased still further – in some communities reaching an estimated infection rate of 100 percent. [28]

Elsewhere, the story is the same. Biswas, for example, cites a study undertaken in four selected areas three years after the introduction of perennial irrigation. In one area, the schistosomiasis rate had risen from 10 to 44 percent; in another, from 7 to 50 percent; in a third, from 11 to 64 percent; and in the fourth, from 2 to 75 percent. In Kenya, schistosomiasis now affects almost 100 percent of schoolchildren in those areas around Lake Victoria which have been irrigated. In the Transvaal, South Africa, the S. mansoni infection rate is 68.5 percent amongst indigenous people working on European farms and only 33.5 percent in the reserves. The reason, of course, is that only the former are irrigated. [29]

In the Sudan, the massive Gezira scheme, which now covers 900,000 hectares, has seen a veritable epidemic of the disease. Before 1925, when the scheme was started, schistosomiasis was practically unknown in the Blue Nile area. As soon as the scheme came into operation, the disease started spreading and continued to do so in spite of all the measures undertaken to control it. In 1958, the Gezira Medical Officer of Health claimed that the campaign to control the disease had been a complete success, an assertion which seems to have been accepted by all the authorities concerned. Indeed, in 1961, it was claimed that the disease was under complete control.

Fifteen years later, however, the infection rate of S. mansoni had increased to 70 percent in some villages, although infection with S. haematobium had declined. [30] A still more recent survey, undertaken in 1979, puts the general infection rate at 60-70 percent, with the rate amongst schoolchildren reaching over 90 percent. All in all, nearly 1.4 million people are affected. [31] It is difficult to imagine what economic gains can possibly justify inflicting the terrible miseries of this most pernicious of diseases on such an enormous body of people – particularly when so many of the victims are children.

Few doubt that the incidence of the disease is on the increase – or that it will spread still further throughout the world as new irrigation works are built. Even those areas where schistosomiasis is at present unknown may soon succumb. In that respect, the experience of South America is instructive. Thus, for many years, it was claimed that the highly acidic waters of Amazonia – in which the schistosomes’ snail vector cannot breed – would ensure that the region remained free from the disease. [32]

In fact, the experts have now been proved wrong. The construction of large-scale dams and the creation of vast reservoirs has dramatically altered the chemistry of the water in many areas. In the 12 years since Surinam’s Brokopondo dam was closed, for example, the once acid waters of its reservoir have been effectively neutralised. “Acidity can no longer prevent the proliferation of the schistosomiasis snails there,” says Robert Goodland. [33]

Elsewhere, Goodland points out that snails capable of acting as a vector for schistosomiasis have already been discovered in rivers near the site of the Itaipu dam on the Rio Paraná in Brazil. Although the disease is not endemic to the area, cases have been reported amongst immigrants seeking work at the dam site. Goodland warns:

“The creation of the reservoir could lead to a great increase in the snail and the influx of infected construction workers could infect the snail, thus completing the cycle. Once infected, the snails will be extremely difficult to disinfect or to remove . . . Human medication is difficult, dangerous and ineffective.” [34]

To combat the spread of the disease, Goodland recommends that great care should be taken with the hiring of workers. But how likely is it that the tens of thousands of migrant labourers, drawn to the dam in the search for employment, can be adequately screened? And what of their relatives and friends who may come to visit them, if only for a few days? Will they also be checked to see if they harbour the disease? It would seem more probable that the dam will go the way of so many others in the tropics and fall prey to schistosomiasis.

Perhaps it is not surprising, then, that Letitia Obeng of the United Nations Environment Programme has warned that the current incidence of schistosomiasis is only the “thin end of the wedge”. Indeed, in her opinion, the disease can (and presumably will) establish itself in those other areas “where climatic, ecological, social, cultural and economic conditions favour its establishment”. [35] It is a grim prospect.

Filariasis

Filariasis is caused by parasitic worms, the principal species being Wuchereria bancrofti, which is transmitted by several species of mosquito. The disease itself can take several forms, most notoriously ‘elephantiasis’. The arms, legs, genitals and breasts of elephantiasis victims swell to monstrous proportions, in some cases retaining as much as 40 pounds of fluid. Where the genitals are affected, men with the disease frequently have to carry them in wheelbarrows. There seems to be no remedy other than the surgical removal of the affected parts.

Over 250 million people – the equivalent of the entire population of the Soviet Union – now suffer from filariasis. [36] The incidence of the disease is increased drastically by large-scale irrigation schemes, since its mosquito vectors tend to breed in water bodies rich in organic matter – marshes, sewers and badly maintained drains, for example.

Onchocerciasis – River Blindness

River blindness is widespread in the river basins of Central and South America and tropical Africa. In the Volta River basin, 70,000 people are now blind as a result of the disease. Another million people have been weakened by its effects. In some valleys, over 30 percent of the population are now blind. Other valleys have been abandoned altogether because of the disease. [37] Many of those valleys are particularly fertile and their former inhabitants are now forced to eke out a living on crowded and less fertile land in other areas.

Judging by the increased rate at which affected areas are being abandoned, the incidence of the disease in the Volta region is still on the increase. Elsewhere, in North-East Ghana, 40 percent of the total land area has been abandoned as a result of river blindness. [38]

The disease is caused by a parasitic worm of the filaria group, Onchocerca volvulus, which is carried by blackflies of the family Simuliidae. The vector tends to breed in fast-flowing rivers. For that reason, it was generally thought that the damming of such rivers would actually reduce the incidence of the disease by reducing the vector’s breeding grounds. In fact, the spillways of large dams have proved excellent substitutes for the free-flowing rivers. Thus, blackflies have bred successfully at the Volta Dam, the Owen Falls Dam on the Nile and the Kainji Dam in Nigeria.

River blindness is extremely difficult to both treat and control. Indeed, the World Health Organisation’s attempts to eradicate the black fly have been singularly disappointing. In 1975, for instance, a control programme was initiated in the Volta River Basin. It was expected to cost $60 million and to last 20 years – during which time the disease was to be kept in check by the spraying of local waterways with insecticides.

Although the programme led to an initial reduction in black-fly populations, that reduction only proved temporary. Indeed, as Professor John Hunter of Michigan State University points out, it has been found that the black fly is capable of wind-assisted dispersal over great distances – possibly as far as 500 kilometres. [39] As a result, the area treated with insecticides is constantly invaded by flies from other areas. In such conditions, the programme – like others of its type – would appear doomed to failure.

At the same time, the constant spraying has caused considerable ecological damage. Non-target invertebrate organisms in the rivers of the Volta basin have diminished by 25-30 percent. [40] Ironically, there is at least one recorded instance of the spraying programme actually increasing the population of black-fly larvae. In that instance, the reason was that the insecticide had eliminated the major predators of Simulium.


Diseases introduced as an indirect result of water projects

The building of large-scale irrigation schemes and other water projects has frequently created conditions which favour the transmission of other diseases, not all of which are waterborne. As we have seen, hundreds of thousands of people may be moved to make way for new water projects. Usually, they are dumped – and dumped is literally the word – wherever a large enough, uninhabited area can be found. That such areas exist is usually a testament to their barrenness and inhospitality. They are also frequently areas where disease is rife – one good reason why they have previously been avoided.

Thus, in the case of the Kariba Dam in Zambia, those who were resettled – mainly Tonga tribesmen – were relocated in an area infested by tsetse fly. [41] As a result, the Tonga now suffer from a high incidence of sleeping sickness. Similarly, in Sri Lanka, many of those being resettled under the Mahaweli scheme are being moved from the highly fertile Dumbala Valley near Kandy, where the incidence of malaria is low, to the infertile and malaria-infested plains of the dry zone. Inevitably, many will succumb to both malaria and malnutrition.

To make matters worse, the authorities rarely lay on enough facilities to accommodate even a small proportion of those who are resettled – let alone those migrant workers who are inevitably attracted to newly irrigated areas in search of jobs. Linney and Harrison write,

“Populations relocated as the reservoir is filled are frequently not planned for. Thus, they end up in refugee-camp circumstances, which are notorious for causing epidemics. Many of the 50,000 Tonga tribesmen dislocated by the Kariba Dam caught influenza as they queued up for relief supplies.” [42]

Moreover, unless satisfactory sanitation is introduced, the local water supply quickly becomes contaminated with human faeces. Under such conditions, the transmission of roundworm and hookworm – which together affect almost a billion people around the world – will inevitably increase. Bad sanitation and poor quality water are also responsible for the transmission of dysentery, gastroenteritis, diarrhoea, hepatitis, cholera and guinea worm. Pneumonia, too, is likely to be rife where people are forced to live in overcrowded conditions, as are respiratory infections and tuberculosis. Finally, the predominance of single men amongst the migrant population has invariably led to prostitution and the rise of venereal diseases, particularly gonorrhea and syphilis.

There is a tragic irony in the spread of the above diseases – particularly those associated with poor quality water – since it is the proud boast of the dam-building industry that large-scale dams actually improve water supplies. At a time when 25 million people – 6 million of them children under five years old – die every year from diarrhoea alone, the need for better water supplies is clear enough. Indeed, 80 percent of all sickness and disease in the Third World is now attributed to inadequate water supplies or poor sanitation.

Yet, in seeking to provide water through the construction of giant reservoirs, the industry has only succeeded in exacerbating the problem. As Jane Stein points out in Environment , “Once the minimum requirements for human needs are satisfied, the quality of water is more important than the quantity.” [43] Where, after all, is the gain in being able to draw water at will, if that water is thoroughly contaminated? In that respect, those with an intermittent supply of water may be better off – a point well made by Professor Gilbert White. Thus he writes:

“A Lanjo housewife may withdraw abundant supplies of Nile water to use for bathing, but the water may also be so contaminated that her household chronically suffers intestinal diseases. By contrast, a family in the dry areas of Rajasthan may lament the small volume it can draw from the village standpipe during the dry season but enjoys the security of a fully protected supply.” [44]

It is a point, however, that has been lost in the euphoria for building larger and larger dams.


Efforts to combat disease

In terms of disease and consequent human suffering, the toll exacted by water development projects has been truly appalling. One might be forgiven, therefore, for thinking that every effort would be made by the authorities to ensure that the tragedies of the past would not be repeated in future schemes. Instead, however, one finds that the health hazards of water development projects continue to be either played down or (worse still) ignored.

Significantly, we have not come across even a suggestion that a particular dam project should be cancelled because of the diseases it is likely to inflict on the local population. Indeed, even those countries which have already been ravaged by dam-related epidemics appear blind to the lessons of their past.

A case in point is Ghana. Thus – despite the epidemic of schistosomiasis and river blindness which followed the building of the Volta Dam – plans have been announced by the Upper Region Agricultural Development Project to build a further 120 relatively small dams in its area by 1985, in addition to rehabilitating 100 older dams which have silted up. Those older dams were built as part of a previous programme, launched in the late 1950s, which saw the construction of 104 dams in just three years. [45]

That programme – undertaken without any accompanying measures to prevent water-borne diseases – led to a predictable escalation in the rate of Schistosoma haematobium infections, the incidence of the disease tripling in the area. Neither that experience, nor the experience at Lake Volta, however, appears to have influenced the planning of the new dam programme in the Upper Region. As John Hunter of Michigan State University writes in a study of the programme,

“To the cynic, perhaps sadly but not surprisingly, it would appear that history may be about to repeat itself, even with compounding effect. Once more, as in 1958-68, no integrated or other measures for disease prevention are included in the development plans. It is predictable, therefore, that there will be acute exacerbation of endemic schistosomiasis and of other water-related diseases. Once again, in a typical sectoral dichotomy, the Ministry of Agriculture will leave the Ministry of Health, with its most meagre resources, to cope on a curative basis, if in fact any special action at all will be possible.” [46]

The same lack of concern for preventative health measures, notes Hunter, is evident elsewhere in Ghana; at the Navrongo Dam, for instance, “no disease prevention methods are envisaged in the engineering and agronomic management plan.” Small wonder, perhaps, that Hunter concludes his study by asking: “Are acute and chronic illness, morbidity and death to be accepted as a necessary and inevitable cost of agricultural development?”

Ghana’s experience is by no means unique. In water project after water project, we find that no efforts are made to control disease until too late – until, in fact, the dams and associated irrigation works have been built and all the problems created. The point is well made by Alexis Coumbaras of the Faculty of Medicine of the University of Paris. Arguing that health measures must be incorporated into the planning of a water project from its very inception, he comments:

“Unfortunately one has to face the fact that this . . . is only done in exceptional cases and that the advice of those concerned with health is only requested once all the options have been closed. Even then it is rare that any money is set aside for dealing with health problems until, of course, health problems start appearing.” [47]

That lack of commitment on the part of governments makes it almost impossible to safeguard the health of those whose environment has been transformed by water development schemes. Even if the technology were available to ensure proper standards of health, the political will to implement it is lacking. In that respect, the politicians are cutting off their noses to spite their faces. Indeed, those writers concerned with the health effects of large water projects invariably stress the economic cost of disease.

Thus M. A. Amin of the London-Khartoum Bilharzia Project points to the “absenteeism or the reduction in the work capacity resulting from schistosomiasis as well as the cost of medical care and the cost of control.” [48] So too, W. H. Wright has shown that the actual monetary cost of schistosomiasis for Africa as a whole is approximately $212 million – a figure that does not include the cost of medical treatment and public health programmes. [49]

Meanwhile, Dr. M.A. Farooq argued – as early as 1963 – that the economic losses caused by schistosomiasis in the Philippines came to approximately $6.5 million a year – a figure which, in this instance, includes the cost of medical attention and the loss in production caused by the disease. In Egypt, Farooq estimated, schistosomiasis was causing an economic loss of $560 million a year. [50]

Given the appalling suffering caused by water-borne diseases, it is a sad sign of our times that such suffering must be translated into economic terms in order to attract the attention of politicians. Worse still, there is a distasteful tendency on the part of the authorities to blame the increased incidence of water-borne diseases on their victims. Thus it is pointed out how difficult peasants are to ‘educate’ and how they insist on maintaining their traditional patterns of behaviour even though that behaviour is no longer adaptive in their new environment. If only the local people would refrain from using rivers or reservoirs in the way they do, it is argued, they would not be vulnerable to so many water-borne diseases.

But is such talk realistic? John Waterbury writes,

“In theory, it would be relatively easy to eliminate bilharziasis. All that is required is that people neither urinate nor defecate in the irrigation canals, thus breaking the cycle between hosts. But theory could become reality only if the entire context of rural living in Egypt was completely transformed. When most villages do not yet have electricity, it is ludicrous to talk of modern plumbing.” [51]

More important still, the authorities appear blind to the crucial part played by rivers in the everyday lives of those who live on their banks. Children are constantly in the water; traders swim out to passing boats in order to sell their wares; and housewives rely on the river to wash the family’s clothes. In India, the Ganges plays an even more central role in social life: it is a holy river, and millions make the pilgrimage to Benares in order to submerge themselves in its (spiritually at least) cleansing waters. Can we really ask those people to abandon their ancient customs and transform their whole way of life simply to serve short-term political interests?


References

1. Anil Agarwal, Water and Sanitation for All? Earthscan Briefing Document No. 22, 10 November 1980, p.25.

2. J. L. Cloudsley-Thompson, Insects and History, Weidenfeld and Nicolson, London, 1977, p.83.

3. Sir J. Emerson Tennent, Ceylon (2 Vols), London, 1860.

4. George J. Armelagos and Alan McArdle, ‘The Role of Culture in the Control of Infectious Diseases’, The Ecologist Vol. 6, No. 5, 1976, p.179.

5. J. L. Cloudsley-Thompson, op.cit. 1977, p.84.

6. M. A. Farid, ‘Irrigation and Malaria in Arid Lands’, in E. Barton Worthington (Ed), Arid Land Irrigation in Developing Countries: Environmental Problems and Effects, Pergamon, Oxford, 1977, p.416.

7. M. N. Hall, J. A. Chandler, R. B. Highton, ‘A Comparison of Mosquito Populations in Irrigated and Non-irrigated areas of the Kano Plains, Nyanza Province, Kenya’, in E. Barton Worthington, (Ed), op.cit. 1977, p.314.

8. M. A. Farid, op.cit. 1977, p.417.

9. M.N. Hall, J. A. Chandler and R. B. Highton, op.cit. 1977, p.337.

10. M. Lariviere, ‘Santé Publique et Projets d’Irrigation: role de l’Assainissement et de l’hygiene du milieu’, in E. Barton Worthington (Ed), op.cit. 1977, p.396.

11. Robert Goodland, Environmental Assessment of the Tucurui Hydroelectric Project, Electronorte, Brasilia, 1978.

12. M. A. Farid, op.cit. 1977, p.416.

13. Georganne Chapin and Robert Wasserstrom, ‘Pesticide Use and Malaria Resurgence in Central America and India’, The Ecologist Vol. 13, No. 4 1983, p.117.

14. Ibid, p.117.

15. Frank Graham Jnr., Since Silent Spring, Hamish Hamilton, London, 1977.

16. Paul and Anne Ehrlich, Extinction, Victor Gollancz, London, 1982, pp.78-79.

17. Mudiyanse Tenakoon, quoted by Edward Goldsmith, ‘Traditional Agriculture in Sri Lanka’, The Ecologist Vol. 12, No. 5, 1982, p.213.

18. Anil Agarwal, op.cit. 1980, p.25.

19. Letitia E. Obeng, ‘Schistosomiasis – The Environmental Approach’ in E. Barton Worthington, (Ed), op.cit. 1977, p.403.

20. Letitia E. Obeng, ‘Starvation of Bilharzia? A Rural Development Dilemma’ in Carl Widstrand (Ed), The Social and Ecological Effects of Water Development in Developing Countries, Pergamon, Oxford, 1978, p.343.

21. Ibid, p.346.

22. Ibid, p.347.

23. Ibid, p.346.

24. Gilbert F. White, ‘The Main Effects and Problems of Irrigation’ in E. Barton Worthington, (Ed), op.cit. 1977, p.48.

25. Asit K. Biswas, ‘Environmental Implications of Water Development for Developing Countries’ in Carl Widstrand (Ed), op.cit. p.290.

26. A. W. A. Brown and J. O. Deom, ‘Health Aspects of Man-Made Lakes’ in William Ackermann et. al. (Eds), Man-Made Lakes: Their Problems and Environmental Effects, American Geophysical Union, Washington D.C. 1973, p.757.

27. John Waterbury, Hydropolitics of the Nile Valley, Syracuse University Press, 1979, p. 146.

28. J. N. Lanoix, ‘Relation between irrigation engineering and Bilharziasis’, Bulletin of the World Health Organization 18, (1011-1035), 1958. Quoted by Letitia Obeng, op.cit. 1978, p.345.

29. Asit K. Biswas, op.cit. 1978, p.291.

30. Nigel Pollard, ‘The Gezira Scheme: A Study on Failure,’ The Ecologist Vol. 11, No. l, Jan-Feb 1981, p.24.

31. M. A. Amin, ‘Schistosomiasis in the Gezira’, Aquatic Weed Management, University of Gezira, Wad Medani, 1979. Quoted by Nigel Pollard, op.cit. 1981, p.24.

32. E. Barton Worthington, personal communication to Nicholas Hildyard, 1979.

33. Robert Goodland, personal communication to Nicholas Hildyard, 1979.

34. Robert Goodland, Rio Parana Hydro-Electric Project: Ecological Impact Reconnaissance, International Engineering-Electroconsult, The Carey Arboretum of the New York Botanical Gardens (Environmental Protection Programme), 1972, p.43.

35. Letitia Obeng, op.cit. 1978, p.344.

36. Jane Stein, ‘Water for the Wealthy’, Environment, May 1977, p.9.

37. John Madeley, ‘The Onchocerciasis Control Programme’, Mazingira, Vol. 4, No. 3/4, p.55.

38. John M. Hunter, ‘Strategies for the Control of River Blindness’ in Melinda S. Meade (Ed), Conceptual and Methodological Issues in Medical Geography. Studies in Geography No. 15, University of North Carolina at Chapel Hill, 1980, p.39.

39. John M. Hunter, ‘Progress and Concerns in the World Health Organization Onchocerciasis control program in West Africa’, Social Science and Medicine, Vol. 150, p.267.

40. Ibid, p.271.

41. Thayer Scudder, ‘Ecological Bottlenecks and the Development of the Kariba Lake Basin’, in T. Farvar and J. Milton (Eds), The Careless Technology, Tom Stacey, London, 1973, p.206.

42. Warren Linney and Susan Harrison, Large Dams and the Developing World: Social and Environmental Costs and Benefits – A Look at Africa, Environment Liaison Centre, P.O.Box 72461, Nairobi, 1981, p.28.

43. Gilbert F. White, quoted by Jane Stein, ‘Water for the Wealthy’, Environment, May 1977, p.7.

44. Ibid, p.7.

45. John M. Hunter, ‘Past Explosion and Future Threat: Exacerbation of Red Water Diseases (Schistosomiasis haematobium) in the Upper Region of Ghana’, International Journal for Physical, Biological and Human Geosciences and their Application in Environmental Planning and Ecology, Vol. 5, No. 4, 1981, p.312.

46. Ibid, p.118.

47. Alexis Coumbaras, ‘Sante et Irrigation’, in E. Barton Worthington (Ed), op.cit. 1977, p.3-2.

48. Mutamad A. Amin, ‘Problems and Effects of Schistosomiasis in Irrigation Schemes in the Sudan’ in E. Barton Worthington (Ed), op.cit. 1977, p.410.

49. W. H. Wright, ‘Schistosomiasis is a World Problem’, Bulletin of the New York Academy of Medicine, 44 (3), 301-213, March 1968. Quoted by Letitia Obeng, op.cit. 1978, p.347.

50. M. A. Farooq, ‘A Possible Approach to the Evaluation of the Economic Burden imposed on a Community by Schistosomiasis’, Annals of Tropical Medicine and Parasitology, 57, 323-331, 1963. Quoted by Obeng, op.cit. 1978, p.347.

51. John Waterbury, op.cit. 1979, p.147.
