Science & technology

Plagued by cures

There is growing evidence that preventing diseases in infancy may be a mixed blessing. Can intervening in an illness sometimes be worse than doing nothing at all?


THE feather in the cap of 20th-century medicine is the prevention of infectious diseases, especially in childhood. Smallpox was eradicated 20 years ago. Thanks to extraordinary international efforts—including ceasefires in wars just so that vaccinations could be administered—polio is on the verge of going the same way. Measles, mumps and whooping cough can also be prevented with vaccines, and their incidence has declined dramatically in the past 50 years. Even some less tractable diseases, such as malaria, have started to bend to interventions. Covering more beds with nets has proved remarkably effective—perhaps as effective as vaccination would be—at reducing the incidence of this disease.

Yet the triumph is by no means complete. It is, of course, well known that preventing or treating an infectious disease can have profound effects on the pathogenic organism that causes it. The evolution of drug-resistant strains is the most famous example of such an effect. Others, equally disturbing, include the appearance of mutants able to evade the protection conferred by vaccines. But now a new worry has emerged. It appears that intervening in infections may have undesirable effects on the hosts—that is, on people—as well as on the pathogens themselves.

Hygienic diseases

The first possible effect is the replacement of one disease by another. As the incidence of childhood infections has fallen, a number of chronic ailments, such as diabetes and asthma, have become more frequent. In parts of the world where childhood diseases are still common, these chronic ailments are rare.

A direct link between these two phenomena is not yet proven. This may be because there isn't one. Doctors in rich countries have the experience to detect, and the money to treat, chronic disease. In poor countries, such diseases—if detected at all—are low on the list of priorities, and may thus go unreported. However, a number of studies suggest that this is not the whole explanation. Instead, childhood infections do indeed seem to reduce the probability of chronic disease—an idea known as the “hygiene hypothesis”.

Hints that this might be true emerged a few years ago, when it was noticed that children with many siblings were less likely than average to develop chronic ailments, even though their large number of brothers and sisters made them more likely to catch childhood infections. Since this observation was largely anecdotal, sceptics dismissed it. But the sceptics are now coming round. Several pieces of research published over the past year—the most recent this week—provide firmer evidence that something intriguing really is going on.

For example, in a study in Guinea-Bissau, researchers tested almost 300 children, of whom 133 had had measles, for reactions to seven different airborne “allergens”. They found that atopy—a fancy name for allergic reactions including asthma, eczema and hay-fever—was significantly more common among those children who had not had measles. Unfortunately, the study found that vaccination against measles did not give the same protection as getting the disease.

Two recent studies in Italy—one of 2,226 schoolchildren and one of 1,659 male college students—found that atopy is less prevalent among poorer people, among those from big families and among those who grew up in unsanitary surroundings. In addition, the study of the students found that exposure to hepatitis A decreased the probability of atopy.

On top of all this, several other studies published this year suggest that exposure to tuberculosis protects against atopy. The stronger the response to tests for exposure to TB, the weaker the response to tests for atopy. Whether vaccines against TB—unlike those against measles—also protect against atopy remains unresolved.

A final piece of evidence came this week, in a paper published in Archives of Disease in Childhood. David Phillips and his colleagues at the University of Southampton, in England, show that childhood infections reduce the probability of acquiring insulin-dependent diabetes. The researchers compared 58 diabetic children with 172 non-diabetic ones matched for age, sex, social class and paediatrician. They found that the diabetic children were significantly less likely to have had a severe respiratory infection during the first year of their lives.

It is not at all obvious why any of this should be so. One explanation is that the immune system needs to keep busy. Deprived of the enemies it has evolved to cope with, it responds to stimuli that would not normally irritate it. Diabetes is an “auto-immune” disease—one in which the immune system attacks the body's own tissues. And allergies, too, are caused by immunological malfunctions. A second possibility is that infectious diseases modify the immune response in some way—although precisely how remains a matter of conjecture.

Fortunately, diabetes and allergic reactions are more likely to be inconvenient than fatal. Measles, on the other hand, is still a ferocious killer. The measles epidemic which preceded the allergen study in Guinea-Bissau killed one in four of those children under the age of three who became infected. It is obviously better to live with asthma than die from measles.

Once bitten...

But this line of reasoning may not always hold. The second possible effect of intervening in a disease is that the intervention makes the disease worse in the long term, not better. A number of viral infections—chickenpox and polio, for example—are more dangerous to an adult than to an infant. If a disease is dangerous and no effective vaccine or treatment for it exists, it might be better to let a child catch it early on than to try to delay or prevent infection. Some researchers are starting to wonder whether this might even be true for non-viral diseases such as malaria.

When Robert Snow, a biologist at the Kenya Medical Research Institute, and his colleagues compared hospital admissions for severe malaria from three communities in Kenya and two in the Gambia, they made an alarming finding. The admission rate for children with severe disease was low in the areas where transmission of malaria was highest, and high in areas where transmission of malaria was more moderate. The implication is that being exposed to malaria early and often increases the body's resistance to it. Dr Snow and his colleagues are worried by this. It suggests that the use of bednets in places where malarial transmission is high may have a perverse effect. By transforming them into places where transmission is moderate, it could increase the number of deaths that the disease causes.

Understandably, this suggestion is intensely controversial. There have been many criticisms of the study's findings, and bednets have certainly proved their efficacy against malaria in the short term. But it is nonetheless true that their long-term effect is unknown.

Even if bednets did increase the number of deaths directly due to severe malaria, however, Chris Curtis, a biologist at the London School of Hygiene and Tropical Medicine, reckons that their use may still reduce the total number of deaths which are in any way—directly or indirectly—attributable to malaria. This is because infection with one disease often increases susceptibility to others. Indeed, such enhanced susceptibility may sometimes be of mutual benefit to the pathogens. Unfortunately, the synergy may kill the victim in circumstances where infection with either disease alone would have been manageable. Thus, by postponing the age at which a child first gets malaria until after the most vulnerable years of infancy, when other infections are rife, bednets may still reduce overall deaths, by making such lethal synergies rarer.

All this suggests that in curing disease there are no easy fixes: every silver bullet leaves a cloudy trail. Even after a plague is eradicated, it may not be sensible to stop vaccinating against it. Smallpox vaccinations, for instance, protect people from monkeypox, and now that such vaccination has ended, monkeypox is cropping up in people again. If, despite the worrying results from Guinea-Bissau, measles vaccine did turn out to protect against atopy, then vaccination against that disease might remain useful even if the virus itself eventually followed smallpox to the grave.

In the case of malaria, Louis Molineaux, a malariologist now retired from the World Health Organisation, argues that it would be foolish to charge into Africa distributing bednets to all and sundry. But, given their short-term benefits, it would be equally wrong to withhold them from anyone who wants or needs them. What really matters is organising bednet programmes carefully, and monitoring their users constantly. For the state of the host turns out to be as important in disease as the state of the parasite. Until this is better understood, attention to the effects of treatments is the only way to prevent Mother Nature from having the last laugh.

This article appeared in the Science & technology section of the print edition of November 22nd 1997, under the headline “Plagued by cures”.