Health-care practitioners, with their treatment and advice, loom large in the subject of health, but their activity is in fact only one of the many factors that contribute to our level of well-being. Health-care practitioners have seldom shown much interest in the others.
The rampant killing diseases of the past had already stopped being major killers before effective medical interventions were introduced for either their cure or their prevention. You may find this difficult to believe since doctors and patients alike share a profound belief in the decisive role that medical practice plays in keeping us alive and kicking.
Why tuberculosis, typhoid fever, measles, scarlet fever, diphtheria, and whooping cough stopped being major killers is open to speculation: chlorinated water, improved nutrition, refrigeration and safer preservation of food, better housing, better sewage disposal, and the pasteurization of milk are some of the many factors that helped.
Knowledge gained through the advancement of science greatly improves disease prevention. Malaria, cholera, and AIDS may still kill people by the millions, but death rates from these diseases would be vastly higher if we did not know their causes or have some idea of how to prevent their spread.
Most well-fed and well-cared-for children survive infectious diseases; underfed and poorly cared-for children frequently do not. The wealth generated by the industrial revolution raised many people’s living standards. When economic conditions improve, couples tend to have fewer children, and in smaller families children receive better care. It was probably smaller families more than anything else that reduced mortality rates, for it would have been difficult to improve social conditions if birth rates had remained high.
It was social factors and not medical care that transformed our health statistics. I do not want to imply that medical intervention for these once killing diseases is without use, for while measles and other scourges from the past will probably no longer kill you, they may make you very sick.
Life expectancy is a useful statistical abstraction: the number of years that a person, on average, can expect to live. It has increased steadily over the last 140 years; oddly enough, the so-called “golden years of medicine” (1945-70) made little discernible difference to this rate of increase. But, while the outlook for a newborn has improved immensely, the outlook for the middle-aged, particularly the middle-aged male, has improved very little.
Why have the middle-aged fared so poorly? In developed countries, the eight leading causes of death in middle age are: heart disease, cancer, stroke, accidents, diseases of the respiratory system (inflammation, pneumonia, and bronchitis), type-2 diabetes, suicide, and cirrhosis of the liver. Many of these deaths are potentially preventable, for they are often the result of our lifestyles. Obesity, smoking, and alcohol are major contributors to most of these causes of death. While medical intervention reduces the death rates from some of these conditions, its effectiveness is puny compared to the results of not smoking, exercising, staying trim, and avoiding excessive alcohol intake. Doctors can do little in the face of an unhealthy lifestyle or destructive behaviour.
Perhaps medicine’s recurrent claim that it is about to eradicate cancer will one day come true, but until then stopping smoking would prevent more cancer deaths than anything doctors could possibly do. Despite the hype about the effectiveness of cancer treatment (and indeed some cancer cures are impressive), the overall success rate is dismal, and 40% of people who develop cancer still die of it.
Until recently, breast cancer was the No. 1 killer cancer in women. For several years, medicine has waged a campaign for its early detection. How useful has this campaign been? The screening tests for breast cancer are self-examination by the woman herself, regular breast examination by a doctor or nurse, and mammography. About 80% of breast cancers are discovered by self-examination, so teaching women this technique seemed a sensible way to go, but the results of a large study of its effectiveness were disappointing. The study followed over 250,000 women for five years; half were given intensive instruction in breast self-examination and the other half were not. The women in the self-examination group did indeed discover twice the number of breast lumps, but their death rate from breast cancer was no lower than that of the non-self-examiners. The American National Breast Cancer Center states about such self-examination: “The evidence is not sufficiently strong to justify continued health campaigns to encourage its use.” Instead, the Center suggests mammography.
What about mammography? Stephen Feig, a widely published American radiologist, claims it “reduces breast cancer deaths in women aged 40 years and older by at least 30-to-40%.” The assertiveness of this claim might lead you to believe it, but mammography’s effectiveness is controversial. Perhaps the least controversial thing about it is that it works better in women over 50, since the denser breast tissue of younger women makes the results difficult to read, producing many false-negative and false-positive results.
In the early 1980s, the Canadian National Breast Screening Study enrolled nearly 40,000 women aged 50 to 59 and randomly assigned them to one of two groups. The women of one group received annual mammography along with regular physical examination of the breasts, while the women in the other group received only regular breast examination. The 13-year follow-up report found no difference in the death rates from breast cancer in the two groups. Christopher Rembold, writing in the British Medical Journal, points out that, even for women in the 50-to-59 age group, it is necessary to screen 2,451 women for five years to prevent one breast cancer death. Several authors now question the value of routine mammograms.
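Rembold's figure can be made concrete with a back-of-the-envelope calculation. By the usual convention, the number needed to screen is the reciprocal of the absolute risk reduction, so a figure of 2,451 implies a very small absolute benefit; a minimal sketch (the variable names are mine):

```python
# Back-of-the-envelope: the "number needed to screen" (NNS) is,
# by convention, the reciprocal of the absolute risk reduction (ARR).
nns = 2451        # women screened for five years per death prevented (from the text)
arr = 1 / nns     # implied absolute risk reduction over those five years

print(f"ARR = {arr:.5f}, i.e. about {100 * arr:.2f} percentage points over five years")
```

In other words, five years of screening changes an individual woman's risk of dying of breast cancer by roughly four-hundredths of one per cent.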
Sadly, it appears that no form of regular breast examination is particularly effective at reducing breast cancer deaths. This may seem counter-intuitive, but breast cancer is a slow-growing cancer, often taking 10 years or so to become detectable. Cancers kill by metastasizing, but if breast cancer is going to metastasize it often does so before it is large enough to be detected. Non-metastasizing breast cancers don’t usually cause much harm, so it does not much matter how long they remain hidden. For everybody’s peace of mind, such cancers are sometimes better left undiagnosed.
The health-care industry, confronted with equal doubts about the effectiveness of both self-examination and mammography, was quick to drop its support for self-examination, but not for mammography. Rather, radiologists suggest the need to intensify its use. Some now recommend that women as young as 25 be screened, and that the frequency of screening be increased from every two years to twice yearly. The total cost of mammography in the U.S. is already about $3 billion. The health-care industry is not too keen on simple procedures, especially when there is a lucrative high-tech procedure that can be used instead. Campaigns for breast self-examination may go, but mammography seems here to stay. The Economist, in questioning the demands for more mammography in the face of its paltry success, sums it up: “The blunt answer is that breast cancer is one of many diseases about which debate has been distorted by politics.”
Whereas the results of breast screening have been disappointing, treatments for breast cancer have had some modest success. In Canada, it is estimated that improved treatments have reduced breast cancer deaths by about 15% over the last 30 years. Breast cancer is no longer the No. 1 cause of cancer deaths in women, though its demotion has nothing to do with its lowered death rates. Lung cancer in women increased by 300% in the same time period. While attention was focused on expensively trying to reduce breast cancer deaths by medical means, lung cancer, which is easily preventable by not smoking, crept up almost unnoticed to become women’s leading killer cancer. The rates of lung cancer deaths in women are now greater than they ever were for breast cancer.
It is half a century since Sir Austin Bradford Hill and Richard Doll (later Sir Richard) demonstrated incontestably that smoking causes lung cancer. Apart from reducing their own smoking, doctors showed little effective concern. Only in the last 15 years in North America and the last 10 years in Europe has the profession become vociferous about tobacco’s dangers. Doctors in the underdeveloped world still make little protest as tobacco companies, hampered by legislation in their home countries, shift their marketing activities to poorer countries. Increased tobacco use will cause more people to die in the developing nations than their doctors can ever hope to save.
Philip Morris, the makers of Marlboro, stung by the publicity about deaths from smoking, recently had a dramatic change of heart. Having adamantly denied that smoking is harmful, the company now admitted that tobacco kills, but argued that it does so usefully. It prepared a cost-benefit analysis of smoking for the Treasury of the Czech Republic. Balancing the costs of increased medical care for smokers and time lost through tobacco-related illness against income from tobacco taxes and the money saved to pension funds and housing costs through smokers’ early deaths, Philip Morris reported a positive balance of US$224 million.
Certainly the viability of pension plans in the developed world is predicated on the fact that many smokers will not be around to collect their pensions. If smokers suddenly stopped smoking, these plans would go bankrupt. Many pension plans are government-financed and, since governments seem more protective of their budgets than of their citizens, Philip Morris has a point.
Although most of us put good health high on our list of desirable assets, we often resist innovations that might help secure it. The fluoridation of water effectively reduces tooth decay, but some communities refuse to have their water supply “adulterated with chemicals.” Although wearing helmets greatly reduces motorcycle accident deaths, motorcyclists have objected to the laws that make their wearing mandatory. The residents of Toronto are constantly informed that its polluted air is responsible yearly for 1,000 premature deaths and 5,000 hospital admissions, yet they somehow prefer to spend resources on direct health-care instead of making the kinds of environmental changes that would more effectively protect their health.
Acquiring good health can be an inconvenience, so it is tempting to hand the responsibility to someone else; health-care practitioners are eager to jump in. While we can often do much more to preserve our health than doctors possibly can, we continue to believe that, without medical concern and attention, our lives will be abruptly curtailed.
Despite the convictions of clinical ecologists, the developed world, compared to times past, is a healthy place in which to live. Most of us (smokers excepted) can expect to survive comfortably into our 70s and 80s, often with little or no medical intervention. Whether our world will remain a healthy place, however, is a different matter. Countries, like people, have discretionary income. Should we fritter away our resources on often unnecessary health-care while the care of our environment, on which both our own health and the living world ultimately depend, gets short shrift?
As governments and the insurance industry pour more money into health-care, the health-care industry both jacks up its prices and discovers an ever-increasing array of expensive tests and treatments. No amount will ever be sufficient. Despite its insatiable appetite for scarce resources, the results of much modern medicine are disappointing. No matter how available treatment becomes, it cannot be expected to significantly improve our health indices, but money spent in other ways may well do so. The World Bank, in a recent report, says that money spent on housing and better social services does more to improve health than money spent on building hospitals and providing direct health-care.
James Le Fanu, a respected English medical journalist, in his latest book, The Rise and Fall of Modern Medicine, argues that the “Golden Age of Medicine” that started after World War II had come to an end by the mid-1970s. The easy or serendipitous medical discoveries had all been made, and chemists were running out of new substances to test. The stream of new drugs is drying up, and the few drugs in the pipeline will be exorbitantly expensive. The antibiotics and anti-malarial drugs are losing their magic and mankind’s oldest killers are reasserting their prowess. Le Fanu suggests that the “New Genetics”--the deciphering of the human genome--while keeping alive medicine’s promise, has in fact contributed little of any practical value.
Perhaps pharmaceutical companies are now investing so heavily in the uncertainty of genes because, with so few fresh pharmacological ideas around, they do not know what else to do with their profits. Are they entangling us all in costly but unusable chains of DNA? Time will tell, but whatever else, such future medical care will likely be too expensive for most people, or most nations, to afford.
It seems that a “first shall be last” situation prevails among the developed countries when it comes to health-care indices and spending on health-care. Japan spends one of the smallest proportions of its national wealth on health, but in the 1980s its life expectancy caught up with, and now comfortably surpasses, that of the rest of the developed world. The U.S. tops the health-spending list, but, when ranked with other countries in order of life expectancy, it stands only in 37th place--lagging behind such countries as Oman and Morocco. As for infant mortality rates, the Organization for Economic Co-operation and Development’s 1998 Health Data for the 29 developed countries shows the U.S. near the bottom, in 25th place.
We work on the assumption: “If some medical care is useful, more would be better.” So we get more of it. By 1998, Canada, for instance, was spending more on health-care than on education, and is scheduled to spend increasingly more. If we continue this way, we will cook the golden-egg-laying goose, for modern medicine is unsustainable without the affluence generated by a well-educated community.
While health-care spending has diminishing returns, it creates an escalating problem. Entrepreneurs, big and small, prefer to go where the money is, and the big money is now in health-care. Illness expands to use the health-care dollars available. Doctors and other health-care practitioners seldom sit and wait for work to come their way; they make their own. Any enthusiastic health-care practitioner or any purveyor of latter-day nostrums, from “Big Pharma” to backyard herbalists, can quickly convince the worried-well that they are sick or about to become so and stand in urgent need of treatment. Into the bargain, because treatment is a motherhood issue, governments and insurance companies can usually be manipulated into paying for it. Any attempt to limit such payment is labelled “rationing of health-care”--the death knell for any government that tries to do it.
When suppliers of treatment determine its demand, they create an economic situation in which demand soon becomes unlimited. The growth of health-care costs in the U.S. demonstrates the results of such perverse economics. The Office of the Actuary of the Health-Care Financing Administration reports that the annual per capita outlay for medical care in the U.S. increased 20-fold, from $204 in 1965 to $4,100 in 1995. If medical expenditures continue to increase at the same rate, the average annual per capita outlay in 2025 will be $82,000. In 1998, 44 million Americans could not afford health-care premiums, and each year another million are added to the list of the medically unprotected.
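The projection follows from simple compound growth: the 1965 and 1995 figures represent roughly a 20-fold rise over 30 years, or about 10.5 per cent a year, and another 30 years at the same rate multiplies the 1995 outlay by 20 again. A quick check (the variable names are my own):

```python
# Back-of-the-envelope check of the per-capita health-spending projection.
start, end = 204.0, 4100.0       # US per-capita outlay in 1965 and 1995 (from the text)
years = 30                       # interval between the two figures

growth = end / start                  # roughly 20-fold over 30 years
annual = growth ** (1 / years)        # implied annual growth factor, about 1.105
projection_2025 = end * growth        # another 30 years at the same rate

print(f"{growth:.1f}-fold over {years} years, about {100 * (annual - 1):.1f}% a year")
print(f"Projected per-capita outlay for 2025: ${projection_2025:,.0f}")
```

The result lands within a few dollars of the $82,000 the actuaries project.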
Daniel Callahan, the co-founder of the Hastings Center, the prestigious bioethics research institute in New York, in his book False Hopes, argues that America’s hypochondriacal quest for perfect health has become an unsustainable impediment to normality. The excellent becomes the enemy of the good. The vast costs of the health-care enterprise both deprive people of basic health-care and impose a heavy financial burden on the community at large. Expensive medical care is counterproductive. It does little to reduce sickness, but it certainly increases the number of have-nots.
For the health-care industry, nurses are a dead loss. They neither prescribe remunerative treatments nor order expensive investigations. To cut costs, high-tech health-care economizes on their salaries. Also, for good or bad, young women are now much less prepared to be at the beck and call of the sick; recruitment into nursing is down. Practising nurses get older and, since those who remain are managed to distraction by professional health-care administrators, they opt for early retirement. Our health-care services go from crisis to crisis. The officials who run them veer between trying to contain costs and defusing the anger of patients and families at the inadequacies of their services. Worse is yet to come. Half of all health-care spending goes on the health-care needs of people over 65, and the numbers of old fogeys like me are mounting. When the baby boomers slide into decrepitude, they will be lucky to have a nurse even speak to them, let alone get a bedpan.
When health-care is orchestrated by health-care practitioners, it quickly becomes unaffordable. We may choose to believe that health-care is our right, but, unless we make some changes in our health-care delivery systems, health-care won’t be there when most of us want it.
Science, with its astonishing discoveries and inventions, has contributed enormously to our prosperity and hence to human health. It is science that has given us X-rays and all the other technologies that have made our medical miracles possible. Medicine has accomplished some remarkable feats in the last few decades, but it has also remained distressingly static. While science in general has become greatly sophisticated in its understanding of uncertainty and complex events, medicine has boxed itself in with a simplistic pre-Copernican approach to the complex antecedents of illness. It is stuck with a practitioner-centred model unsuited either for understanding the multifaceted antecedents of health or for the effective provision of the community’s health-care needs. Just as the clergy once resisted the displacement of Earth from the centre of the universe (and of course the displacement of their own authority), the helping professions resist changes that displace them from the centre of human health. Good health requires more than good medicine.
It would be wonderful if our health-care and legal services were as beneficial to us as they are to the professionals who provide them. The great upsurge of interest in alternative methods may not add much to health, but it demonstrates people’s wide commitment to staying well when they are allowed to become active participants in the endeavour. Perhaps society’s central health-care question should be: how can practitioners be encouraged to collaborate with their communities and patients to become better decision-makers?
It is one thing to pay practitioners well for work that needs doing, and another to pay them for work that is unnecessary. The moguls of health-care notwithstanding, it probably is possible to provide health-care services that:
• are pleasant, humane, and effective;
• will not bankrupt the community;
• will not encourage practitioners to manufacture unnecessary illnesses; and
• will not demoralize those practitioners so that they go elsewhere.
Designing such services will require understanding, ingenuity, flexibility, and experimentation, combined with much public support. Health is not “a state of complete physical, mental, and social well-being.” It is a capacity, despite the vicissitudes of life, to find “a place in the sun” and a chance to add to the rich panoply of life. Not all our ills require expensive treatment; some do better without. By relinquishing our hypochondriacal determination to find a cure (or compensation) for every ill, we might succeed in designing health-care services that are accessible when we need them and sufficiently solvent even to allow us to incorporate some of the exciting medical innovations that may accrue from our exploration of the human genome.
(Andrew Malleson is a specialist in internal medicine, a psychiatrist with Toronto’s University Health Network, and a psychiatric consultant to the Canadian government’s Occupational Health and Safety Agency. This article was excerpted from his new book, Whiplash and Other Useful Illnesses, published by McGill-Queen’s University Press. The book is extensively footnoted, referenced, and indexed. This documentation has been omitted from the foregoing article, but Monitor readers can find it in abundance in the book.)