When you dial 911 for a medical emergency, the outcome may very well depend on the 411 — the quality of the information available about your condition and ways to treat it.
No aspect of human life has escaped the impact of the Information Age, and perhaps in no area of life is information more critical than in health and medicine. As computers have become available for all aspects of human endeavors, there is now a consensus that a systematic approach to health informatics — the acquisition, management, and use of information in health — can greatly enhance the quality and efficiency of medical care and the response to widespread public health emergencies.
Health and biomedical informatics encompass issues from the personal to the global, ranging from maintaining thorough medical records for individual patients to sharing data about disease outbreaks among governments and international health organizations. Maintaining a healthy population in the 21st century will require systems engineering approaches to redesign care practices and integrate local, regional, national, and global health informatics networks.
On the personal level, biomedical engineers envision a new system of distributed computing tools that will collect authorized medical data about people and store it securely within a network designed to help deliver quick and efficient care.
Basic medical informatics systems have been widely developed for maintaining patient records in doctors’ offices, clinics, and individual hospitals, and in many instances systems have been developed for sharing that information among multiple hospitals and agencies. But much remains to be done to make such information systems maximally useful, to ensure confidentiality, and to guard against the potential for misuse, for example by medical insurers or employers.
For one thing, medical records today are plagued by mixtures of old technologies (paper) with new ones (computers). And computerized records are often incompatible, using different programs for different kinds of data, even within a given hospital. Sharing information over regional, national, or global networks is further complicated by differences in computer systems and data recording rules. Future systems must be engineered for seamless sharing of data, with built-in guarantees of accurate updating and ways to verify a patient’s identity.
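As a minimal illustration of what seamless sharing requires, the sketch below normalizes records from two hypothetical, incompatible systems into one shared representation. All field names here are invented for illustration; real health-data exchange standards (such as HL7 FHIR) are far richer.

```python
# Hypothetical sketch: normalizing patient records from two incompatible
# local formats into one shared schema. Field names are invented.

def from_system_a(rec: dict) -> dict:
    # System A stores the name as one string and birth date as "MM/DD/YYYY".
    month, day, year = rec["dob"].split("/")
    return {"name": rec["patient_name"],
            "birth_date": f"{year}-{month}-{day}",   # ISO 8601
            "record_id": rec["id"]}

def from_system_b(rec: dict) -> dict:
    # System B splits the name into parts and already uses ISO dates.
    return {"name": f'{rec["given"]} {rec["family"]}',
            "birth_date": rec["birthDate"],
            "record_id": rec["mrn"]}

a = from_system_a({"patient_name": "Ada Smith", "dob": "07/04/1960", "id": "A17"})
b = from_system_b({"given": "Ada", "family": "Smith",
                   "birthDate": "1960-07-04", "mrn": "B99"})

# With a shared schema, records from different systems can be compared,
# which is one small piece of verifying a patient's identity.
print(a["name"] == b["name"] and a["birth_date"] == b["birth_date"])  # True
```

The point of the sketch is that the hard engineering work lies in agreeing on the target schema and its rules, not in the individual conversions.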
Keeping track of individual records is just part of the challenge, though. Another major goal is developing trusted systems that offer relevant decision support to clinicians and patients as well as archive medical research information. Doctors suffering from information overload need efficient electronic tools for finding the information relevant to treating specific patients, along with decision support systems that offer “just in time, just for me” advice at the point of care.
“There is a need,” writes Russ Altman of Stanford University, “to develop methods for representing biological knowledge so that computers can store, manipulate, retrieve, and make inferences about this information in standard ways.” [Altman p. 120]
Apart from collecting and maintaining information, health informatics should also be put to use in improving the quality of care through new technologies. Some of those technologies will gather medical data without a visit to the doctor, using wearable devices to monitor such things as pulse and temperature. Monitoring devices might even come in the form of tiny electronic sensors embedded in clothing or within the body.
Such devices are emerging from advances in microelectronic mechanical systems for health care delivery as wireless integrated microsystems, or WIMS. Tiny sensors containing wireless transmitter-receivers could provide constant monitoring of patients in hospitals or even at home. If standardized to be interoperable with electronic health records, WIMS could alert health professionals when a patient needs attention, or even trigger automatic release of drugs into the body when necessary. In effect, every hospital room could be turned into an ICU. Seamlessly integrating the input from such devices into a health informatics system raises the networking challenge to a new level.
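The alerting logic such a monitoring system might apply can be sketched as follows. The thresholds, record fields, and patient identifiers are all hypothetical, chosen only to illustrate the idea, and are not clinical guidance.

```python
# Hypothetical sketch: threshold-based alerting on a wireless vital-sign
# feed. A real system would use per-patient baselines and richer signals.

from dataclasses import dataclass

@dataclass
class VitalReading:
    patient_id: str
    pulse_bpm: float   # heart rate, beats per minute
    temp_c: float      # body temperature, degrees Celsius

# Illustrative "normal" ranges, not clinical values.
PULSE_RANGE = (50.0, 110.0)
TEMP_RANGE = (35.5, 38.0)

def alerts_for(reading: VitalReading) -> list:
    """Return alert messages for any out-of-range vital signs."""
    alerts = []
    lo, hi = PULSE_RANGE
    if not lo <= reading.pulse_bpm <= hi:
        alerts.append(f"{reading.patient_id}: pulse {reading.pulse_bpm} bpm out of range")
    lo, hi = TEMP_RANGE
    if not lo <= reading.temp_c <= hi:
        alerts.append(f"{reading.patient_id}: temperature {reading.temp_c} C out of range")
    return alerts

# A reading with elevated pulse and fever triggers two alerts.
print(alerts_for(VitalReading("pt-001", pulse_bpm=128.0, temp_c=39.2)))
```

In a deployed system this check would run continuously on the sensor stream, with alerts routed into the same health record the rest of the informatics network shares.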
On scales from local to global, a robust health informatics system would enable health professionals to detect, track, and mitigate both natural health emergencies and those caused by terrorism.
Biological and chemical warfare are not new to human history. From ancient times, warriors have tried to poison their enemies’ water. Today, of course, the threat of such attacks comes not only from military engagements in ongoing wars, but from terrorists capable of striking almost anywhere at any time. Protecting against such assaults will require an elaborate and sophisticated system for prompt and effective reaction.
Meeting that challenge is complicated by the diverse nature of the problem — terrorists have a vast arsenal of biological and chemical weapons from which to choose. Perhaps the most familiar of these threats are toxic chemicals. Poison gases, such as chlorine and phosgene, essentially choke people to death. Mustards burn and blister the skin, and nerve gases, which are actually liquids, kill in the same way that pesticides kill roaches, by paralysis.
As serious as chemical attacks can be, most experts believe the risk they pose pales in comparison with that of biological weapons. Of particular concern are potent biological agents and toxins, including anthrax bacteria, ricin, and botulinum neurotoxin.
Anthrax has received special attention, partly because of the deaths it caused in the U.S. in 2001, but also because its potential to produce mass death is so large. It’s not hard to imagine scenarios where airborne release of anthrax could infect hundreds of thousands of people. Antibiotics can be effective against anthrax bacteria if provided soon enough. But that window of opportunity is narrow; after the germs release their toxic chemicals, other defenses are needed.
Providing data to feed an informatics system in preparation for bio and chemical terror involves engineering challenges in three main categories. One is surveillance and detection — monitoring the air, water, soil, and food for early signs of an attack. Next is rapid diagnosis, requiring a system that can analyze and identify the agent of harm as well as track its location and spread within the population. Finally come countermeasures, powered by nimble operations that can quickly develop and mass-produce antidotes, vaccines, or other treatments to keep the effects of an attack as small as possible and track how effective the countermeasures are.
Efficient and economical monitoring of the environment to find such agents early is a major challenge, but efforts are underway to develop sensitive detectors. “Artificial noses,” for example, are computer chips that can sort out and identify signals from thousands of potentially deadly chemicals. These systems are still much less sensitive than the canine nose, however, and perfecting them remains an engineering challenge. Toxins or viruses might also be identified using biological detectors. Ultra-tiny biological “nanopore” devices can be engineered, for example, to send electrical signals when a dangerous molecule passes through the pore.
Yet another novel method would track not the attack molecule itself, but molecules produced by the body’s response to the invader. When exposed to bacteria, immune system cells known as neutrophils alter their internal chemistry. Profiling such changes can provide clues to the invader’s identity and suggest the best counterattack. Databases cataloging the cellular response to various threats should ultimately make it possible to identify biowarfare agents quickly with simple blood tests.
Nothing delivers as much potential for devastation as natural biology. From the bacterium that killed as much as half of Europe’s population in the Black Death of the 14th century to the 1918 Spanish flu pandemic that killed at least 20 million people, history has witnessed the power of disease to wipe out huge portions of the human population.
In the 21st century, the prospect remains real that flu — or some other viral threat, yet unknown — could tax the power of medical science to respond. Bird flu, transmitted by the virus strain known as H5N1, looms as a particularly clear and present danger.
A major goal of pandemic preparedness is a good early warning system, relying on worldwide surveillance to detect the onset of a spreading infectious disease. Some such systems are now in place, monitoring data on hospital visits and orders for drugs or lab tests. Sudden increases in these events can signal the initial stages of an outbreak.
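A minimal sketch of such an early warning signal, assuming a simple moving-baseline rule: flag a day's count when it exceeds the recent average by several standard deviations. The 14-day window and 3-sigma threshold are illustrative choices, not a validated surveillance algorithm.

```python
# Hypothetical sketch: flagging a sudden rise in daily hospital visits
# against a moving baseline. Window and threshold are illustrative.

import statistics

def is_aberrant(history, today, window=14, n_sigma=3.0):
    """Flag today's count if it exceeds the recent mean by n_sigma std devs."""
    baseline = history[-window:]
    mean = statistics.mean(baseline)
    sd = statistics.pstdev(baseline) or 1.0   # guard against flat data
    return today > mean + n_sigma * sd

# Two weeks of ordinary visit counts, then a candidate spike.
usual = [40, 42, 38, 41, 39, 43, 40, 44, 41, 39, 42, 40, 38, 41]
print(is_aberrant(usual, 70))  # True: sharp jump flagged
print(is_aberrant(usual, 45))  # False: ordinary day-to-day variation
```

Real syndromic surveillance systems use far more sophisticated statistics, but the core idea is the same: establish what normal looks like, then watch for departures.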
But certain events can mask trends in these statistics, requiring more sophisticated monitoring strategies, such as tracking the volume of hits to public Web sites that explain acute symptoms and linking those hits to geocodes, such as zip codes. Having an integrated national information technology infrastructure would help greatly. Closures of schools or businesses and quarantines may actually reduce hospital use in some cases, and people may even deliberately stay away from hospitals for fear of getting infected. On the other hand, rumors of disease may send many healthy people to hospitals for preventive treatment. In either case, the numbers being analyzed for pandemic trends could be skewed.
New approaches to analyzing the math can help — especially when the math describes the network of relationships among measures of health care use. In other words, monitoring not just individual streams of data, but relationships such as the ratio of one measurement to another, can provide a more sensitive measure of what’s going on. Those kinds of analyses can help make sure that a surge in health care use in a given city because of a temporary population influx (say, for the Olympics) is not mistaken for the beginning of an epidemic.
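The idea of monitoring a ratio rather than a single stream can be sketched as follows. A population influx raises both flu-like visits and total visits together, leaving their ratio stable, while an outbreak raises flu-like visits disproportionately. All counts, the baseline, and the tolerance here are invented for illustration.

```python
# Hypothetical sketch: watching the relationship between two data streams
# instead of either stream alone. Numbers are illustrative.

def ratio_shift(flu_visits, total_visits, baseline_ratio, tolerance=0.5):
    """True if the flu-like share of visits drifts beyond the tolerance,
    expressed as a fraction of the baseline ratio."""
    ratio = flu_visits / total_visits
    return abs(ratio - baseline_ratio) > tolerance * baseline_ratio

BASELINE = 0.05   # normally 5% of hospital visits are flu-like

# City population doubles for a big event: both streams scale up together,
# so the ratio stays near baseline and no alarm is raised.
print(ratio_shift(100, 2000, BASELINE))   # False: ratio is still 0.05

# Outbreak: flu-like visits triple while total visits barely move.
print(ratio_shift(150, 1100, BASELINE))   # True: ratio is roughly 0.14
```

This is why the Olympics-style surge described above need not trigger a false alarm: the raw counts jump, but the ratio between related measures does not.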
Similarly, mathematical methods can also help in devising the most effective medical response plans when a potential pandemic does begin. Strategies for combating pandemics range from restricting travel and closing schools to widespread quarantines, along with vaccinations or treatment with antiviral drugs.
The usefulness of these approaches depends on numerous variables — how infectious and how deadly the virus is, the availability of antiviral drugs and vaccines, and the degree of public compliance with quarantines or travel restrictions. Again, understanding the mathematics of networks will come into play, as response systems must take into account how people interact. Such models may have to consider the “small world” phenomenon, in which interpersonal connections are distributed in a way that assists rapid transmission of the virus through a population, just as people in distant parts of the world are linked by just a few intermediate friends.
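The “small world” effect can be illustrated with a toy model in the spirit of the Watts-Strogatz construction: a ring of local contacts plus a few random long-range links. All parameters below (population size, contact counts, number of shortcuts) are illustrative, and the spread rule is a deliberately crude worst case in which every contact of an infected person is infected at the next step.

```python
# Hypothetical sketch: a few long-range contacts dramatically shorten the
# time for an infection to cover a population of local-contact rings.

import random

def ring_graph(n, k=2):
    """Each person contacts the k neighbors on either side of a ring."""
    nbrs = {i: set() for i in range(n)}
    for i in range(n):
        for d in range(1, k + 1):
            nbrs[i].add((i + d) % n)
            nbrs[(i + d) % n].add(i)
    return nbrs

def add_shortcuts(nbrs, m, rng):
    """Add m random long-range contacts between people."""
    n = len(nbrs)
    for _ in range(m):
        a, b = rng.randrange(n), rng.randrange(n)
        if a != b:
            nbrs[a].add(b)
            nbrs[b].add(a)
    return nbrs

def steps_to_infect_all(nbrs, start=0):
    """Worst-case spread: each step, every contact of an infected person
    becomes infected. Assumes the contact network is connected."""
    infected, frontier, steps = {start}, {start}, 0
    while len(infected) < len(nbrs):
        frontier = {j for i in frontier for j in nbrs[i]} - infected
        infected |= frontier
        steps += 1
    return steps

rng = random.Random(1)
n = 200
print(steps_to_infect_all(ring_graph(n)))                          # slow: about n/4 steps
print(steps_to_infect_all(add_shortcuts(ring_graph(n), 20, rng)))  # far fewer steps
```

Even 20 shortcuts among 200 people collapse the spreading time, which is why response models that ignore long-range connections can badly underestimate how fast a virus moves through a population.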
Studies of these methods, now at an early stage, suggest that rapid deployment of vaccines and drugs is critical to containing a pandemic’s impact. Consequently new strategies for producing vaccines in large quantities must be devised, perhaps using faster cell culture methods rather than the traditional growing of viruses in fertilized eggs. A system will be required to acquire samples of the virus rapidly, to sequence it, and then quickly design medications and vaccines. The system needs to have technologies to enable rapid testing, accompanied by a system for accelerating the regulatory process. If there is an emergency viral outbreak that threatens widespread disease and death in days or weeks, regulatory approval that takes years would be self-defeating.
“It will be imperative to collect the most detailed data on the . . . characteristics of a new virus . . . and to analyze those data in real time to allow interventions to be tuned to match the virus the world faces,” write Neil Ferguson of Imperial College London and his collaborators. [Ferguson et al. p. 451]
The value of information systems to help protect public safety and advance the health care of individuals is unquestioned. But, with all these new databases and technologies comes an additional challenge: protecting against the danger of compromise or misuse of the information. In developing these technologies, steps also must be taken to make sure that the information itself is not at risk of sabotage, and that personal information is not inappropriately revealed.
R.B. Altman, “Informatics in the care of patients: Ten notable challenges,” Western Journal of Medicine 166 (February 1997), pp. 118-122.
Robert Booy et al., “Pandemic vaccines: Promises and pitfalls,” Medical Journal of Australia 185 (20 November 2006), S62-S65.
James C. Burnett et al., “The Evolving Field of Biodefence: Therapeutic Developments and Diagnostics,” Nature Reviews Drug Discovery 4 (April 2005), pp. 281-297.
Fabrice Carrat et al., “A ‘small-world-like’ model for comparing interventions aimed at preventing and controlling influenza pandemics,” BMC Medicine 4 (2006), p. 26.
Neil M. Ferguson et al., “Strategies for mitigating an influenza pandemic,” Nature 442 (July 27, 2006), pp. 448-452. DOI:10.1038/nature04795.
Timothy C. Germann et al., “Mitigation strategies for pandemic influenza in the United States,” Proceedings of the National Academy of Sciences USA 103 (April 11, 2006), pp. 5935-5940.
Margaret A. Hamburg, “Bioterrorism: Responding to an emerging threat,” Trends in Biotechnology 20 (July 2002), pp. 296-298.
Reinhold Haux, “Individualization, globalization and health – About sustainable information technologies and the aim of medical informatics,” International Journal of Medical Informatics 75 (2006), pp. 795–808.
Roland R. Regoes et al., “Emergence of Drug-Resistant Influenza Virus: Population Dynamical Considerations,” Science 312 (April 21, 2006), pp. 389-391. DOI: 10.1126/science.1122947
Alan J. Russell et al., “Using Biotechnology to Detect and Counteract Chemical Weapons,” The Bridge 33 (Winter 2003).