Physicians have struggled with the management of patient data for a long time. The struggle intensifies as we attempt to juggle increasingly large and complicated volumes of information over the course of a 24-hour day. As the number and acuity of patients increase in parallel, our ability to sift critical information and prioritize data becomes essential.
Alarms or alerts for abnormal parameters are of limited benefit and may even be counterproductive. The techniques of data display and information visualization hold great promise for revolutionizing how we manage this data overload. Other industries have already realized such benefits, and healthcare (especially in the hospital) has good reason to catch up. In the meantime, much work remains before we can say that management of clinical data in the hospital is optimal.
Physicians have struggled with the management of patient data for a long time. Such information used to be relatively simple: heart rate, respiratory rate, skin color and temperature, and so on. The limits of technology fundamentally constrained what physicians could observe and record.
As our ability to gather information became more sophisticated, so did the data we could acquire. Still, the physician remained the primary collector, assessor, and interpreter of tests and their results. Individual physicians would spin urine and examine the sediment, perform blood smears, and even examine tissue samples for pathology. This was a manageable task for the physician because the number of tests was small, and the interpretation of results was fairly straightforward.
Today, both the tests themselves and the ways we can interpret them are more numerous and more complicated. This presents a significant challenge for clinicians: How can we manage all of this information?
Too Much Data
The quantity of data available to the busy clinician is always increasing. This data explosion is happening for three reasons:
- Increased number of sophisticated tests. We test for more diseases, traits, and conditions than ever before. Troponin I, troponin T, and B-type natriuretic peptide, for example, are all in widespread use today but were not available 10 years ago. Advanced genetic testing will continue this trend;
- Increased archival capability. The cost of data storage continues to decrease, making it inexpensive to archive data that might have been purged in the past; and
- Increased sophistication of data delivery methods. Computers and the networks that connect them are faster than ever. This allows for efficient transfer of data from the archive to the user. It also allows the user to access the data from a variety of geographic locations, including an outpatient office or home.
Patient care in the ICU provides a perfect example of the volumes of data that we generate in the course of clinical care. Monitors capture moment-by-moment readings of heart rate, blood pressure, respirations, oxygen saturation, temperature, electrocardiographic tracings, and more. In addition to capturing the patient’s physiologic signals, we also measure the interventions we perform on patients. We record intravenous fluid and medication rates, artificial ventilation parameters, and so on. A decade ago, East estimated the number of information categories in the ICU to be in excess of 236.¹ Certainly that number has only increased.
Increasingly Complicated Data
As the number of tests has increased, interpretation of the results has become more complex. In many institutions, samples are obtained by highly qualified personnel rather than the primary physician. Depending on the test, the sample may be sent to a laboratory (sometimes in a different part of the country) where another individual performs the test. Finally, a trained observer reviews the results, may interpret them, and then records that interpretation, together with the objective data, in the patient’s medical record. These data are then available for the physician to review.