Resolving the Productivity Paradox of Health Information Technology

During the past decade, the US health care system has gone digital. In 2008, fewer than 1 in 10 US hospitals had an electronic health record (EHR) system; today, fewer than 1 in 10 does not. EHR adoption in ambulatory practices has risen just as steeply.

Although the evidence that health care digitization has led to improvements in quality and safety is generally positive,1 these improvements have been context dependent and relatively small. Moreover, rates of physician self-reported burnout are high, partly because, despite all the time physicians spend on data entry, little useful intelligence is delivered back to them. In addition, the computer often separates the physician from the patient rather than enhancing communication.

Perhaps most surprising, gains in efficiency and productivity have not materialized. This represents a clear example of the productivity paradox of information technology—the observation that productivity does not immediately increase with investment in information technology.2

The good news from other industries is that the productivity paradox eventually resolves. In health care, the combination of widespread digitization; advances in machine learning and artificial intelligence; the opening of the health information technology business environment to both established technology companies and start-ups focused on solving specific health care problems; and growing competencies in systems thinking, work redesign, and population health has put health care on the cusp of potentially substantial improvements in quality, safety, patient and clinician experience, and efficiency. In this Viewpoint, we describe several reasons for optimism.

The Old Health Care Model

In a world of paper records, each physician did her or his best to care for each individual patient, but there was no platform to enable a learning health system. Learning in the clinical setting often emerged 1 case at a time, such as through reviews of interesting cases (as in residents’ report in teaching hospitals) or those with poor outcomes (as in morbidity and mortality conferences). This approach was promoted by a training model and culture oriented to the single patient, and a record-keeping system that made it cumbersome or impossible to glean insight from populations of patients.

In the early 2000s, the Institute of Medicine reported that medical care in the United States was error prone, highly variable, often not evidence based, and enormously inefficient.3 This led to new public reporting, accreditation, and payment systems that began holding health care organizations and clinicians accountable for value. That accountability, in turn, drove health care organizations to build professionalized quality and safety departments, implement training in improvement techniques, and promote system redesign focused on higher reliability.

Even though digitization should have enabled these efforts, its contribution has been smaller than might have been expected. At the level of individual patient care, EHR decision support remains rudimentary, often delivered as poorly designed alerts and alarms whose high false-positive rates limit their efficacy and can even create safety problems. Clinical notes have become longer but not more useful, often padded with copy-and-pasted data of questionable veracity and value. The ability to extract data from EHRs for meaningful analysis remains limited, as does the capacity to integrate useful electronic tools and apps (particularly those built by third parties) back into EHR systems.

Why Might Now Be Different?

The lesson from other industries is that the productivity paradox eventually resolves, often over 1 to 2 decades. This was true for the electrification of factories at the start of the 20th century and for the digitization of manufacturing, banking, and retail at the turn of this century. Researchers have identified 2 keys to this resolution: improvements in the technology and a reimagining of the work.4 Health care is now poised for similar improvements.

The first reason is the expansion of health care’s digital business environment. The amount of digital health care data today is measured in exabytes and zettabytes, almost inconceivable amounts of information. Drawn by the business and clinical opportunities potentially associated with leveraging these data, investments in health care start-ups have increased substantially, and virtually all major general purpose technology companies have announced significant health care initiatives.

Current EHR vendors are also building capacity in advanced data analytics. For various reasons, some of these vendors have tried to limit the ability of other parties to extract data from their systems, but it seems likely that the next phase of health care digitization will overcome such barriers. If it does, advanced analytic tools and techniques can be applied to EHR data, with the results made available to decision makers (physicians, health care administrators, and patients) in ways that should improve outcomes and efficiency. New technologies, such as application programming interfaces (APIs), that make it easier for outside digital tools to plug into EHRs will be an important enabler, as will consensus standards for health care information exchange such as Fast Healthcare Interoperability Resources (FHIR).
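
To make the interoperability point concrete, the sketch below shows how a third-party tool might read a patient's recent blood glucose observations through a standard FHIR REST API. It is a minimal illustration, not any vendor's actual interface: the base URL and patient identifier are hypothetical placeholders, and a real integration would also require OAuth 2.0 authorization.

```python
import requests

# Hypothetical FHIR endpoint and patient identifier; a real integration
# would use the EHR vendor's authorized base URL plus an OAuth 2.0 token.
FHIR_BASE = "https://fhir.example-hospital.org/R4"
PATIENT_ID = "12345"

# LOINC 2339-0 identifies blood glucose (mass/volume); _sort and _count
# are standard FHIR search parameters for fetching the most recent results.
params = {
    "patient": PATIENT_ID,
    "code": "http://loinc.org|2339-0",
    "_sort": "-date",
    "_count": 10,
}
resp = requests.get(
    f"{FHIR_BASE}/Observation",
    params=params,
    headers={"Accept": "application/fhir+json"},
)
resp.raise_for_status()

# The server returns a Bundle resource; print each observation's
# timestamp, value, and unit.
for entry in resp.json().get("entry", []):
    obs = entry["resource"]
    qty = obs.get("valueQuantity", {})
    print(obs.get("effectiveDateTime"), qty.get("value"), qty.get("unit"))
```

Because the same query runs against any FHIR-conformant server, a tool written this way is not tied to a single vendor's EHR, which is precisely what makes such standards an enabler.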

Another important advance is the exponential improvement in machine learning and artificial intelligence. Outside health care, artificial intelligence has fueled the driverless car, computer victories in games previously thought unapproachable by machines, and digital voice assistants. In health care, these techniques now allow computers to identify diabetic retinopathy with accuracy similar to that of experienced ophthalmologists.5 Once EHR data are fully unlocked, such techniques could rapidly make knowledge and insights once possessed only by specialists available for the care of every patient.
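
For readers curious about the mechanics, the sketch below shows the generic transfer-learning pattern behind such image-based screening tools: take a network pretrained on general images and retrain only its final layer to grade medical images. It is illustrative only; Gulshan et al5 trained an Inception-style network on more than 100 000 expert-graded fundus photographs, and nothing here reproduces their actual pipeline.

```python
import torch
import torch.nn as nn
from torchvision import models

# Start from a network pretrained on ImageNet and freeze its backbone;
# only the new final layer will be trained on retinal images.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
for param in model.parameters():
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)  # referable retinopathy vs not

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One gradient step on a batch of fundus photographs (N, 3, 224, 224)
    with binary expert grades (N,)."""
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```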

In this regard, it is worth considering the capacity of digital systems to scale expertise. Until recently at UCSF Medical Center, a hospitalized patient with diabetes would be cared for by a hospitalist or a surgeon, who would consult an endocrinologist in only the most problematic cases. Today, an experienced endocrinologist reviews the course of every hospitalized patient with diabetes and certain risk factors each day through a digital portal and makes management recommendations to clinicians providing direct patient care. Since implementation of this approach, there has been a 39% reduction in patients with hyperglycemia (from 6.6 per 100 patient-days in 2013 to 4.0 per 100 patient-days in 2015) and a 36% reduction in serious hypoglycemic events (from 0.78 per 100 patient-days in 2013 to 0.49 per 100 patient-days in 2015).6
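
The percentage reductions follow directly from the quoted rates, as this quick check shows (the rounded rates give roughly 37% for hypoglycemia; the published 36% presumably reflects unrounded underlying data).

```python
# Quick check of the reductions quoted above (events per 100 patient-days).
hyper_2013, hyper_2015 = 6.6, 4.0    # hyperglycemia rates
hypo_2013, hypo_2015 = 0.78, 0.49    # serious hypoglycemia rates

print(f"hyperglycemia reduction: {(hyper_2013 - hyper_2015) / hyper_2013:.0%}")
print(f"hypoglycemia reduction:  {(hypo_2013 - hypo_2015) / hypo_2013:.0%}")
# -> 39% and 37% with these rounded rates
```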

This kind of program is likely to become more commonplace, ultimately leveraging artificial intelligence–enabled risk predictors and treatment algorithms that do not necessarily require direct subspecialist participation except in unusual or difficult cases. The same technology that allows tailored, personalized search or shopping recommendations could be applied to improve patients’ outcomes by providing customized help to clinicians along the lines of “patients like this did better when their insulin dose was raised” or “patients with uncomplicated pneumonia are generally afebrile by now. Reconsider your diagnosis.”
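
As a minimal sketch of how such a customized prompt might be generated, the function below encodes the pneumonia example as a simple rule. The record structure, the 38.0 °C febrile threshold, and the 3-day expectation are all hypothetical simplifications; a production system would derive its thresholds from population data and pull its inputs from the EHR.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class InpatientRecord:
    """Hypothetical, simplified view of an admitted patient."""
    admitted: datetime
    latest_temp_c: float
    diagnosis: str

def reconsider_diagnosis_prompt(
    patient: InpatientRecord,
    now: datetime,
    febrile_threshold_c: float = 38.0,  # assumed fever cutoff
    expected_afebrile_day: int = 3,     # assumed typical clinical course
) -> Optional[str]:
    """Return an advisory message if the patient is still febrile past the
    point at which the documented diagnosis usually defervesces."""
    days_in = (now - patient.admitted) / timedelta(days=1)
    if patient.latest_temp_c >= febrile_threshold_c and days_in > expected_afebrile_day:
        return (
            f"Patients with {patient.diagnosis} are generally afebrile "
            f"by day {expected_afebrile_day}. Reconsider your diagnosis."
        )
    return None

# Example: hospital day 4, still febrile -> the advisory fires.
record = InpatientRecord(
    admitted=datetime(2024, 1, 1, 8, 0),
    latest_temp_c=38.6,
    diagnosis="uncomplicated pneumonia",
)
print(reconsider_diagnosis_prompt(record, now=datetime(2024, 1, 5, 8, 0)))
```

An artificial intelligence-enabled version would replace the hard-coded rule with a model learned from "patients like this," but the delivery mechanism, a short and well-timed suggestion to the clinician, would look much the same.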

A Path Forward

Evolving models of systems thinking and population health, advances in artificial intelligence, and greater business investment are positioning health care for transformation. However, the history of health information technology offers grounds for skepticism.

Pessimists can cite several important obstacles. First, important privacy and security concerns may impede the data portability needed to facilitate this work. Second, current EHR vendors may succeed in preventing others from accessing data or linking apps back to the EHR to create a seamless experience for clinicians and patients. Third, technology companies, even those with dazzling capabilities in areas such as artificial intelligence, may fail to create the partnerships with clinicians and health care delivery systems needed to build tools that are truly useful and respectful of clinicians’ culture and workflow.

Fourth, the analogies to other industries, particularly the historical experience that predicts a resolution of the productivity paradox, may prove ill suited to predicting the future of health care. For example, patients do not "shop" for health care in the same way they shop for commodities, comparing prices (often not known in health care) and quality before making a decision. Moreover, technology has not made health care less expensive; indeed, the evidence suggests that, unlike in virtually all other industries, technology has made health care more labor intensive and more expensive.

Fifth, technological progress in health care may be slowed by the clinical and medicolegal consequences of mistakes. "Fail fast and then iterate" may be acceptable when designing a restaurant reservation app, but the stakes are profoundly higher when a digital tool affects clinical outcomes rather than consumer conveniences.

On balance, the forces promoting progress may soon become sufficiently powerful to overcome these obstacles. If this occurs, health care could realize the kinds of improvements that have emerged in other industries that ultimately overcame their version of the productivity paradox. Although this transition is sure to be challenging and somewhat unpredictable, history suggests that the most likely result will be higher quality, safer, more satisfying, and less expensive health care, and better health outcomes.

References

1. Jones SS, Rudin RS, Perry T, Shekelle PG. Health information technology. Ann Intern Med. 2014;160(1):48-54. doi:10.7326/M13-1531
2. Jones SS, Heaton PS, Rudin RS, Schneider EC. Unraveling the IT productivity paradox. N Engl J Med. 2012;366(24):2243-2245. doi:10.1056/NEJMp1204980
3. Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press; 2001.
4. Brynjolfsson E, Hitt LM. Beyond the productivity paradox. Commun ACM. 1998;41(8):49-55. doi:10.1145/280324.280332
5. Gulshan V, Peng L, Coram M, et al. Development and validation of a deep learning algorithm for detection of diabetic retinopathy in retinal fundus photographs. JAMA. 2016;316(22):2402-2410. doi:10.1001/jama.2016.17216
6. Rushakoff RJ, Sullivan MM, MacMaster HW, et al. Association between a virtual glucose management service and glycemic control in hospitalized adult patients. Ann Intern Med. 2017;166(9):621-627. doi:10.7326/M16-1413
