A RAND study cited in yesterday’s N.Y. Times pegged potential health care cost savings enabled by widespread adoption of EHR systems at about $42 billion a year.

Sounds good, no?  The same article, though, highlighted the fact that such widespread adoption could also increase costs.  For example, consider prescription drugs.  On the one hand, getting good data from EHRs, and plugging it into the NHIN, stripped of personal identifiers, could enable better decisions about prescribing certain drugs — including expensive so-called “blockbuster” drugs that may be prescribed to some patients without clear medical benefits.  On the other hand, data collected and highlighted through EHRs could result in increased prescriptions for some narrowly targeted pharmaceuticals: good for the patient, but increasing costs to the system.

The article — which is worth reading in full — noted that within managed care systems serving large populations, HIT has already enabled improved treatment, better care management decision-making and, yes, cost savings.

At least some parts of the HIT industry seem to think that this article’s take-away is that EHR will end up costing money, not saving money, and that the article will steer folks wrong.  Even Health Affairs published a paper last month finding that EHR adoption, at least in the ambulatory care setting, and likely in other settings, would be more likely to cost money than save money — though a number of correspondents disagreed with this conclusion.

True implementation of an integrated EHR system will clearly be expensive (though the costs may be spread around, thanks to the safe harbors recently finalized), but there will just as clearly be significant benefits to the exercise.  It remains to be seen whether the benefits will outweigh the costs in a quantifiable manner, but it seems likely that they will.  There is obviously a great deal more work to be done in analyzing this issue as EHR implementation continues.