The Virginia prescription record security breach: The big picture, and using this case as a learning experience

The Virginia Department of Health Professions is having a bad week. Apparently, a hacker downloaded personal health information of eight million individuals, including 35 million prescription records, and then replaced the information on the state website with a crude "ransom" note demanding $10 million in exchange for unlocking the encrypted file containing what is supposedly the only copy of the patient information seized.  (Screenshot of hacked website with notice posted here; see Bob Coffield's post on the story for a good roundup of the facts and review of some HIPAA/ARRA/HITECH implications.)  This has gotten the attention of the digerati and the blogerati, and even of some folks beyond the echo chamber of the blogosphere and twitterverse, out in the real world (like Virginia officialdom, which has gotten communications on this incident off to a slow start).

Update 6/5/09: Virginia security breach notices are going out — a month after the fact — to over 500,000 individuals whose social security numbers were part of their prescription records.  Too little, too late? 

So, this episode raises a few questions for me of broader application:

  1. What is the scope of personal data insecurity in this country?
  2. What preventive maintenance and design steps must or should be taken by all holders of personal data in order to minimize the likelihood of a breach?
  3. In the event of a security breach, what communication is required by law, and what should "best practices" communications strategy look like, beyond what is required by the letter of the law?

Let's hack away (unconscious choice of words while typing) at these questions one at a time.

Scope of the Problem

The scope of the issue is, not to put too fine a point on it, very broad, and getting broader daily. The issue is relevant to financial and other data, but for purposes of this post, I'll confine my observations to personal health data ("protected health information," "PHI" or "individually identifiable health information" in HIPAA-speak). In the bad old days (which are perhaps coming to a close one of these years, thanks to the $19 billion HITECH Act handout), PHI insecurity was limited to the problem of folks who might wander into a file room and get hold of your medical records without a good reason to do so. Thanks to the computerization of medical records, whether on a desktop computer, laptop, server, storage device, or "in the cloud" (now that's a whole other can of worms), millions of records are out there for the hacking. Given the lackadaisical attitude some have toward data security, these records are accessible to ill-intentioned identity thieves as well as to recreational hackers.

The scope of the issue may be glimpsed through a visit to the Privacy Rights Clearinghouse site, A Chronology of Data Breaches, a wonderful compendium of data security breach incidents (beginning January 2005) and related resources (not yet updated as of this writing to include the Virginia debacle). The chronology is not limited to health care data breaches; a quick scan seems to confirm that the Virginia incident is among the largest health care data breaches, but it is not the first breach of a state agency system.  (And remember the Express Scripts ransom hacker case a little while back?)

Prevention
Data security and privacy protections applicable to PHI have been ratcheted up a couple of notches this year with the Son of HIPAA provisions thrown into ARRA, the FTC Red Flags Rule and some parallel state rulemaking activity (see, e.g., the Massachusetts data security rule). With all these recent changes, new comprehensive preemption analyses will have to be undertaken, but I'll offer a couple of observations: It is imperative that all health care providers and business associates undertake privacy and security audits of their current operations. This includes a review of policies and procedures (and the adoption of policies and procedures later this year by business associates, which were not required to have them in place pre-ARRA) to ensure compliance with HIPAA, Son of HIPAA, the FTC Red Flags Rule (if applicable; it relates to businesses that extend credit, defined very broadly, and snaps into effect August 1, after a couple of delays), and state privacy laws. All policies and procedures need to be beefed up as appropriate. Hardware, software and wetware must be tested for compliance and must also be beefed up as needed. In my community, when faced with a computer problem, we always say: "Ask a teenager!" In addition to consulting the usual trusted advisors, it might not hurt to spot-check security systems by challenging a reliable, computer-savvy teenager (or twentysomething) to hack into a system.

Breach Notification

ARRA Sec. 13402 (p. 146) technically doesn't require a breach notification to be sent to the affected folks in the Virginia matter, because the regs aren't out yet (they're due out by August, effective 30 days later).  Guidance on what makes data unreadable by unauthorized folks has been released for public comment; if Virginia had secured the data according to the definitions in this guidance, then its release would not be considered a breach and would not trigger notification requirements.  These guidelines are something to consider in designing secure environments for data: they address both data in use and data at rest, and incorporate by reference some NIST standards.  Adhering to the guidance not only offers the PR benefit of allowing an entity to avoid making a breach notification; it could also help prevent breaches in the first place.  It would be interesting to learn whether the Virginia data was protected in the manner called for in this guidance.
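For readers wondering what "secured" means in practice, the heart of the guidance is encryption of stored data using NIST-approved methods. As a purely illustrative sketch (assuming the openssl command-line tool is available; the file names and passphrase here are hypothetical stand-ins, and a real covered entity would use managed key storage rather than a passphrase on the command line), encrypting an exported records file at rest might look like:

```shell
# Hypothetical example: a small file of exported records.
echo "patient,rx" > records.csv

# Encrypt at rest with AES-256 (a NIST-approved cipher), deriving the
# key from a passphrase via PBKDF2 with a random salt.
openssl enc -aes-256-cbc -pbkdf2 -salt \
  -pass pass:example-passphrase \
  -in records.csv -out records.csv.enc

# Only the ciphertext need remain at rest; recovering the plaintext
# requires the same passphrase (i.e., the key).
openssl enc -d -aes-256-cbc -pbkdf2 \
  -pass pass:example-passphrase \
  -in records.csv.enc -out records-restored.csv

diff records.csv records-restored.csv && echo "roundtrip OK"
```

The point of the guidance is that if the hacker in a Virginia-style incident walks away only with ciphertext like records.csv.enc, and the key was not compromised, the data has not been "breached" in the statutory sense.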

Whether or not a notice is required, careful consideration should be given to developing a communications plan for alerting patients to any breach, and for explaining what is being done to minimize the risk of similar (or dissimilar) breaches in the future.  This may be a delicate dance (the folks in Virginia have been saying they can't comment because an FBI investigation is underway), but it seems to me that a criminal investigation need not bar any and all communication with patients and the public at large about the situation.

As the remaining ARRA rules come out and covered entities and others have a clearer roadmap before them, it will be imperative that they undertake the steps outlined above so that they can maintain compliance with these new requirements, ensure privacy and security of PHI, and stay out of the regulators' sights.

David Harlow
The Harlow Group LLC
Health Care Law and Consulting
