Under the much-debated HITECH legislation in the American Recovery and Reinvestment Act of 2009, HIPAA covered entities and their business associates must notify patients, and in some cases the Secretary of Health and Human Services, of privacy breaches pertaining to identifiable patient records. I have written previously about the distinction between privacy and security breaches, and I am going to focus on the security breach aspect today.
In the language, the secretary of HHS is required to specify technologies and methodologies that would render protected health information unusable, unreadable, or indecipherable to unauthorized individuals. If covered entities and their business associates apply such technologies and methodologies, they will not be required to provide notice of the breach as otherwise required by the act.
HHS specified that the “unusable, unreadable, indecipherable” test has been met if the breached data has been encrypted and the security of the key has not been compromised. HHS also specifies that the encryption must comply with the HIPAA Security Rule’s provisions. To make things easier on us, HHS actually gives two examples of encryption that meets the standard:
- For data at rest, encryption consistent with NIST Publication 800-111
- For data in transit, encryption complying with Federal Information Processing Standard (FIPS) 140-2
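To make the “unusable, unreadable, indecipherable” idea concrete, here is a toy sketch in Python using a one-time-pad XOR. This is purely illustrative — it is not NIST 800-111 compliant and real deployments would use a validated AES implementation — but it shows the principle behind the safe harbor: without the key, the ciphertext tells an unauthorized party nothing, so the key must be stored and managed separately from the data. The patient record shown is fictitious.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy one-time-pad: XOR each byte with a same-length random key.

    Illustrative only -- NOT an approved encryption method under
    NIST 800-111. The same function encrypts and decrypts.
    """
    assert len(key) == len(data), "one-time pad key must match data length"
    return bytes(d ^ k for d, k in zip(data, key))

# Fictitious protected health information for the example
record = b"Patient: Jane Doe, MRN 12345"
key = secrets.token_bytes(len(record))   # key kept separately from the data

ciphertext = xor_cipher(record, key)

# The ciphertext is unreadable on its own...
assert ciphertext != record
# ...but fully recoverable with the key...
assert xor_cipher(ciphertext, key) == record
# ...and a wrong key yields garbage, not the record.
wrong_key = secrets.token_bytes(len(record))
assert xor_cipher(ciphertext, wrong_key) != record
```

The safe harbor hinges on that last point: a breach of the ciphertext alone is not reportable, but a breach of ciphertext plus key is.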
One way of securing data at rest in a manner consistent with NIST 800-111 is full-disk encryption. Microsoft’s BitLocker, available with certain editions of Windows Vista, Windows 7, and Windows Server 2008, is FIPS 140-2 validated, as is McAfee’s SafeBoot, and many other options are available. It may be cumbersome for healthcare CIOs to have all their applications tested in a disk-encrypted environment on the endpoints, and the transition may take some time.
FIPS 140-2 defines four security levels, and HITECH/HIPAA does not seem to specify which level the government would deem sufficient to grant the reporting exception. I am thinking about this topic from a virtualization perspective, where the data never leaves the datacenter: applications or entire desktops execute securely inside the datacenter and are accessed by end users over a high-performance delivery protocol that provides a great user experience. This is already done widely for clinical apps in the healthcare space, and providing FIPS 140-2 compliant remote access is a problem that has been solved. What would need to happen inside the datacenter, however? I have my thoughts on this topic, but I am curious to hear from you.
What do you anticipate the internal and external auditing procedures will be?
- Remote access only?
- FIPS 140-2 for all server-to-server communication inside the datacenter?
- FIPS 140-2 even for server-to-storage communication for medical apps?
Please comment directly on these pages.