EHR certification and security are always top of mind when choosing a healthcare software application. Not only does a software application have to prove its clinical or infrastructure value, but if it cannot deliver that value securely it will never be the product of choice.
The security questions asked of software vendors can vary in nature. Generally, they revolve around meeting HIPAA requirements for access, authentication, and encryption. In addition, having a product tested against a recognized threat model or checklist, such as the OWASP Top Ten, or some other type of security audit provides an extra sense of security for potential buyers. However, the healthcare industry already has a certain level of security standards provided in EHR certification testing.
EHR certification is normally linked to EHR vendors helping their customers meet Meaningful Use requirements. Within EHR certification, there is a set of security criteria, and any software vendor can test to these security measures as a health IT module without testing all the other requirements related to Meaningful Use. In this way, healthcare software products can ensure that they meet the same security requirements as an EHR and give their customers peace of mind that they at least meet a certain level of security standards as defined by the ONC.
The security measures included in the 2015 Edition EHR certification include:
- 315.d.1 Authentication, Access Control, Authorization
- 315.d.2 Auditable Events and Tamper Resistance
- 315.d.3 Audit Reports
- 315.d.4 Amendments
- 315.d.5 Automatic Access Time-out
- 315.d.6 Emergency Access
- 315.d.7 End User Device Encryption
- 315.d.8 Integrity
- 315.d.9 Trusted Connection
- 315.d.10 Auditing Actions on Health Information
The criteria in d.1, d.5, and d.6 have to do with validating the user seeking access to electronic health information. The software application is tested in a number of different ways to ensure that unauthorized users cannot authenticate to or access the system. This includes having automatic time-outs to help ensure that a malicious user cannot follow behind an unsuspecting valid user. These criteria also include an option for emergency access if the need arises.
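The automatic time-out behavior described above can be sketched in a few lines. This is a minimal illustration, not any certified product's implementation; the `TIMEOUT_SECONDS` value and the `Session` class are hypothetical, and real applications make the inactivity limit configurable.

```python
import time

# Hypothetical inactivity limit; real products make this configurable.
TIMEOUT_SECONDS = 15 * 60

class Session:
    """Minimal session that locks itself after a period of inactivity."""

    def __init__(self, now=time.time):
        self._now = now              # injectable clock, useful for testing
        self.last_activity = now()
        self.locked = False

    def touch(self):
        """Record user activity, resetting the inactivity clock."""
        if not self.locked:
            self.last_activity = self._now()

    def check_timeout(self):
        """Lock the session if the inactivity limit has been exceeded."""
        if self._now() - self.last_activity >= TIMEOUT_SECONDS:
            self.locked = True
        return self.locked
```

Once `locked` is set, the application would require the user to re-authenticate before any further access, which is the behavior the d.5 criterion tests for.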
The criteria in d.7, d.8, and d.9 focus on encryption of data, both at rest and in transit. The software application must either encrypt data at rest or show that no personal health data is left behind once the application is closed. For data in transit, encryption and hashing must be used to protect the data, or a standard secure transport such as TLS must be used.
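As a rough sketch of the integrity (d.8) and trusted connection (d.9) ideas, the snippet below computes a SHA-256 digest that a receiver can recompute to detect tampering, and opens a certificate-verified TLS connection using Python's standard library. The function names are illustrative, not part of any certification test.

```python
import hashlib
import socket
import ssl

def message_digest(payload: bytes) -> str:
    """SHA-256 digest a receiver recomputes to verify the payload was not altered."""
    return hashlib.sha256(payload).hexdigest()

def open_trusted_connection(host: str, port: int = 443) -> ssl.SSLSocket:
    """Open a TLS connection; the default context verifies the server certificate,
    which is the essence of a 'trusted connection'."""
    context = ssl.create_default_context()
    sock = socket.create_connection((host, port))
    return context.wrap_socket(sock, server_hostname=host)
```

In practice, a sender transmits the digest alongside the message (or relies on TLS's built-in integrity protection); any change to the payload in transit produces a different digest.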
The criteria in d.2, d.3, and d.10 focus on logging and reporting when PHI is handled. This handling of PHI can include a variety of actions such as querying, changing, deleting, adding, printing, or copying. Other related actions are logged as well, such as changing user privileges, disabling the audit log itself, or turning off encryption. All of the logging is required to be included in a report that can be run over a given date range.
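The audit-log-and-report requirement above can be sketched simply. The `AuditEvent` structure and `report` function here are hypothetical; the criteria dictate what must be recorded (who did what to which record, and when) and that a report can be produced over a date range, not any particular data model.

```python
from dataclasses import dataclass
from datetime import date, datetime

# Hypothetical event model: the certification criteria specify the content
# (user, action, timestamp, affected record), not this structure.
@dataclass(frozen=True)
class AuditEvent:
    timestamp: datetime
    user: str
    action: str       # e.g. "query", "change", "delete", "print", "copy"
    record_id: str

def report(events, start: date, end: date):
    """Return all audited events whose timestamps fall within [start, end]."""
    return [e for e in events if start <= e.timestamp.date() <= end]
```

A real implementation would also append-protect the log itself (the tamper resistance in d.2), and log privileged actions such as disabling auditing or encryption as events in their own right.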
These three key areas of security (user access, encryption, and audit logging) provide a baseline that all software applications can meet. Certainly, any EHR is going to meet these security requirements because of Meaningful Use, but other software vendors can also test to these requirements to assure the industry that they are on par with EHRs when it comes to security. This helps give providers the peace of mind they want, while setting the same baseline for all vendors in healthcare across the board.
Corepoint Integration Engine is certified by the Office of the National Coordinator for Health Information Technology (ONC-Health IT) as a Meaningful Use 2015 Edition Health IT Module.