MD Logic Real World Testing 2022 Results

REAL WORLD TESTING RESULTS for MD LOGIC EHR

General Information

Developer Name: MD Logic, Inc.
Product Name(s): MD Logic EHR
Version Number(s): 7.1

Certified Health IT Product List (CHPL) ID(s):    15.04.04.2785.MDLo.07.01.1.191226

Developer Real World Testing Page URL: www.mdlogic.com/solutions/real-world-testing

Summary of Testing Methods and Key Findings

Case management logs, system logs, and secure email logs were reviewed to determine the frequency of incoming data by type, as well as the ability of the module to successfully read and incorporate the contained data. Log files obtained during Real World Testing were de-identified and used for analysis in several areas. Using live client environments, we established test APIs according to publicly available instructions and sent requests for patient data. The ability to receive requests and the accuracy of responses were catalogued, and error rates were tracked. Various test data were imported to and exported from cloned client systems, and CQMs were calculated and compared to expected results. Error rates were tracked and reviewed to offer additional explanations for possible data gaps.

Multiple certification criteria were tested simultaneously in this testing plan across two use cases. These metrics and use cases represent the specific interoperability scenarios relevant to the specialty care settings listed in our plan: provider to provider, provider to patient, patient to third party, and provider to third party. Real World Testing demonstrated that MD Logic EHR is conformant to the following certification criteria: § 170.315(b)(1) Transitions of Care, § 170.315(b)(2) Clinical Information Reconciliation and Incorporation, § 170.315(b)(3) Electronic Prescribing, § 170.315(b)(6) Data Export, § 170.315(c)(1) Record and Export, § 170.315(c)(2) Import and Calculate, § 170.315(c)(3) Report, § 170.315(e)(1) View, Download, and Transmit to 3rd Party, § 170.315(g)(7) Application Access – Patient Selection, § 170.315(g)(8) Application Access – Data Category Request, and § 170.315(g)(9) Application Access – All Data Request. Our overall findings are that our systems function as predicted; that the errors found were display errors or client input errors rather than nonconformities; and that we have achieved the goals laid out in our plan for each measure and care setting.

Standards Updates

Standard (and version): All standards versions are those specified in USCDI v1

Date of ONC-ACB notification (SVAP or USCDI): Not applicable

Date of Customer Notification (SVAP only): Not applicable

USCDI-updated criteria: None

Care Settings

The MD Logic EHR supports the documentation, tracking and sharing of interoperability data within and outside of specialty care settings. For the purposes of testing, our specialty care settings included: General Surgery, Plastic and Hand Surgery, Colon Rectal Surgery, ENT, Orthopedic Surgery, Podiatry and Neurosurgery.

 

Metrics and Outcomes

Use Case 1 (Single Patient) Metrics: 

 

Measure 1: Sharing EHI

The mechanisms used to share transitions of care documents and EHI were catalogued, and usage of the various transport mechanisms was tracked. These include transitions of care for patient referrals, transmitting data to be viewed by the patient via a patient portal, and patient transmission of EHI data to a 3rd party.

 

Measurement/Metric: Sharing EHI

Associated Criteria: § 170.315(b)(1) Transitions of Care; § 170.315(e)(1) View, Download, and Transmit to 3rd Party

Relied Upon Software: Surescripts and ViewMyHealthRecords.com (MD Logic Patient Portal site)

Challenges Encountered: N/A

 

Outcomes:

As the care settings in which this measure was tested consisted of specialty care providers, we expected that the number of outbound referrals sent via secure email would be low. In fact, among 14 client settings over the 6-month testing period, only four outgoing referrals were sent using this method. All four were shared successfully, for a 100% success rate with no errors.

 

The more frequent method of sharing EHI was by means of the patient portal at ViewMyHealthRecords.com. The tracked clients provided 20,648 patients access to their health information via the patient portal across the 6-month testing period. No issues were encountered related to patients' ability to access their data or the readability of the data at ViewMyHealthRecords.com. During the same six months, 597 patients viewed their health data a total of 1,942 times. Seven patients either downloaded their data or transmitted it to a third party. No errors were logged or reported. As expected, transmission and readability were successful in all cases.

 

Measure 2: Receiving and Incorporating EHI

Track the frequency of incoming CCDA documents via Direct protocol and manual import, and the success rate of incorporating information into the patient chart. 

 

Measurement/Metric: Receiving and Incorporating EHI

Associated Criteria: § 170.315(b)(1) Transitions of Care; § 170.315(b)(2) Clinical Information Reconciliation and Incorporation

Relied Upon Software: Surescripts Direct Protocol

Challenges Encountered: N/A

 

Outcomes:

Fourteen client systems were tracked over the 6-month testing period. Over that period, 1,252 CCDA documents were received via Direct protocol; none were received via manual import. Of these, 558 CCDA documents were successfully incorporated into the patient chart. Of the 14 clients, 5 experienced the same error when attempting to incorporate CCDAs into the patient's chart: the error message claimed that properly filed CCDAs had not been filed. This was a display error rather than a nonconformance, and once we discovered it, we promptly rectified it for each client. The discrepancy between the number of CCDAs received and those incorporated can otherwise be attributed to several factors, none of which indicates a nonconformance: duplicate CCDAs received, CCDAs received for patients who were never seen, and patients for whom there was no attempt to reconcile the CCDA.

 

Measure 3: Electronic Prescribing

Track the frequency of nonscheduled medications prescribed in the specialty care setting and the success rate of timely electronic transmission of those prescriptions.

 

Measurement/Metric: Electronic Prescribing

Associated Criteria: § 170.315(b)(3) Electronic Prescribing

Relied Upon Software: Surescripts Direct

Challenges Encountered: N/A

 

Outcomes:

Our expectation was that 90% of electronic prescriptions would be successfully transmitted in a timely manner to NewCrop via Surescripts. Over the 6-month testing period, 14 client systems generated 12,604 electronic prescriptions, 11,583 of which were successfully transmitted as expected, for a success rate of 92%. This met our expectation, and we were also correct in expecting that the gap between the actual success rate and a perfect success rate was caused exclusively by user error rather than system error. We identified the following as the primary causes of issues with submitting electronic prescriptions:

  • A user accidentally tried to prescribe a duplicate script to the same patient
  • A user without permission to process ePrescriptions attempted to send one
  • A user prescribed a medication whose NDC had become ineligible; when the replacement was sent, the ineligible prescription was never deleted
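
The reported 92% success rate can be checked directly against the raw counts; a minimal sketch in Python (variable names are ours, not from any MD Logic tooling):

```python
# Reproduce the reported ePrescribing success rate from the raw counts.
total_prescriptions = 12_604  # generated across 14 client systems over 6 months
transmitted = 11_583          # successfully transmitted via Surescripts

success_rate = transmitted / total_prescriptions * 100
print(f"{success_rate:.1f}%")  # 91.9%, which rounds to the reported 92%
```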

 

Measure 4: Application Programming Interfaces

Assess the ability of an API to receive and respond to calls in accordance with the publicly available documentation.

 

Measurement/Metric: Application Programming Interfaces

Associated Criteria: § 170.315(g)(7) Application Access – Patient Selection; § 170.315(g)(8) Application Access – Data Category Request; § 170.315(g)(9) Application Access – All Data Request

Relied Upon Software: N/A

Challenges Encountered: See notes below

 

Outcomes:

On three client systems, we created a test user and granted it access to particular patients. On each client system, 24 individual data calls were made, covering individual-category and all-data requests as well as error cases. In all cases the expected result was achieved, as follows:

 

  • Login with bad credentials
    • Result: HTTP 401 Unauthorized
  • Login with good credentials
    • Result: HTTP 200 OK
  • Data Category request with an invalid category
    • Result: HTTP 400 Bad Request
  • Data Category request with an invalid access token
    • Result: HTTP 401 Unauthorized
  • Data Category: Encounter
    • Result: HTTP 200 OK
  • Data Category: Problem
    • Result: HTTP 200 OK
  • Data Category: Procedures
    • Result: HTTP 200 OK
  • Data Category: CareTeam
    • Result: HTTP 200 OK
  • Data Category: Medications
    • Result: HTTP 200 OK
  • Data Category: MedicationAllergies
    • Result: HTTP 200 OK
  • Data Category: Immunizations
    • Result: HTTP 200 OK
  • Data Category: Goals
    • Result: HTTP 200 OK
  • Data Category: CarePlan
    • Result: HTTP 200 OK
  • Data Category: VitalSigns
    • Result: HTTP 200 OK
  • Data Category: SmokingStatus
    • Result: HTTP 200 OK
  • Data Category: LabTests
    • Result: HTTP 200 OK
  • Data Category: LabResults
    • Result: HTTP 200 OK
  • Data Category: UniqueDeviceIds
    • Result: HTTP 200 OK
  • Data Category: Assessment
    • Result: HTTP 200 OK
  • Data Category: ReasonforReferral
    • Result: HTTP 200 OK
  • Data Category: HealthConcerns
    • Result: HTTP 200 OK
  • Data Category: FunctionalStatus
    • Result: HTTP 200 OK
  • Data Category: CognitiveStatus
    • Result: HTTP 200 OK
  • Data Category: All Data
    • Result: HTTP 200 OK; the response also includes a CCDA document.
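
The expectations above amount to a small table mapping each scenario to an HTTP status code: 4 authentication and error cases plus 20 category requests, or 24 calls per client system. A minimal sketch of that table and a mismatch check, in Python (the scenario keys and check function are illustrative assumptions, not MD Logic's actual test harness):

```python
# Expected HTTP status for each API scenario listed above.
EXPECTED = {
    "login_bad_credentials": 401,
    "login_good_credentials": 200,
    "category_invalid": 400,
    "category_invalid_token": 401,
}

# Every valid data-category request is expected to return 200 OK.
CATEGORIES = [
    "Encounter", "Problem", "Procedures", "CareTeam", "Medications",
    "MedicationAllergies", "Immunizations", "Goals", "CarePlan",
    "VitalSigns", "SmokingStatus", "LabTests", "LabResults",
    "UniqueDeviceIds", "Assessment", "ReasonforReferral",
    "HealthConcerns", "FunctionalStatus", "CognitiveStatus", "AllData",
]
EXPECTED.update({f"category_{c}": 200 for c in CATEGORIES})

def mismatches(observed: dict) -> list:
    """Return scenario names whose observed status differs from expectation."""
    return [name for name, status in observed.items()
            if EXPECTED.get(name) != status]

# 4 auth/error scenarios + 20 category requests = 24 calls per client system.
assert len(EXPECTED) == 24
```

Run against the 24 recorded responses from each client system, such a check would return an empty list, consistent with the results above.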

 

 

Challenges Encountered:

Due to the lack of real-world usage of APIs by our clients, this measure was tested via simulations on live client systems. This was anticipated in our Real World Testing plan.

 

Use Case 2: Population Services

 

Measure 1: Data Export

Assess the ability of the Health IT to create an export summary for a set of patients with specified parameters as well as the usage of the feature.

 

Measurement/Metric: Data Export

Associated Criteria: § 170.315(b)(6) Data Export

Relied Upon Software: N/A

Challenges Encountered: N/A

 

Outcomes:

Data export is not a commonly used feature among our clients; however, three clients did attempt to export data during the testing period, and all three attempts were successful.

 

Measure 2: CQMs: Import

Assess the ability to import a QRDA I file and use the associated data to calculate CQMs.

 

Measurement/Metric: CQMs: Import

Associated Criteria: § 170.315(c)(2) Clinical Quality Measures – Import and Calculate

Relied Upon Software: Cypress Test Tool

Challenges Encountered: See notes below

 

Outcomes:

Clones were made of three client systems for testing. Two sets of test data were created with the Cypress Test Tool, and each set was individually imported into the client database. As expected, the files were imported without error, the expected patient records were created as needed, the patient data was visible after the import, and CQMs were correctly calculated without error in all cases.

 

Challenges Encountered:

Because we have had no real-world cases of clients needing to import QRDA I files, we ran simulations using copies of real client data; copies were used so as not to contaminate a live working environment. This was expected and noted as a likely scenario in our Real World Testing plan, as no client has used the QRDA I import to date.

 

Measure 3: CQMs: Export 

Assess the ability to accurately record data necessary for the calculation of CQMs, as well as the successful creation and export of QRDA I and QRDA III files utilizing that data. 

 

Measurement/Metric: CQMs: Export

Associated Criteria: § 170.315(c)(1) Clinical Quality Measures – Record and Export; § 170.315(c)(3) Clinical Quality Measures – Report

Relied Upon Software: Cypress Test Tool

Challenges Encountered: See notes below

 

Outcomes:

Of our Real World Testing client partners, 3 are actively collecting data necessary to calculate and report CQMs. On each of these three client systems, data was exclusively gathered via documentation in MD Logic rather than via CQM import. In all three client systems, CQMs were successfully calculated at the end of the measurement period and accurately reflected the data captured. In addition, QRDA III files were created successfully in all cases and were successfully exported.

 

Because none of these clients actively used the QRDA I export, we combined testing of this measure with our CQM import testing using clones of live client systems. Multiple sets of test data were imported into these systems, and the resulting CQM calculations correctly incorporated this information. CQMs included in this testing were 22, 50, 69, 117, 123, 131, 134, 138, 139, 147, and 155. Updated QRDA I data was then exported from the clone environment without error in all cases.

 

Challenges Encountered:

Because we have had no real-world cases of clients needing to export QRDA I files, we ran simulations using copies of real client data; copies were used so as not to contaminate a live working environment. This was expected and noted as a likely scenario in our Real World Testing plan, as no client has used the QRDA I export to date.

 

Schedule of Key Milestones:

All milestones below apply to each of the tested care settings: Podiatry, General Surgery, Plastic and Hand Surgery, Colon Rectal Surgery, ENT, Orthopedic Surgery, and Neurosurgery.

Key Milestone: Released documentation for Real World Testing and provided it to authorized representatives and providers running MD Logic
Date/Timeframe: Completed prior to Dec. 1, 2021

Key Milestone: Selected clients who would capture data
Date/Timeframe: Completed Jan. 1 - Feb. 28, 2022

Key Milestone: Built necessary test tracking and logging systems for accurate data collection
Date/Timeframe: Completed Mar. 1 - May 31, 2022

Key Milestone: Met with client partners and reviewed and educated them on testing protocols
Date/Timeframe: Completed Jun. 1 - 30, 2022

Key Milestone: Began data and information collection as laid out by the plan
Date/Timeframe: Jul. 1, 2022

Key Milestone: Followed up with providers and authorized representatives on a regular basis to understand any issues that arose with the data collection
Date/Timeframe: Jul. - Dec. 2022

Key Milestone: Ended data collection for the testing period
Date/Timeframe: Dec. 31, 2022

Key Milestone: Analyzed data and generated reports
Date/Timeframe: Completed Jan. 1 - Feb. 15, 2023

Key Milestone: Submitted Real World Testing report to Drummond Group
Date/Timeframe: Feb. 21, 2023