REAL WORLD TESTING RESULTS REPORT 2022
GENERAL INFORMATION
Plan Report ID Number: 20211213ALL
Developer Name: AllegianceMD Software, Inc.
Product Name(s): Veracity
Version Number(s): 9.1
Product List (CHPL) ID(s): 15.02.05.2672.ALLE.01.01.1.220117 (current), 15.02.02.2672.A100.01.00.1.191025 (previous)
Developer Real World Testing Plan Page URL: https://allegiancemd.com/real-world-testing
Developer Real World Testing Results Report URL: https://allegiancemd.com/real-world-testing
CHANGES TO ORIGINAL PLAN
| Summary of Change | Reason | Impact |
|---|---|---|
| Changes to the original dates | We implemented an advanced logging system using a distributed event streaming platform, which allows us to monitor multiple accounts in real production settings. The dates were changed from the originally proposed dates to 12/05/2022 – 12/31/2022. | No impact. |
| 170.315(f)(5) | No adoption by any customer. | No impact. |
STANDARDS UPDATES (INCLUDING STANDARDS VERSION ADVANCEMENT PROCESS (SVAP) AND UNITED STATES CORE DATA FOR INTEROPERABILITY (USCDI))
| Standard (and version) | N/A |
| Updated certification criteria and associated product | |
| Health IT Module CHPL ID | |
| Method used for standard update | |
| Date of ONC ACB notification | |
| Date of customer notification (SVAP only) | |
| Conformance measure | |
| USCDI updated certification criteria (and USCDI version) | |
SUMMARY OF TESTING METHODS AND KEY FINDINGS
Test Plan:
The test cases included actions by various user types to capture the required data and workflows. In most scenarios, real-world patient data was used to confirm compliance, such as verifying successful transmission statuses for certain certification criteria. Some standards were tested via manual inspection and/or with ONC-recommended test tools.
Test Methods:
User actions are sent to a distributed event streaming platform, where the user events are recorded. This method is superior to selecting specific clients for testing because it provides real-time data across all clients and, when an error occurs, it captures error logs that help us troubleshoot any issue with the system.
Key Milestones
Q1 2022:
Developed a list of clients to assist with Real World Testing. We ran automated queries to list the clients with the most patients.
Q2 and Q3 2022:
After running internal reports, we concluded that testing individual clients would not produce enough data for reliable real-world testing. Contacting individual clients to schedule “testing” would most likely have produced data from test patients rather than from real-world patients and usage scenarios for most of the measures.
07/04/2022 to 07/08/2022:
We identified the measures that were least used by clients and ran them manually. If a measure was used by a practice in real-world settings, the manual results were replaced by results from the reporting/logging method.
For the rest of the measures, we decided to implement a universal logging system that gathers data from all of our clinics. An event is triggered once a user decides to use a function associated with one of the measures included in this report. The event is then sent to a distributed event streaming platform. The data includes entry logs of successful attempts, errors, and a print of the error stack if an error exists. The logs include the measure, account number, and status. This method produces reliable data; it also helps us measure our clients' usage and be proactive about any error produced. We began developing the logging system in July.
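As a minimal sketch of what such a logging event might look like (the field names and the JSON shape are our illustration here; the report does not specify the platform's actual schema):

```python
import json
from datetime import datetime, timezone
from typing import Optional

def build_measure_event(measure: str, account_number: str, status: str,
                        error_stack: Optional[str] = None) -> str:
    """Assemble one logging event for a measured user action.

    The fields (measure, account number, status, error stack) mirror the
    entries described above; in production the JSON payload would be
    published to a topic on the distributed event streaming platform.
    """
    event = {
        "measure": measure,            # e.g. "170.315(b)(3)"
        "account": account_number,     # clinic account identifier (hypothetical format)
        "status": status,              # "success" or "error"
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    if error_stack is not None:
        event["error_stack"] = error_stack  # printed stack trace, if any
    return json.dumps(event)

# Example: a successful e-prescription send (account number is made up)
payload = build_measure_event("170.315(b)(3)", "ACCT-1042", "success")
```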
Q4 2022:
The logging system was tested and deployed by 11/27/2022. The logging system is a separate entity from the EHR that captures logs using a distributed event streaming platform.
Change from Original Plan:
We decided to implement a universal event-driven logging system using a distributed event streaming platform across all of our clinics. This is a better plan than the original, under which we would have contacted or surveyed individual clinics. The event-driven logging system also allows us to regularly query the logs and identify any issues or errors universally across all our clinics.
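A sketch of the kind of universal error query this enables, assuming hypothetical log records with the fields described above:

```python
from collections import Counter

def error_counts_by_measure(events: list) -> Counter:
    """Tally error events per certification measure across all clinics.

    `events` stands in for records consumed from the event streaming
    platform; the field names are illustrative, not the product's schema.
    """
    return Counter(e["measure"] for e in events if e["status"] == "error")

# Hypothetical log records pulled from the platform
log = [
    {"measure": "170.315(b)(1)", "account": "A1", "status": "success"},
    {"measure": "170.315(b)(3)", "account": "A2", "status": "error"},
    {"measure": "170.315(b)(3)", "account": "A3", "status": "error"},
]
```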
RWT Measure #1. Number of Transition of Care C-CDAs Successfully Sent
Associated Criteria: 315(b)(1)
Care Coordination: 170.315(b)(1): Transitions of Care
Testing Methodology: Reporting/Logging
Measurement Description:
This measure is tracking and counting how many C-CDAs are created and successfully sent from the EHR Module to a 3rd party during a transition of care event over the course of a given interval.
Care Settings:
Behavioral Health, Cardiology, Family, Fertility, Hormone Replacement, Infectious Disease, Internist, Neurologist, OB/GYN, Pain Doctor, Podiatry, Rheumatology, Sleep Doctor, Surgeon, Urgent Care
Testing Results:
Clinics Queried: 37
Reporting Interval: 12/05/2022 – 12/31/2022
Total C-CDA Sent for all Clinics: 78
Analysis and Key Findings
Our clients do not regularly share data through C-CDA files so we have few records of exchanged C-CDAs.
Non-Conformities or Errors Discovered
During our testing, we did not discover any errors or criteria non-conformities. We did not make any changes to this measure from our original test plan except for the time range and care settings.
Relied Upon Software
Phimail
RWT Measure #2. Clinical information reconciliation and incorporation
Associated Criteria: 315(b)(2)
Care Coordination: 170.315(b)(2) Clinical information reconciliation and incorporation
Testing Methodology: Reporting/Logging
Measurement Description:
Demonstration of incorporating problem list, medication list, and medication allergy list reconciliation from discrete problems, medications, and medication allergies parsed from a C-CDA in the referral provider's environment. Verify the successful reconciliation of the parsed discrete data into the problem list, medication list, and allergy list in the clinical summary in the referral provider's environment.
Care Settings:
Ambulatory Surgery, Family medicine
Testing Results:
Clinics With Records: 3
Reporting Interval: 12/05/2022 – 12/31/2022
Number of C-CDAs: 748
Analysis and Key Findings
Our clients do not regularly share data through C-CDA files, so we have few records of Direct exchange C-CDAs. One of the clinics queried, however, made heavy use of C-CDA import to bring in patient records.
Non-Conformities or Errors Discovered
During our testing, we did not discover any errors or criteria non-conformities. We did not make any changes to this measure from our original test plan except for the date range and care settings.
Relied Upon Software
None
RWT Measure #3. Number of Rx Messages Successfully Sent
Associated Criteria: 315(b)(3)
Care Coordination: 170.315(b)(3) Electronic prescribing
Testing Methodology: Reporting/Logging
Measurement Description:
This measure is tracking and counting how many NewRx and CancelRx electronic prescriptions were created and successfully sent from the EHR Module to a pharmacy destination over the course of a given interval.
Care Settings:
Behavioral Health, Cardiology, Family, Fertility, Hormone Replacement, Infectious Disease, Internist, Neurologist, OB/GYN, Pain Doctor, Podiatry, Rheumatology, Sleep Doctor, Surgeon, Urgent Care, Palliative Medicine, Endocrinology, Nephrology, Pediatrics, Dermatology, Plastic and Reconstructive, Psychiatry.
Testing Results:
Clinics Queried: 223
Reporting Interval: 12/05/2022 – 12/31/2022
Total Electronic Prescriptions Sent for all Clinics: 123,780
Analysis and Key Findings:
Electronic prescribing is a very popular and widely used feature in our EHR.
Non-Conformities or Errors Discovered:
During our testing, we did not discover any errors or criteria non-conformities. We did not make any changes to this measure from our original test plan except for the date range and care settings.
Relied Upon Software
None
RWT Measure #4. Number of Patient Batch Exports
Associated Criteria: 315(b)(6)
Care Coordination: 170.315(b)(6) Data export
Testing Methodology: Manual
Measurement Description:
This measure is tracking and counting how many batch exports of C-CDAs were successfully performed by the EHR Module over the course of a given interval.
Care Settings:
Family Practice
Testing Results:
Clinics Queried: 2
Reporting Interval: 07/04/2022 – 07/08/2022
Total C-CDA exported for all Clinics: 2,178
Analysis and Key Findings:
This test reveals that most of the clinics surveyed do not utilize the batch exporting feature; only two clinics produced export results.
Non-Conformities or Errors Discovered:
During our testing, we did not discover any errors or criteria non-conformities. We did not make any changes to this measure from our original test plan except for the reporting interval and care settings.
Relied Upon Software
None
RWT Measure #5. Clinical quality measures – record and export
Associated Criteria: 315(c)(1)
Clinical Quality Measures: 170.315(c)(1)—record and export
Testing Methodology: Reporting/Logging
Measurement Description
Demonstration of the ability to export patient data recorded in the EHR for a given patient population in QRDA Category I (QRDA-I) format.
Care Settings:
Palliative Medicine, Podiatry, Infectious Disease, Primary Care
Testing Results:
Number of Clinics With Records: 4
Reporting Interval: 12/05/2022 – 12/31/2022
Number of QRDA-I Batches Exported: 12
Analysis and Key Findings
Because we are certified in nearly all ONC eCQMs, our clients have a wide variety of options to choose from. While CMS will accept the top six scoring eCQMs for MIPS Quality Reporting, users can submit more than six.
Non-Conformities or Errors Discovered
During our testing, we did not discover any errors or criteria non-conformities. We did not make any changes to this measure from our original test plan except for the reporting interval and care settings.
Relied Upon Software
None
RWT Measure #6. Clinical quality measures – import and calculate
Associated Criteria: 315(c)(2)
Clinical Quality Measures: 170.315(c)(2)—import and calculate
Testing Methodology: Reporting/Logging
Measurement Description
Demonstration of the ability to calculate clinical quality measures from recorded patient data.
Care Settings:
Plastic and Reconstructive, Endocrinology, Nephrology, Urgent Care, Surgery, Pain Center, Pediatrics, Cardiology, Palliative Care, Sleep Medicine, Otolaryngology, Podiatry, Neurobehavioral Health, Ambulatory Surgery Center, Internist, Primary Care, OB/GYN, Dermatology
Testing Results:
Number of Clinics: 61
Reporting Interval: 12/05/2022 – 12/31/2022
Number of Patients where CQM was calculated: 875
Analysis and Key Findings
Because we are certified in nearly all ONC eCQMs, our clients have a wide variety of options to choose from. While CMS will accept the top six scoring eCQMs for MIPS Quality Reporting, users can submit more than six.
Non-Conformities or Errors Discovered
During our testing, we did not discover any errors or criteria non-conformities. We did not
make any changes to this measure from our original test plan except for the reporting interval and care settings.
Relied Upon Software
None
RWT Measure #7. Clinical quality measures – report
Associated Criteria: 315(c)(3)
Clinical Quality Measures: 170.315(c)(3)—report
Testing Methodology: Reporting/Logging
Measurement Description
Demonstration of the ability to generate QRDA 3 files.
Care Settings:
Rheumatology, Podiatry, Infectious Disease, Primary Care
Testing Results:
Number of Clinics: 5
Reporting Interval: 12/05/2022 – 12/31/2022
Number of QRDA-III Files Exported: 11
Analysis and Key Findings
Clinics record CQM results and submit QRDA-III files to CMS the following year. The number of clinics exporting QRDA-III files was low because the reporting period did not coincide with the CMS submission deadlines.
Non-Conformities or Errors Discovered
During our testing, we did not discover any errors or criteria non-conformities. We did not
make any changes to this measure from our original test plan except for the reporting interval and care settings.
Relied Upon Software
None
RWT Measure #8. View, download, and transmit to 3rd party
Associated Criteria: 315(e)(1)
Patient Engagement: 170.315(e)(1) View, download, and transmit to 3rd party
Testing Methodology: Reporting/Logging
Measurement Description
Login to patient portal to view clinical summary, create and make a valid C-CDA available to download or transmit to 3rd party.
Care Settings:
Family, Sleep Doctor, Rheumatology, Urgent Care, Endocrinology, OB/GYN, Podiatry, Fertility, Pain, Psychiatry, Neurology, Pediatrics, Infectious Disease.
Testing Results:
View:
Number of Clinics With Records: 69
Reporting Interval: 12/05/2022 – 12/31/2022
Number of Views: 1,396
Download:
Number of Clinics With Records: 21
Reporting Interval: 12/05/2022 – 12/31/2022
Number of Downloads: 51
Transmit:
Number of Clinics With Records: 1
Reporting Interval: 12/05/2022 – 12/31/2022
Number of Transmits: 1
Analysis and Key Findings
Some patients log in, view the C-CDA, and download it or transmit it to a third party; however, these features are rarely used. Some clinics ask their patients to log in to the portal and download the C-CDA.
Non-Conformities or Errors Discovered
During our testing, we did not discover any errors or criteria non-conformities. We did not
make any changes to this measure from our original test plan except for the reporting interval and care settings.
Relied Upon Software
MyPortal.MD
RWT Measure #9. Transmission to public health agencies-immunization registry
Associated Criteria: 315(f)(1)
Public Health: 170.315(f)(1) Transmission to immunization registries
Testing Methodology: Reporting/Logging
Measurement Description
Demonstration of the ability to add an immunization to a patient and generate a VXU message for an administered immunization and transmit it via HL7 2.5.1 to a public health agency successfully. Verify the immunization log of successful VXU message transmissions.
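For illustration only, a minimal HL7 v2.5.1 VXU^V04 message might be assembled as below; all identifiers, names, codes, and the lot number are made-up examples, and a production message would follow the CDC IIS implementation guide:

```python
def build_minimal_vxu() -> str:
    """Sketch of an HL7 v2.5.1 VXU^V04 immunization update.

    Only the segments most relevant to this measure are shown (MSH, PID,
    RXA); patient identifiers, the CVX code, and the lot number are
    hypothetical examples.
    """
    segments = [
        # Message header: sending/receiving systems, timestamp, type, version
        "MSH|^~\\&|EHR|CLINIC|IIS|STATE|20221215120000||VXU^V04^VXU_V04|MSG00001|P|2.5.1",
        # Patient identification: ID, name, birth date, sex
        "PID|1||12345^^^MRN^MR||DOE^JANE||20100101|F",
        # Administered vaccine: date, CVX code, dose, units, lot number (RXA-15)
        "RXA|0|1|20221215||08^HepB^CVX|0.5|mL^mL^UCUM|||||||LOT123",
    ]
    return "\r".join(segments)  # HL7 v2 uses carriage-return segment separators
```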
Care Settings:
Pediatrics, Family Medicine.
Testing Results:
Number of Clinics With Records: 3
Reporting Interval: 12/05/2022 – 12/31/2022
Number of Transmissions: 136
Analysis and Key Findings
Few clinics are using this feature. If the end user uses the state inventory, the lot number must be accurate.
Non-Conformities or Errors Discovered
During our testing, we did not discover any errors or criteria non-conformities. We did not
make any changes to this measure from our original test plan except for the reporting interval and care settings.
Relied Upon Software
None
RWT Measure #10. Transmission to Public Health Agencies
Associated Criteria: 315(f)(5)
Public Health: 170.315(f)(5) Transmission to public health agencies — electronic case reporting
Testing Methodology: Manual
Measurement Description
Demonstration of the ability to generate a transmission to public health agencies depending on a table of trigger codes per encounter.
Care Settings:
Live test family practice with non-real patient data.
Testing Results:
Number of Clinics With Records: 1
Reporting Interval: 07/04/2022 – 07/04/2022
Number of Files: 2
Analysis and Key Findings
We do not have any clinics using this feature, so a manual test was performed. We have a system that can query encounters for trigger codes; once an encounter was flagged, an eICR C-CDA was generated. Since we have implemented FHIR, we are utilizing eCRNow for 2023.
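The trigger-code check described above can be sketched as a simple set intersection; the codes and the table here are hypothetical stand-ins for the published trigger code (RCTC) value sets:

```python
# Hypothetical excerpt of a trigger-code table. Real deployments use the
# published RCTC value sets (e.g. SNOMED CT and ICD-10-CM codes).
TRIGGER_CODES = {"U07.1", "A37.00"}  # illustrative ICD-10-CM codes

def flag_encounter(diagnosis_codes: list) -> bool:
    """Return True when any encounter diagnosis matches a trigger code,
    which would queue the encounter for eICR C-CDA generation."""
    return bool(TRIGGER_CODES.intersection(diagnosis_codes))
```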
Non-Conformities or Errors Discovered
During our testing, we did not discover any errors or criteria non-conformities. We did not
make any changes to this measure from our original test plan except for the care settings.
Relied Upon Software
None
RWT Measure #11. Application access – patient selection
Associated Criteria: 315(g)(7)
Application Programming Interfaces: 170.315(g)(7) Application access— patient selection
Testing Methodology: Reporting/Logging
Measurement Description
Demonstration of a patient's ability to make a data request through the API for one or more data elements from the Common Clinical Data Set.
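A hedged sketch of what such a patient-selection request might look like; the endpoint path, parameter names, and base URL are hypothetical assumptions, not the product's documented API surface:

```python
from urllib.parse import urlencode

def build_patient_selection_request(base_url: str, first: str, last: str,
                                    birth_date: str) -> str:
    """Construct a hypothetical (g)(7) patient-selection request URL.

    The path and query parameters are illustrative only; a real
    integration would follow the developer's published API documentation.
    """
    query = urlencode({"first": first, "last": last, "birthdate": birth_date})
    return f"{base_url}/api/patient/select?{query}"

# Example (hypothetical host and patient)
url = build_patient_selection_request("https://ehr.example.com", "Jane", "Doe", "2010-01-01")
```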
Care Settings:
Family Medicine
Testing Results:
Number of Clinics With Records: 2
Reporting Interval: 12/05/2022 – 12/31/2022
Number of Queries: 22
Analysis and Key Findings
Very few facilities are using this feature. We are hoping for better utilization with the (g)(10) rollout.
Non-Conformities or Errors Discovered
During our testing, we did not discover any errors or criteria non-conformities. We did not make any changes to this measure from our original test plan except for the reporting interval and care settings.
Relied Upon Software
None
RWT Measure #12. Application access — data category request
Associated Criteria: 315(g)(8)
Application Programming Interfaces: 170.315(g)(8) Application access — data category request
Testing Methodology: Manual
Measurement Description
Demonstration of a patient's ability to make a data request through the API for one or more data elements from the Common Clinical Data Set.
Care Settings:
Family Practice
Testing Results:
Number of Clinics With Records: 2
Reporting Interval: 07/04/2022 – 07/04/2022
Analysis and Key Findings
Very few facilities are using this feature. We are hoping for better utilization with the (g)(10) rollout.
Non-Conformities or Errors Discovered
During our testing, we did not discover any errors or criteria non-conformities. We did not make any changes to this measure from our original test plan except for the care settings
Relied Upon Software
None
RWT Measure #13. Application access – all data request
Associated Criteria: 315(g)(9)
Application Programming Interfaces: 170.315(g)(9) Application access— all data request
Testing Methodology: Reporting/Logging
Measurement Description
Demonstration of a patient’s ability to make a data request through API for one or more data elements from the Common Clinical Data Set.
Care Settings:
Family Medicine
Testing Results:
Number of Clinics With Records: 2
Reporting Interval: 12/05/2022 – 12/31/2022
Number of Queries: 1,739
Analysis and Key Findings
Very few facilities are using this feature; only two were found. We are hoping for better utilization with the (g)(10) rollout.
Non-Conformities or Errors Discovered
During our testing, we did not discover any errors or criteria non-conformities. We did not make any changes to this measure from our original test plan except for the reporting interval and care settings.
Relied Upon Software
None
RWT Measure #14. Direct Project
Associated Criteria: 315(h)(1)
Electronic Exchange: 170.315(h)(1) Direct Project
Testing Methodology: Reporting/Logging
Measurement Description
This use case tracks various measurements associated with Direct messaging successfully sent and received between the EHR Module and a 3rd party over the course of a given interval. We track the number of Direct messages sent, the unique destinations Direct messages were sent to, and how many Direct messages were received.
Care Settings:
Family Medicine, Surgery, Cardiology
Testing Results:
Number of Clinics With Records: 3
Reporting Interval: 02/01/2022 – 12/31/2022
Number of Messages: 236
Analysis and Key Findings
Our clients do not regularly share data through C-CDA files, so we have few records of Direct exchange C-CDAs. Some clinics use third-party stand-alone apps to send Direct messages.
Non-Conformities or Errors Discovered
During our testing, we did not discover any errors or criteria non-conformities. We did not
make any changes to this measure from our original test plan except for the reporting interval and care settings.
Relied Upon Software
Phimail