
EHR Usability Scores and Benchmarks: How the Top Systems Compare (2026)

Vendor demos look great. Day-to-day usability is what determines whether your clinicians thrive or burn out. This guide compiles KLAS ratings, SUS benchmarks, physician and nurse satisfaction data, and task-efficiency metrics so you can evaluate EHR usability with hard numbers instead of sales pitches.

By Maria Gray, LPN

Key Takeaways

  • Clinicians rate EHR usability at a SUS score of 45.9 — an "F" grade in the bottom 9% of all software ever tested. Vendor-reported scores average 75. The gap is massive.
  • Epic leads enterprise EHRs in KLAS satisfaction (8.5/9 tangible outcomes). Oracle Health satisfaction has dropped 10+ points since the 2022 Cerner acquisition.
  • The average physician NEES is just 23.4 on a -100 to +100 scale. Only 18% of physicians report a strong or elite EHR experience.
  • Nurses lose an average of 3+ hours per week to unproductive charting. Among those nurses, 46% report burnout and 34% intend to leave their organization.
  • Organizations that invest in usability optimization see measurable ROI: Mercy saved 32 minutes of charting per nurse per day; Wooster Community Hospital saved 15,000+ nursing hours annually.

At a glance: a 45.9 average clinician SUS score (an F grade), an average physician NEES of 23.4 (on a -100 to +100 scale), 43% of physicians reporting burnout (2024), and 13 hours per week spent on indirect patient care.

EHR Usability Rankings: The 2026 Scorecard

Vendor | KLAS Overall (1-9) | Est. SUS Score | Physician Satisfaction | Nurse Satisfaction
Epic Systems | 8.5 | 55-65 | High | Above Avg
athenahealth | 8.0 | 52-60 | High | N/A (ambulatory)
MEDITECH Expanse | 7.8 | 48-58 | Moderate | Moderate
NextGen Healthcare | 7.5 | 48-55 | Moderate | N/A (ambulatory)
eClinicalWorks | 7.0 | 42-52 | Mixed | N/A (ambulatory)
Oracle Health (Cerner) | 6.8 | 40-50 | Declining | Below Avg

Epic leads across nearly every usability metric for enterprise EHRs, earning the top KLAS score for the 15th consecutive year. athenahealth dominates the ambulatory segment, winning Best in KLAS for Independent Physician Practice two years running.

Oracle Health's scores have dropped more than 10 points in vendor loyalty and relationship since the 2022 Cerner acquisition, though recent app developments have led to "cautious optimism" among some current customers.

Methodology note: KLAS scores are based on validated provider interviews. Estimated SUS scores reflect aggregated research from published studies and Arch Collaborative data. Exact vendor-level SUS scores are not publicly reported by KLAS; ranges shown here synthesize multiple data sources. For the most current KLAS data, refer to klasresearch.com.

The SUS Benchmark Gap: Vendor Claims vs. Clinical Reality

Vendor-reported average SUS: 75 (controlled testing, 27 products). Clinician-reported average SUS: 45.9 (870 physicians, real-world use).

SUS Score Range | Letter Grade | Adjective Rating | Percentile | Where EHRs Fall
80.3+ | A | Excellent | Top 10% | No EHRs here
68.0-80.2 | B-C | Good / Average | 30th-80th | Vendor-reported scores
51.0-67.9 | D | Below Average | 15th-30th | Best real-world EHRs
25.0-50.9 | F | Not Acceptable | Bottom 15% | Most EHRs land here (avg 45.9)

The 29-point gap between vendor-reported and clinician-reported SUS scores is the defining usability story in healthcare IT. Vendor testing is conducted in controlled environments with simplified scenarios that do not resemble actual clinical workflows.

A JAMA Network Open study found that among 27 widely used EHR products, 67% met the average SUS benchmark in vendor testing. When actual physicians evaluated the same systems in real clinical settings, scores plummeted to the "not acceptable" range.

Why the gap matters for your selection process:

Never rely on vendor-supplied usability data. During your EHR evaluation, require hands-on clinician testing with real clinical scenarios from your practice. Have your own physicians and nurses complete standardized tasks and score the experience using the SUS questionnaire.
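
If you administer the SUS in-house, scoring is simple and worth automating so every vendor is graded the same way. The sketch below is an illustrative Python example, not an official tool: it applies the standard SUS scoring rules (ten items rated 1 to 5, odd items positively worded, even items negatively worded, total multiplied by 2.5) and maps the result to the letter-grade bands from the table above.

```python
def sus_score(responses: list[int]) -> float:
    """Convert ten 1-5 Likert responses into a 0-100 SUS score."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses, each between 1 and 5")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded (contribute r - 1);
        # even items are negatively worded (contribute 5 - r).
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

def sus_grade(score: float) -> str:
    """Map a SUS score to the letter-grade bands used in this article."""
    if score >= 80.3:
        return "A (Excellent)"
    if score >= 68.0:
        return "B-C (Good / Average)"
    if score >= 51.0:
        return "D (Below Average)"
    return "F (Not Acceptable)"

# Hypothetical evaluator: lands near the real-world EHR average.
score = sus_score([3, 4, 2, 3, 3, 3, 4, 3, 3, 3])
print(score, sus_grade(score))  # 47.5 F (Not Acceptable)
```

Collect one questionnaire per evaluator per vendor, then compare the averages; a vendor whose in-house testers can only reach the mid-40s will not magically improve after go-live.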

Task Efficiency Benchmarks: Clicks, Time, and Documentation Burden

Common Task | Industry Avg Time | Best-in-Class | Worst-in-Class | Avg Clicks
New patient intake note | 18-25 min | 8-12 min | 35+ min | 40-120
E-prescribing (single Rx) | 2-4 min | 30-60 sec | 5-8 min | 8-25
Lab order + results review | 3-6 min | 1-2 min | 8-12 min | 12-35
Progress note (follow-up) | 8-15 min | 3-5 min | 20+ min | 25-80
Inbox message triage | 1-3 min/msg | 30-45 sec | 5+ min | 5-15
Prior authorization initiation | 15-35 min | 5-10 min | 45+ min | 20-60+
Referral to specialist | 4-8 min | 1-3 min | 10-15 min | 10-30
Nursing infusion documentation | 10-20 min | 5-8 min | 30+ min | 30-80

Research shows that deep navigation hierarchies and non-intuitive menu labels double the clicks required for common tasks. In observed studies, wrong-field data entry occurred in 17% of tasks — a direct patient safety risk driven by poor interface design.

The 2024 physician workweek breakdown:

57.8 total hours per week: 27.2 hours of direct patient care, 13.0 hours of indirect care (EHR), and 7.3 hours of administrative tasks.

Physicians spend nearly as much time on EHR documentation and order entry (13 hours/week) as they do on administrative tasks and personal overhead combined. Systems with ambient AI documentation — like those from Epic, Nuance DAX, and NextGen — are cutting progress note time by 50% or more.

Usability by Specialty: One Size Does Not Fit All

Specialty | Highest-Rated EHR | Top Pain Points | Key Feature Gaps
Primary Care | Epic, athenahealth | Inbox overload, alert fatigue | AI triage, panel management
Cardiology | NextGen Healthcare | Device data integration, structured reporting | Cardiac imaging workflow, remote monitoring
Orthopedics | ModMed (specialty); Epic (enterprise) | Operative note templates, image annotation | Surgical outcome tracking, implant registries
Dermatology | ModMed | Photo documentation, body-map charting | AI lesion tracking, dermatoscopy integration
Behavioral Health | TherapyNotes, AZZLY Rize | Progress note formats, tx plan templates | ASAM integration, outcomes measurement
Ophthalmology | NextGen, ModMed | Diagnostic device integration, drawing tools | OCT/visual field data flow, surgical planning
Urgent Care | Experity, athenahealth | Speed of documentation, patient throughput | Occupational health workflows, rapid coding
Emergency Medicine | Epic (enterprise); T-System | Real-time tracking, rapid order entry | ED-specific dashboards, bed management

KLAS Arch Collaborative data shows that specialty satisfaction varies dramatically. Only 49% of cardiologists and 47% of orthopedic surgeons report that their EHR has the functionality they need, compared to higher satisfaction rates in primary care.

The takeaway: general-purpose EHRs require significant configuration to serve procedural and image-heavy specialties. For specialty practices evaluating systems, test workflows specific to your clinical scenarios. See our specialty-specific guides for cardiology, orthopedics, and dermatology.

EHR Usability and Burnout: The Hard Numbers

Factor | High-Usability EHR | Low-Usability EHR | Difference
Physician burnout rate | 30-35% | 55-60% | +25 pts
After-hours EHR use ("pajama time") | 30-45 min/day | 90-120 min/day | 2-3x more
Physician satisfaction with career | 80%+ | 55-65% | -20 pts
Intent to reduce clinical hours | 15-20% | 35-45% | 2x more
Documentation time per encounter | 5-8 min | 15-25 min | 2-3x more
Alert override rate (alert fatigue) | 40-50% | 85-95% | Safety risk

The Mayo Clinic Proceedings study linking SUS scores to burnout is unambiguous: physicians who rate their EHR poorly are significantly more likely to report burnout symptoms. JAMA Network Open research confirmed that EHR-based alerts had the lowest usability scores of any EHR subsystem, and approximately one-quarter of family physicians reported being "somewhat or very dissatisfied" with their EHR.

The retention math: Replacing a physician costs $500,000-$1,000,000 in recruitment, onboarding, and lost revenue. If low EHR usability drives even 2-3 physicians per year to leave or reduce hours, the cost far exceeds any EHR optimization investment. The KLAS Arch Collaborative confirmed in December 2025 that EHR experience directly drives — or damages — clinician retention.
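
To make that math concrete, here is a rough back-of-envelope sketch using the replacement-cost range cited above. The number of departures and the optimization budget are illustrative assumptions, not figures from any specific organization.

```python
# Replacement cost range cited in this section.
replacement_cost_low, replacement_cost_high = 500_000, 1_000_000

# Assumptions for illustration only.
departures_per_year = 2            # departures attributed to EHR frustration
optimization_investment = 400_000  # assumed mid-range optimization engagement

attrition_cost_low = departures_per_year * replacement_cost_low
attrition_cost_high = departures_per_year * replacement_cost_high
print(f"Attrition cost: ${attrition_cost_low:,}-${attrition_cost_high:,}/yr "
      f"vs. optimization spend of ${optimization_investment:,}")
# Even at the low end ($1,000,000/yr), preventing two departures more than
# covers the assumed optimization spend within the first year.
```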

The good news: physician burnout has improved from 53% in 2022 to 43.2% in 2024, and job satisfaction rose from 68% to 76.5% over the same period. AI-powered documentation tools and focused EHR optimization are contributing to this trend.

Nurse EHR Experience: The Documentation Crisis

92% of nurses say charting hurts job satisfaction, 40% intend to leave the profession by 2029, and the average nurse NEES is 47.3 (of roughly 113,000 surveyed).

A December 2025 KLAS report based on 80,147 acute care nurses found that reducing or streamlining documentation ranks as the number-one EHR enhancement request — cited twice as frequently as any other improvement. Nearly four in five nurses report losing time to unproductive charting each week.

Organization | Initiative | Outcome | NEES Improvement
Seattle Children's Hospital | 2-year documentation overhaul post-Epic go-live | Comprehensive charting redesign | +71.4 pts
Mercy (50 hospitals) | Project ANEW documentation optimization | 32 min/nurse/day saved | Significant
Wooster Community Hospital | Multidisciplinary task force; 150+ nurse ideas | 96 fields eliminated; 15,000+ hrs/yr saved | +20.9 pts

The common thread across successful organizations: executive sponsorship, multidisciplinary governance teams, and direct engagement of frontline nurses through task forces, feedback surveys, and superuser programs. This is not a technology problem alone — it is an operational commitment.

ONC Usability Certification Criteria: What Vendors Must Meet

Criterion | Description | Why It Matters | How to Test
Safety-Enhanced Design (SED) | User-centered design process following ISO standards | Reduces medication errors and wrong-patient orders | Ask for UCD process documentation and ISO compliance citation
Summative Usability Testing | Minimum 10 participants per capability tested | Validates that real users can complete tasks efficiently | Request full usability test reports with participant demographics
Insights Condition (HTI-1) | Transparent reporting on usability, security, interoperability | Public accountability for real-world performance | Check vendor's Insights Condition filings on HealthIT.gov
b(11) Criteria (Jan 2025) | Enhanced safety and usability standards for certified EHRs | New baseline for clinical decision support and alerts | Verify vendor's b(11) certification status and compliance date
DSI Transparency | Decision support interventions must disclose AI/ML methods | Clinicians need to understand how recommendations are generated | Request DSI source attribution and validation documentation
Real-World Testing (RWT) | Ongoing testing in production clinical environments | Bridges the gap between lab testing and actual clinical use | Ask for RWT results and how findings were addressed

The 2025 HTI-1 rule significantly strengthened usability requirements. Vendors must now follow a named industry-standard User-Centered Design process — such as ISO 9241-210 or NISTIR 7741 — and provide the citation during certification.

Despite these requirements, the gap between certification-level usability testing and real-world clinician experience persists. This is because vendor-conducted tests use simplified scenarios that do not reflect the cognitive load, interruption frequency, and time pressure of actual clinical practice. Use these criteria as a baseline during your vendor evaluation, but always supplement with your own hands-on testing.

Usability Improvement ROI: Where to Invest

Improvement Area | Typical Investment | Annual Savings | Payback Period
Ambient AI documentation | $200-$500/provider/mo | $50K-$150K/provider (time recaptured) | 2-6 months
Documentation field optimization | $50K-$150K (one-time project) | 15,000+ nursing hours/yr (Wooster data) | 3-6 months
Alert fatigue reduction program | $25K-$75K (governance + config) | Improved safety + 10-30 min/provider/day | 1-3 months
EHR training & onboarding program | $500-$2,000/provider | $33K/provider/yr in efficiency gains | 1-2 months
Online eLearning EHR training | $30K-$80K/yr (platform license) | $10K+ per 100 physicians (training cost reduction) | 4-8 months
Full EHR optimization engagement | $200K-$1M+ (12-18 mo project) | $500K-$2M+/yr (efficiency + retention) | 6-18 months

University of California data shows institutions saved up to $33,000 per provider per year after focused EHR optimization, primarily from administrative efficiencies and improved charge capture. Many practices recoup EHR optimization costs in under 2.5 years.
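
As a worked example of how the payback periods in the table are derived, the short sketch below divides a one-time, per-provider investment by monthly savings. It uses the training-program row and the University of California figure as inputs; it deliberately ignores ramp-up time and partial adoption, which is why real-world payback tends to land closer to the table's quoted 1-2 month range.

```python
def payback_months(upfront_per_provider: float, annual_savings_per_provider: float) -> float:
    """Months of savings needed to cover a one-time, per-provider investment."""
    return upfront_per_provider / (annual_savings_per_provider / 12)

# Training program: $500-$2,000 per provider vs. ~$33K/provider/yr in gains.
print(round(payback_months(2_000, 33_000), 1))  # ~0.7 months at the high end
print(round(payback_months(500, 33_000), 1))    # ~0.2 months at the low end
```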

Highest-ROI starting point: Alert fatigue reduction and documentation field elimination consistently show the fastest payback. Wooster Community Hospital reviewed 150+ nurse-submitted ideas and eliminated 96 documentation fields — a low-cost, high-impact effort. Start there before investing in new technology. See our EHR training best practices guide for structured optimization approaches.

How to Evaluate EHR Usability During Vendor Selection

Vendor demos are choreographed. Published scores have limitations. Here is a practical framework for evaluating usability with your own data.

Step 1: Define your top 5 clinical workflows

Identify the 5 tasks your clinicians perform most often (e.g., progress note, e-prescribe, inbox triage, lab review, referral). These become your usability test scenarios.

Step 2: Require hands-on sandbox access

Do not accept a vendor-led demo as your only evaluation. Require 2-4 weeks of sandbox access where your actual physicians and nurses can complete real tasks independently.

Step 3: Measure clicks and time

Have testers record the number of clicks and time required to complete each workflow. Compare against the benchmarks in this article. A system that requires 80+ clicks for a progress note is a red flag.
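
A simple way to keep tester measurements consistent is to log time and clicks per workflow and automatically flag anything that exceeds the benchmark ranges above. The sketch below is illustrative only; the task names and thresholds are assumptions drawn from the task-efficiency table, not an official scoring tool.

```python
# Upper bounds assumed from the task-efficiency table:
# (industry-average time in minutes, click red-flag threshold).
BENCHMARKS = {
    "progress_note_followup": (15, 80),
    "eprescribe_single_rx": (4, 25),
    "inbox_message_triage": (3, 15),
}

def flag_outliers(measurements: dict[str, tuple[float, int]]) -> list[str]:
    """Return tasks whose measured time or clicks exceed the benchmark bounds."""
    flags = []
    for task, (minutes, clicks) in measurements.items():
        max_minutes, max_clicks = BENCHMARKS[task]
        if minutes > max_minutes or clicks > max_clicks:
            flags.append(f"{task}: {minutes} min, {clicks} clicks")
    return flags

# Hypothetical sandbox results from one tester.
print(flag_outliers({
    "progress_note_followup": (22.0, 95),  # exceeds both bounds: red flag
    "eprescribe_single_rx": (2.5, 12),     # within range
}))
```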

Step 4: Administer the SUS questionnaire

After testing, have each evaluator complete the standardized 10-question System Usability Scale. This gives you a comparable, industry-standard score for each vendor.
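
To compare vendors on the same footing, aggregate each evaluator's SUS score by vendor and look at both the average and the spread; a high average with wide disagreement between physicians and nurses deserves a closer look. A small illustrative sketch (vendor names and scores are hypothetical):

```python
from statistics import mean, stdev

def summarize(scores_by_vendor: dict[str, list[float]]) -> None:
    """Print mean, spread, and sample size of evaluator SUS scores per vendor."""
    for vendor, scores in scores_by_vendor.items():
        spread = stdev(scores) if len(scores) > 1 else 0.0
        print(f"{vendor}: mean {mean(scores):.1f}, spread {spread:.1f}, n={len(scores)}")

summarize({
    "Vendor A": [62.5, 70.0, 57.5, 65.0],
    "Vendor B": [45.0, 52.5, 40.0, 47.5],
})
```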

Step 5: Check Arch Collaborative data

Request the vendor's KLAS Arch Collaborative NEES scores for your specialty and practice size. If they will not share, that is informative in itself. Cross-reference with the KLAS public reports.

For a complete vendor evaluation framework, including scoring rubrics and reference check questions, see our EHR Selection Process guide and EHR Demo Evaluation Guide.

Frequently Asked Questions

What is a good System Usability Scale (SUS) score for an EHR?

A SUS score of 68 is considered average across all software products, and 80 or above is considered above average. However, EHR systems score far below these benchmarks. Vendor-reported SUS scores average around 75, but when actual clinicians rate their EHRs, the average drops to 45.9 — an "F" grade that places EHRs in the bottom 9% of all software products ever evaluated. A "good" EHR SUS score in practice would be anything above 60, which would place it well above the clinician-reported average.

Which EHR system has the highest usability scores?

Epic Systems consistently earns the highest usability and satisfaction scores among enterprise EHR vendors, scoring approximately 8.5 out of 9 on KLAS tangible outcomes for the 15th consecutive year. For ambulatory practices, athenahealth scores highest for independent physician practices (Best in KLAS 2025). For specialty-specific use cases, vendors like ModMed (dermatology, orthopedics) and NextGen Healthcare (cardiology) often outperform general-purpose EHRs in their target specialties. See our complete vendor ranking for details.

How does EHR usability affect physician burnout?

EHR usability has a direct and measurable impact on physician burnout. Research published in Mayo Clinic Proceedings found that physicians who rated their EHR poorly on usability were significantly more likely to report burnout. As of 2024, 43.2% of physicians report burnout (down from 53% in 2022), but more than one-third still cite ineffective EHR systems as a primary contributor. Physicians spend an average of 13 hours per week on indirect patient care tasks like documentation and order entry. Each additional hour of after-hours EHR work correlates with increased burnout risk.

What is the KLAS Arch Collaborative Net EHR Experience Score (NEES)?

The Net EHR Experience Score (NEES) is a metric developed by the KLAS Arch Collaborative to measure clinician satisfaction with their EHR on a scale of -100 to +100. As of 2025, the average nurse NEES is 47.3 (based on 113,045 responses) and the average physician NEES is 23.4 (based on 53,037 responses). Only 22% of nurses and 18% of physicians report a strong or elite EHR experience. Organizations can improve their NEES through focused initiatives — some have achieved improvements of 20 to 70+ points through documentation optimization and clinician engagement programs.

What are the ONC usability requirements for EHR certification?

The ONC Health IT Certification Program requires EHR vendors to meet Safety-Enhanced Design criteria and conduct summative usability testing with a minimum of 10 participants per capability. Under the 2025 HTI-1 rule, vendors must follow an industry-standard user-centered design process (such as ISO 9241-210 or NISTIR 7741), conduct usability testing that reflects real clinical scenarios, and report results through the Insights Condition. Despite these requirements, a significant gap remains between vendor-reported usability scores and actual clinician experience — making independent evaluation during vendor selection essential.

The Bottom Line

EHR usability is not a soft metric — it directly drives clinician burnout, retention, patient safety, and organizational costs. The industry-wide SUS score of 45.9 is a failing grade that should be unacceptable. The good news is that usability is improvable, measurable, and increasingly a competitive differentiator among vendors.

When selecting an EHR, do not trust vendor-reported usability scores alone. Run your own hands-on evaluations with real clinical workflows, administer the SUS questionnaire, and reference KLAS Arch Collaborative data for your specialty. After implementation, invest in ongoing optimization: the organizations achieving the biggest NEES improvements share a common approach of executive sponsorship, frontline clinician engagement, and systematic documentation streamlining.

Next Steps