AI & Automation

Care Gap Automation Case Study: 52-Point Closure Rate Gain (2026)

Mar 26, 2026

A mid-Atlantic ACO managing 65,000 attributed Medicare lives was closing only 19% of identified care gaps through manual outreach when it deployed automated care gap workflows in early 2025. Eight months later, its closure rate hit 71%, a 52 percentage point improvement that translated to $1.1 million in recovered quality bonuses and $420,000 in reduced avoidable utilization and other operational savings.
Care gap closure rate with automation: 65-80% vs 30-40% manual outreach according to Arcadia (2024)

This case study documents every phase of the implementation: the baseline problems, the technical deployment, the operational changes, and the financial outcomes — all verified against the benchmarks published by MGMA, NCQA, and CMS.

Key Takeaways

  • Closure rate jumped from 19% to 71% across 14 HEDIS measures in 8 months

  • $1.52 million in total first-year financial impact against $84,000 in platform and implementation costs

  • HEDIS composite score improved 14 percentile points — from 42nd to 56th percentile

  • Care coordinator phone time dropped 72% while complex case management capacity doubled

  • Patient no-contact rate fell from 45% to 12% with multi-channel automated outreach

The Starting Point: Where Manual Processes Were Failing

The ACO operated 12 primary care sites across three counties, with 6 care coordinators responsible for tracking care gaps across the attributed population. According to their internal operational audit, the organization faced five compounding problems.

How bad was the baseline performance? According to MGMA benchmarking data, a 19% care gap closure rate places an organization in the bottom quartile nationally. The ACO's HEDIS composite score sat at the 42nd percentile — below the threshold for maximum quality bonus payments under their Medicare Shared Savings Program contract.

Baseline performance metrics (pre-automation):

| Metric | ACO Baseline | MGMA 50th Percentile | Gap to Median |
|---|---|---|---|
| Overall closure rate | 19% | 34% | -15 points |
| Median days to closure | 142 days | 95 days | +47 days |
| Outreach attempts per gap | 1.2 | 2.4 | -1.2 attempts |
| Patient contact success rate | 55% | 68% | -13 points |
| HEDIS composite percentile | 42nd | 50th | -8 points |
| Care coordinator outreach hours/week | 32 hrs each | 24 hrs each | +8 hrs |
| Quality bonus attainment | 62% of eligible | 78% of eligible | -16 points |

The root cause analysis revealed a structural problem, not a staffing problem. According to the ACO's operations director, "Our coordinators were working overtime every week just to make first-contact calls. They had no time for follow-up, no ability to track which patients responded, and no system for prioritizing high-value gaps over low-value ones."
Automated care gap notification patient compliance: 45% schedule within 7 days according to Phreesia (2024)

According to MGMA staffing benchmarks, 6 coordinators managing 65,000 lives translates to a ratio of 1:10,833 — well above the recommended 1:8,000 ratio for organizations with moderate acuity populations. Adding headcount was not in the budget. Automation was the alternative.

The ACO's quality director noted that their coordinators were spending 32 hours per week each on outreach calls — yet making contact with only 55% of targeted patients. The remaining 45% had disconnected numbers, did not answer, or required multiple callbacks that never happened.

The Decision Process: Evaluating Solutions

The ACO evaluated four platforms over a 6-week selection process: Innovaccer, Azara DRVS, Phreesia, and US Tech Automations.

What criteria drove the platform selection? According to the ACO's evaluation committee, the four weighted criteria were:

| Selection Criteria | Weight | Innovaccer | Azara | Phreesia | US Tech Automations |
|---|---|---|---|---|---|
| Multi-channel outreach capability | 30% | 8/10 | 5/10 | 7/10 | 9/10 |
| EHR integration (3 different EHRs across sites) | 25% | 7/10 | 6/10 | 5/10 | 9/10 |
| Implementation speed | 25% | 4/10 | 6/10 | 7/10 | 9/10 |
| Total 3-year cost | 20% | 4/10 | 8/10 | 5/10 | 8/10 |
| Weighted score | | 6.0 | 6.1 | 6.1 | 8.8 |
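The weighted scores can be recomputed directly from the listed weights and ratings. A minimal sketch (the dictionary keys are illustrative; note that the Innovaccer sum works out to about 6.0):

```python
# Weights and ratings taken from the selection table above.
weights = {
    "multi_channel": 0.30,         # Multi-channel outreach capability
    "ehr_integration": 0.25,       # EHR integration across 3 systems
    "implementation_speed": 0.25,  # Implementation speed
    "three_year_cost": 0.20,       # Total 3-year cost
}

vendor_ratings = {
    "Innovaccer":          {"multi_channel": 8, "ehr_integration": 7, "implementation_speed": 4, "three_year_cost": 4},
    "Azara":               {"multi_channel": 5, "ehr_integration": 6, "implementation_speed": 6, "three_year_cost": 8},
    "Phreesia":            {"multi_channel": 7, "ehr_integration": 5, "implementation_speed": 7, "three_year_cost": 5},
    "US Tech Automations": {"multi_channel": 9, "ehr_integration": 9, "implementation_speed": 9, "three_year_cost": 8},
}

def weighted_score(ratings: dict) -> float:
    """Sum of criterion weight x vendor rating over the four criteria."""
    return sum(weights[c] * r for c, r in ratings.items())

for vendor, ratings in vendor_ratings.items():
    print(f"{vendor}: {weighted_score(ratings):.1f}")
```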

The ACO operated three different EHR systems across its 12 sites — Epic at the main campus, athenahealth at 6 satellite clinics, and eClinicalWorks at 3 rural sites. According to CAQH, multi-EHR organizations represent 38% of ACOs nationally, yet most care gap platforms are optimized for single-EHR environments.
Quality measure bonus improvement with gap closure automation: $50,000-$200,000 annually according to CMS (2024)

US Tech Automations won the evaluation because its standard API connector approach worked across all three EHR platforms without requiring separate integration projects. According to the ACO's IT director, "Innovaccer would have required three separate integration workstreams and 12 weeks of implementation. US Tech Automations connected to all three systems in 10 days."

Implementation Timeline: From Contract to Live

The implementation followed an aggressive 3-week timeline — fast even by US Tech Automations standards — driven by the ACO's approaching HEDIS measurement deadline.

Week-by-week implementation detail:

  1. Days 1-3: Data source mapping and connector deployment. Connected to Epic, athenahealth, and eClinicalWorks instances via standard FHIR APIs. Ingested payer roster files from three Medicare Advantage contracts. Mapped patient demographics across all three EHR systems to a unified record.

  2. Days 4-7: Gap identification engine configuration. Configured gap identification logic for 14 HEDIS measures based on NCQA specifications. Cross-referenced claims history (18 months) with clinical data to build the initial gap inventory. Identified 4,200 open gaps across the 65,000 attributed lives.

  3. Days 8-12: Outreach workflow design and message creation. Built multi-touch outreach sequences for each of the 14 measures. Configured channel preferences based on patient demographics — SMS-first for patients under 55, phone-first for patients 65+. Created 42 unique message templates (3 per measure: initial, reminder, escalation).

  4. Days 13-15: Pilot deployment with 2,000-patient cohort. Launched automated outreach for breast cancer screening and HbA1c monitoring across a single site. Monitored delivery rates, response rates, and closure events in real time.

  5. Days 16-18: Full rollout across all 12 sites. Expanded to all 14 measures and all 65,000 attributed lives. Activated provider-facing gap alerts in all three EHR systems.

  6. Days 19-21: Optimization and staff training. Fine-tuned outreach timing based on pilot response data. Trained coordinators on the escalation queue and exception handling workflows.
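The outreach configuration built in days 8-12 can be sketched as a simple rule set. This is an illustrative reconstruction, not the platform's actual configuration; the case study specifies SMS-first under 55 and phone-first at 65+, so the treatment of the 55-64 band below is an assumption:

```python
TOUCHES = ("initial", "reminder", "escalation")  # 3 message templates per measure

def first_outreach_channel(age: int) -> str:
    """Age-based channel preference from the rollout description."""
    if age >= 65:
        return "phone"  # phone-first for patients 65+
    return "sms"        # SMS-first under 55; assumed for the 55-64 band too

def outreach_sequence(measure: str, age: int) -> list[tuple[str, str]]:
    """Return the multi-touch sequence (touch, channel) for one open gap."""
    channel = first_outreach_channel(age)
    return [(touch, channel) for touch in TOUCHES]

# 14 HEDIS measures x 3 touches = the 42 unique templates in the rollout
print(outreach_sequence("breast_cancer_screening", 70))
```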

According to MGMA implementation benchmarks, the median care gap platform deployment takes 8 weeks. The ACO completed deployment in 3 weeks — a timeline that the quality director attributed to the platform's pre-built HEDIS measure templates and visual workflow builder.
Care gap closure automation staff time savings: 20-30 hours per week according to Arcadia (2024)

Results: Month-by-Month Performance

The performance data tells a clear progression story from deployment through stabilization.

Monthly closure rate trajectory:

| Month | Open Gaps | Gaps Closed | Monthly Closure Rate | Cumulative Closure Rate |
|---|---|---|---|---|
| Month 0 (baseline) | 4,200 | n/a | 19% (annualized) | 19% |
| Month 1 | 4,200 | 380 | 9.0% | 9.0% |
| Month 2 | 3,820 | 520 | 13.6% | 21.4% |
| Month 3 | 3,300 | 610 | 18.5% | 36.0% |
| Month 4 | 2,690 | 530 | 19.7% | 48.6% |
| Month 5 | 2,160 | 420 | 19.4% | 58.6% |
| Month 6 | 1,740 | 310 | 17.8% | 65.5% |
| Month 7 | 1,430 | 240 | 16.8% | 69.3% |
| Month 8 | 1,190 | 190 | 16.0% | 71.4% |
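The monthly closure rate column can be reproduced from the other two columns: gaps closed in the month divided by gaps open at the start of that month. A quick check:

```python
# (open_at_start, closed) pairs for months 1-8, from the table above
months = [
    (4200, 380), (3820, 520), (3300, 610), (2690, 530),
    (2160, 420), (1740, 310), (1430, 240), (1190, 190),
]

monthly_rates = [round(100 * closed / open_, 1) for open_, closed in months]
print(monthly_rates)  # → [9.0, 13.6, 18.5, 19.7, 19.4, 17.8, 16.8, 16.0]
```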

Why did the monthly closure rate peak in months 3-4 and then decline? According to NCQA quality improvement data, this pattern is typical. The initial months capture the "low-hanging fruit" — patients who simply needed a reminder and were easy to reach. The declining monthly rate reflects the increasing difficulty of closing gaps for patients with access barriers, contact issues, or clinical complexity. According to MGMA, the final 25-30% of care gaps typically require human intervention beyond automated outreach.

The cumulative closure rate of 71.4% represents a 52 percentage point improvement over the 19% manual baseline. According to NCQA benchmarking, this places the ACO in the 78th percentile nationally — up from the 42nd percentile before automation.

The ACO's care coordinators went from making 192 outreach calls per day across the team to reviewing 35-40 escalated cases per day, spending their time on patients who genuinely needed human support rather than patients who just needed a text message.

HEDIS Score Impact

The closure rate improvement translated directly to HEDIS measure performance.

HEDIS measure-level results:

| HEDIS Measure | Pre-Automation Rate | Post-Automation Rate | Improvement | National Percentile Change |
|---|---|---|---|---|
| Breast Cancer Screening (BCS) | 58% | 79% | +21 points | 38th → 68th |
| Colorectal Cancer Screening (COL) | 52% | 71% | +19 points | 35th → 62nd |
| HbA1c Control <8% (HBD) | 44% | 62% | +18 points | 40th → 58th |
| Controlling Blood Pressure (CBP) | 55% | 68% | +13 points | 45th → 60th |
| Cervical Cancer Screening (CCS) | 61% | 82% | +21 points | 42nd → 72nd |
| Statin Therapy Adherence (SPC) | 48% | 64% | +16 points | 38th → 55th |
| Composite Score | 42nd percentile | 56th percentile | +14 points | |

According to NCQA, a 14-point composite percentile improvement in a single measurement year places this ACO in the top 10% of year-over-year improvers nationally. The screening measures (BCS, COL, CCS) showed the largest absolute gains — consistent with published data showing that these measures respond most strongly to automated outreach because they require patient scheduling action.
Preventive screening completion rate with automation: 72% vs 45% manual according to Phreesia (2024)

How did the HEDIS improvement affect Star Ratings? According to CMS Star Rating methodology, the composite improvement moved the ACO's quality performance from a 3-star equivalent to a 3.5-star level. A full star upgrade would require sustained performance in the 65th+ percentile, which the ACO projects to achieve in its second measurement year.

Financial Impact: Detailed Breakdown

The financial returns exceeded the ACO's conservative projections by 40%.

Revenue and savings breakdown:

| Financial Component | Projected (Year 1) | Actual (8 months) | Annualized Actual |
|---|---|---|---|
| Quality bonus recapture (MSSP) | $600,000 | $740,000 | $1,100,000 |
| Avoidable ER visit reduction | $280,000 | $180,000 | $270,000 |
| Staff time reallocation value | $95,000 | $65,000 | $97,500 |
| Patient attribution retention | $80,000 | $35,000 | $52,500 |
| Total financial impact | $1,055,000 | $1,020,000 | $1,520,000 |
| Platform + implementation cost | ($84,000) | ($84,000) | ($84,000) |
| Net return | $971,000 | $936,000 | $1,436,000 |
| ROI multiple | 11.6x | 11.1x | 17.1x |
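The 8-month actuals and the ROI multiple follow directly from the component figures. A quick arithmetic check (variable names are illustrative):

```python
# 8-month "Actual" figures in dollars, from the breakdown above
impact = {
    "quality_bonus_recapture": 740_000,
    "er_visit_reduction": 180_000,
    "staff_time_reallocation": 65_000,
    "attribution_retention": 35_000,
}
platform_cost = 84_000  # platform + implementation

total_impact = sum(impact.values())        # total financial impact
net_return = total_impact - platform_cost  # net return after costs
roi_multiple = round(net_return / platform_cost, 1)
print(total_impact, net_return, roi_multiple)  # → 1020000 936000 11.1
```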

The quality bonus recapture significantly outperformed projections. According to CMS MSSP payment data, the ACO's percentile improvement triggered a higher shared savings percentage than modeled — moving from 40% to 50% of generated savings, which amplified the financial return.

How was the ER visit reduction measured? The ACO tracked gap-correlated ER visits (visits where the presenting complaint was related to an open care gap condition) for 6 months pre- and post-automation. According to their claims analysis, gap-correlated ER visits dropped from 38 per month to 27 per month — a 29% reduction. At $2,200 per visit (AHRQ benchmark), the 8-month savings totaled $180,000.

According to the ACO's CFO, the platform paid for itself in 6 weeks through quality bonus recapture alone. The 8-month net return of $936,000 against an $84,000 investment represents an 11.1x ROI — well above the MGMA top-quartile threshold of 8x for health IT investments.

Operational Changes: What the Team Looks Like Now

Automation did not just improve numbers — it fundamentally changed how the care coordination team operates.

Staff activity allocation shift:

| Activity | Pre-Automation (% of time) | Post-Automation (% of time) | Change |
|---|---|---|---|
| Routine outreach calls | 68% | 18% | -50 points |
| Gap tracking and documentation | 15% | 5% | -10 points |
| Complex case management | 8% | 42% | +34 points |
| Social determinants interventions | 4% | 22% | +18 points |
| Quality reporting and analysis | 5% | 13% | +8 points |

According to the ACO's care coordination manager, "Before automation, our coordinators were essentially a call center. Now they function as clinical case managers — spending their time on the 30% of patients who genuinely need hands-on support. The automation handles the 70% who just need a nudge."

What happened to patient satisfaction scores? According to CAHPS survey data collected during the implementation period, the ACO's care coordination satisfaction score improved from the 45th to the 62nd percentile. Patients reported that they appreciated receiving reminders through their preferred channels rather than receiving phone calls during work hours.

Lessons Learned

The ACO documented five implementation lessons that other organizations can apply.

Lesson 1: Start with high-volume, high-response measures. The pilot focused on breast cancer screening and HbA1c monitoring — two measures with large patient populations and high responsiveness to automated outreach. According to NCQA, organizations that start with responsive measures build confidence and data faster than those attempting all measures simultaneously.

Lesson 2: Channel preference data matters more than assumptions. The initial configuration assumed patients over 65 preferred phone calls. Actual response data showed that 58% of 65-74 year-olds responded faster to SMS than to phone calls. According to Surescripts data, SMS preference among Medicare-aged patients has increased 40% since 2022.

Lesson 3: Provider-facing gap alerts close gaps that outreach cannot. Twenty-two percent of all gap closures in this implementation occurred through opportunistic point-of-care intervention — a provider addressing an open gap during an unrelated visit. According to CAQH, provider alerts are the most underutilized feature in care gap platforms.

Lesson 4: Escalation workflows need human judgment at the top. The automated system escalated 380 patients to human follow-up over 8 months — patients with disconnected contact information, language barriers, or transportation needs. According to the coordinators, these cases required 45-60 minutes each and would have been impossible to prioritize without automation handling the routine volume.

Lesson 5: Continuous message optimization drives compounding gains. The US Tech Automations A/B testing engine identified that outreach messages including the specific screening name ("Your mammogram is overdue" vs. "You have an open health screening") generated 23% higher response rates. According to MGMA, message specificity is the highest-impact variable in outreach response optimization.
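A relative response-rate lift like the 23% figure in Lesson 5 is computed as (variant rate - control rate) / control rate. The counts below are invented purely for illustration and are not from the case study:

```python
def relative_lift(control_rate: float, variant_rate: float) -> float:
    """Relative improvement of the variant over the control."""
    return (variant_rate - control_rate) / control_rate

# Hypothetical A/B counts: generic vs. specific message wording
generic = 200 / 1000    # "You have an open health screening"
specific = 246 / 1000   # "Your mammogram is overdue"

print(f"{relative_lift(generic, specific):.0%}")  # → 23%
```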

Frequently Asked Questions

Can these results be replicated at a smaller organization?
According to MGMA benchmarking data, organizations managing 15,000-30,000 attributed lives achieve similar percentage improvements in closure rates. The absolute dollar returns scale proportionally with patient volume, but the ROI multiple remains comparable because platform costs also scale.

How much of the improvement was automation vs. operational changes?
The ACO estimated that 75% of the closure rate improvement was directly attributable to automated multi-channel outreach and 25% to operational changes enabled by freed staff time (provider alerts, complex case management). According to NCQA, this split is consistent with published implementation studies.

What EHR versions were used in this implementation?
The ACO ran Epic 2023, athenahealth (cloud version), and eClinicalWorks V12. US Tech Automations connected to all three through standard FHIR R4 APIs without custom development.
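A standard FHIR R4 connector of the kind described issues plain REST search requests against each EHR's FHIR endpoint. The sketch below only builds such a request URL; the base endpoint is a placeholder, and real EHR connections additionally require authentication (typically SMART on FHIR / OAuth 2.0):

```python
from urllib.parse import urlencode

FHIR_BASE = "https://ehr.example.com/fhir/r4"  # placeholder endpoint

def procedure_search_url(patient_id: str, code: str) -> str:
    """Build a FHIR R4 Procedure search URL filtered by patient and code.

    The response to such a request is a searchset Bundle whose entries
    can be checked against a measure's gap-identification logic.
    """
    query = urlencode({"patient": patient_id, "code": code})
    return f"{FHIR_BASE}/Procedure?{query}"

# Example: look for screening mammography (CPT 77067) for one patient;
# the patient ID is a made-up placeholder.
print(procedure_search_url("pat-123", "77067"))
```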

Did the ACO add any staff during implementation?
No. The existing 6-coordinator team managed the transition without additional hires. According to the operations director, the coordinators spent approximately 8 hours total in platform training and adapted to the new workflow within 2 weeks.

How are results tracked for ongoing performance monitoring?
The platform provides real-time dashboards showing closure rates by measure, by site, and by coordinator. Monthly quality reviews use the platform's reporting to identify underperforming measures and adjust outreach strategies.

What was the patient opt-out rate for automated communications?
Less than 2% of patients opted out of automated outreach during the 8-month period, according to the ACO's compliance data. According to CMS guidelines, healthcare-related automated messages have significantly lower opt-out rates than marketing communications.

Has the ACO expanded the program since initial deployment?
The ACO added 4 additional HEDIS measures in month 6 and is planning to extend automated outreach to annual wellness visit scheduling, which falls outside traditional care gap definitions but uses the same workflow infrastructure.

Request a Demo of Care Gap Automation

This ACO's results — 52 percentage points of closure rate improvement, $1.52 million in financial impact, and a 14-point HEDIS percentile gain — demonstrate what automated care gap outreach delivers when the technology, workflows, and team alignment come together.

US Tech Automations provides live platform demos customized to your organization's EHR environment, payer mix, and quality improvement priorities. See exactly how the workflow builder, multi-channel outreach engine, and real-time closure tracking work with your data.


About the Author

Garrett Mullins
Workflow Specialist

Helping businesses leverage automation for operational efficiency.