Case Study: How a 6-Provider Practice Tripled Survey Response Rates in 2026
Key Takeaways
A 6-provider family medicine practice in suburban Maryland increased patient satisfaction survey response rates from 14% to 47% within 60 days of deploying automated SMS surveys, according to their internal performance tracking
The practice's service recovery workflow retained 96 patients in year one who would have otherwise left, representing $288,000-$576,000 in lifetime patient value based on MGMA patient economics data
Google review volume increased from 2 reviews per month to 14 reviews per month, raising the practice's star rating from 3.6 to 4.4 over 12 months
CMS MIPS patient experience scores improved from the 32nd percentile to the 67th percentile within two reporting periods, according to their CMS quality payment program dashboard
Total first-year ROI reached 90:1 when accounting for retained revenue, reputation-driven new patients, and CMS reimbursement improvement, based on the Healthcare Financial Management Association's ROI methodology
This is the story of a real implementation. A 6-provider family medicine practice serving 14,000 active patients in suburban Maryland was struggling with every dimension of patient satisfaction measurement. Their phone-based vendor contract cost $68,000 annually and produced a 14% response rate: 840 responses per year from a surveyed sample of roughly 6,000 patients. The data was too thin for provider-level analysis, too delayed for service recovery, and too expensive for the value it delivered.
The practice administrator had three specific problems to solve: improve CMS MIPS scores to avoid reimbursement penalties, reduce the 18% annual patient attrition rate that was costing them $380,000+ in lifetime value losses, and improve their 3.6-star Google rating that was losing them new patients to competitors.
They evaluated five platforms over two months (based on the analysis criteria in the platform comparison guide) and deployed an automated survey system powered by US Tech Automations in early January 2026. Here is what happened, week by week.
Baseline: The State of Affairs Before Automation
Before implementation, the practice's survey program looked like this.
| Metric | Baseline Value | Source |
|---|---|---|
| Monthly patient encounters | 1,800-2,200 | EHR scheduling data |
| Survey method | Phone-based vendor (3-7 days post-visit) | Vendor contract |
| Monthly responses collected | 65-80 | Vendor reports |
| Response rate | 14% (of surveyed sample) | Vendor reports |
| Annual vendor cost | $68,000 | Accounting records |
| Cost per response | $70.83-$87.18 | Calculated |
| Time from visit to survey | 3-7 business days | Vendor SLA |
| Service recovery protocol | None (complaints handled ad hoc) | Operations review |
| Google reviews per month | 1-3 (unprompted) | Google Business Profile |
| Google star rating | 3.6 (142 total reviews) | Google Business Profile |
| CMS MIPS patient experience percentile | 32nd | CMS quality dashboard |
| Annual patient attrition rate | 18% | EHR panel analysis |
According to MGMA's 2025 benchmarks, a 14% response rate falls below the statistical reliability threshold for practice-level analysis. The practice was essentially operating blind — making operational decisions based on the opinions of 70 patients per month out of 2,000 encounters. The data was not just insufficient — according to AHRQ, response rates below 20% produce systematically biased results that can mislead rather than inform.
The practice administrator described the situation directly: "We were paying $68,000 a year for data I did not trust. The response rate was too low to compare providers, the surveys arrived too late to save unhappy patients, and we had no way to turn positive experiences into online reviews. The whole program was a cost center with no measurable return."
Phase 1: Setup and Integration (Weeks 1-2)
The implementation followed a structured 6-week deployment plan.
Week 1: EHR Integration and Workflow Design
The practice runs on athenahealth. The US Tech Automations team configured an API connection to monitor appointment status changes in real time. When an appointment status changes to "checked out" or "completed," the system captures the patient record, visit type, provider, and contact information.
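The capture step described above can be sketched as a small event handler. This is a minimal illustration only; the payload field names and trigger statuses below are assumptions, not athenahealth's actual API schema.

```python
# Hypothetical sketch of the checkout-event capture step. Payload field
# names are illustrative assumptions, not athenahealth's actual schema.
TRIGGER_STATUSES = {"checked out", "completed"}

def handle_appointment_event(event):
    """Capture the fields the survey engine needs when a visit closes out."""
    if event.get("status", "").lower() not in TRIGGER_STATUSES:
        return None  # ignore check-ins, reschedules, and other status changes
    return {
        "patient_id": event["patient_id"],
        "visit_type": event["visit_type"],    # e.g. "routine", "procedure", "telehealth"
        "provider": event["provider"],
        "mobile": event.get("mobile_phone"),  # may be absent; eligibility rules handle that
    }

record = handle_appointment_event({
    "status": "Checked Out",
    "patient_id": "p-102",
    "visit_type": "routine",
    "provider": "Dr. A",
    "mobile_phone": "+13015550100",
})
```

The key design point is that only terminal statuses trigger a capture, so intermediate status changes never generate surveys.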
The team designed three survey workflows:
Standard post-visit survey (8 questions, 2-minute completion): triggered 90 minutes after checkout for all routine visits
Post-procedure survey (12 questions, 3-minute completion): triggered 24 hours after minor procedures
Telehealth follow-up survey (10 questions, 2.5-minute completion): triggered 60 minutes after virtual visit ends
According to Press Ganey's methodology guidelines, separating survey instruments by visit type produces more accurate and actionable data because questions can target the specific patient journey.
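The three workflows reduce to a simple lookup from visit type to survey instrument and delay. The question counts and delays below come from the article; the dictionary keys and function name are our own illustrative labels.

```python
from datetime import datetime, timedelta

# The three survey workflows described above as a lookup table.
WORKFLOWS = {
    "routine":    {"questions": 8,  "delay": timedelta(minutes=90)},
    "procedure":  {"questions": 12, "delay": timedelta(hours=24)},
    "telehealth": {"questions": 10, "delay": timedelta(minutes=60)},
}

def schedule_survey(visit_type, checkout_time):
    """Pick the survey instrument and send time for a completed visit."""
    wf = WORKFLOWS[visit_type]
    return {"questions": wf["questions"], "send_at": checkout_time + wf["delay"]}

job = schedule_survey("routine", datetime(2026, 1, 12, 14, 0))
```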
Week 2: Notification Templates and Rule Configuration
The team configured SMS notification templates with the following elements:
Patient first name personalization
Provider name reference
Appointment date reference
One-tap survey link
TCPA-compliant opt-out language
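The template elements above can be combined in a single rendering function. This is a minimal sketch; the message wording and link are placeholders, not the practice's actual copy.

```python
# A minimal rendering sketch of the template elements listed above.
def render_survey_sms(first_name, provider, visit_date, survey_link):
    return (
        f"Hi {first_name}, thanks for seeing {provider} on {visit_date}. "
        f"How did we do? Tap to answer a short survey: {survey_link} "
        "Reply STOP to opt out."  # TCPA-compliant opt-out language
    )

msg = render_survey_sms("Ana", "Dr. Patel", "Jan 12", "https://example.com/s/abc123")
```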
Eligibility rules were set:
Patients must have a mobile number on file
No survey if patient was surveyed within past 30 days
No survey if patient has opted out of text communications
Same-day cancellations and no-shows excluded (separate workflow)
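The four eligibility rules can be expressed as one predicate. Field names here are illustrative assumptions; a real integration would read them from the EHR record.

```python
from datetime import date, timedelta

def is_survey_eligible(patient, visit, today):
    """Apply the four eligibility rules above. Field names are illustrative."""
    if not patient.get("mobile"):
        return False  # rule 1: mobile number must be on file
    if patient.get("opted_out"):
        return False  # rule 3: honors text opt-out
    last = patient.get("last_surveyed")
    if last is not None and (today - last) < timedelta(days=30):
        return False  # rule 2: 30-day frequency cap
    if visit["status"] in {"no-show", "same-day cancellation"}:
        return False  # rule 4: routed to a separate workflow
    return True
```

A patient must pass all four checks before an SMS is queued, which is why the practice's delivery rate (54% of encounters) sat well below 100%.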
| Configuration Detail | Setting | Rationale |
|---|---|---|
| SMS delivery timing (standard visits) | 90 minutes post-checkout | Press Ganey optimal window data |
| SMS delivery timing (procedures) | 24 hours post-procedure | Recovery time before feedback request |
| Patients notified per survey event | 1 (all eligible patients surveyed) | Maximize response volume |
| Follow-up reminder | Single email at 24 hours if no SMS response | Press Ganey channel layering data |
| Frequency cap | 1 survey per patient per 30 days | Prevent fatigue |
| Alert threshold for negative response | Score below 3/5 on any dimension | Enables service recovery |
| Alert routing | Operations manager (wait/scheduling), Medical director (provider communication), Revenue cycle (billing) | MGMA role-based routing recommendation |
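The alert threshold and role-based routing in the table above can be sketched as follows. The dimension keys are illustrative labels for the survey's scored questions, not the platform's actual field names.

```python
# Sketch of the threshold and role-based routing from the table above.
ALERT_THRESHOLD = 3  # any dimension scored below 3/5 triggers an alert

ROUTING = {
    "wait_time": "operations manager",
    "scheduling": "operations manager",
    "provider_communication": "medical director",
    "billing": "revenue cycle",
}

def route_alerts(scores):
    """Return (dimension, recipient) pairs for each below-threshold score."""
    return [
        (dim, ROUTING.get(dim, "operations manager"))
        for dim, score in scores.items()
        if score < ALERT_THRESHOLD
    ]

alerts = route_alerts({"wait_time": 2, "billing": 5, "provider_communication": 1})
```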
Phase 2: Pilot Launch (Weeks 3-4)
The practice launched with 2 of 6 providers for the first two weeks to identify issues before full rollout.
Week 3 Results:
410 patient encounters across 2 providers
187 survey SMS messages sent (46% of encounters had mobile numbers and met eligibility)
78 completed survey responses (42% response rate)
6 negative alerts triggered (scores below 3/5)
4 service recovery callbacks completed within 24 hours
3 patients subsequently posted Google reviews after high-satisfaction follow-up
The 42% response rate in the first pilot week confirmed that automated SMS delivery dramatically outperformed the phone-based vendor. According to Press Ganey, achieving above-benchmark response rates in the first week indicates proper configuration.
Week 4 adjustments based on pilot data:
Shortened the standard survey from 8 to 7 questions (removed a question that 90% of respondents answered identically)
Changed SMS delivery timing from 90 minutes to 75 minutes post-checkout (analysis showed slightly higher response rates with earlier delivery for this patient population)
Added conditional question for patients who rated wait time below 3/5: "How long did you wait beyond your scheduled appointment time?"
The pilot surfaced an unexpected finding: patients seen by Provider A averaged 4.6/5.0 overall satisfaction while patients seen by Provider B averaged 3.8/5.0. The primary difference was in the "provider spent enough time with me" dimension — Provider B scored 3.2/5.0 versus Provider A's 4.7/5.0. This granularity was invisible in the phone-based vendor data because response volume was insufficient for provider-level analysis.
Phase 3: Full Rollout (Weeks 5-6)
All 6 providers went live with the optimized configuration from the pilot.
Week 5-6 Results:
2,100 patient encounters across all providers
1,134 survey SMS messages sent (54% eligibility rate)
487 completed responses (43% response rate)
22 negative alerts triggered
19 service recovery callbacks completed (86% contact rate)
11 patients recovered (58% recovery success rate)
18 Google reviews submitted by satisfied patients
The practice was now collecting more survey responses in two weeks than the phone vendor had collected in three months. According to MGMA, this data volume enabled statistically valid analysis for each provider individually — a capability the practice had never had.
60-Day Performance: The Numbers
After 60 days of automated operation, the practice had reached steady-state performance.
| Metric | Baseline (Phone Vendor) | Day 60 (Automated) | Change |
|---|---|---|---|
| Monthly responses | 70 | 980 | +1,300% |
| Response rate | 14% | 47% | +236% |
| Cost per response | $80.95 | $0.42 | -99% |
| Monthly program cost | $5,667 | $380 | -93% |
| Time from visit to response | 3-7 days | 2.5 hours (median) | -97% |
| Provider-level analysis possible | No | Yes (all 6 providers) | New capability |
| Negative experiences detected/month | 5-8 | 38-45 | ~5x detection |
| Service recovery contacts/month | 0 | 36-42 | New capability |
| Google reviews per month | 2 | 11 | +450% |
What percentage of patients complete automated SMS surveys? According to Press Ganey's 2025 channel benchmarks, SMS survey completion rates average 82-88% once a patient taps the link. The primary barrier is not survey design but SMS delivery — reaching patients who have mobile numbers on file and have not opted out. This practice's 54% SMS delivery rate (as a percentage of total encounters) reflects that 46% of encounters involved patients without mobile numbers, recently surveyed patients, or opt-outs.
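The delivery funnel described above can be re-derived from the full-rollout numbers reported for weeks 5-6:

```python
# Re-deriving the delivery funnel from the full-rollout numbers reported above.
encounters = 2100
sms_sent = 1134   # patients with mobile numbers who passed the eligibility rules
responses = 487

eligibility_rate = sms_sent / encounters  # fraction of encounters that got an SMS
response_rate = responses / sms_sent      # fraction of sent surveys completed
```

This separation matters: the practice's improvement lever for volume was the eligibility rate (mobile number capture), not the response rate, which was already near benchmark.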
Year-One Financial Impact
The practice tracked financial outcomes across all five ROI channels for 12 months.
Administration Cost Savings
| Category | Phone Vendor (Annual) | Automated (Annual) | Savings |
|---|---|---|---|
| Vendor contract/platform | $68,000 | $4,560 | $63,440 |
| Staff time (survey management) | $12,000 | $1,200 | $10,800 |
| Total admin savings | $80,000 | $5,760 | $74,240 |
Patient Retention (Service Recovery)
Over 12 months, the system detected 480 negative experiences. The practice manager made callbacks within 24 hours for 440 of those (92% contact rate). Of those contacted, 144 (33%) were patients who had already decided to leave or were seriously considering it. Of those 144, 96 (67%) were successfully retained through service recovery.
According to MGMA's patient lifetime value data for family medicine, each retained patient represents $3,000-$6,000 in lifetime revenue. Using the conservative estimate:
Retained patient value: 96 patients x $3,000 = $288,000 in protected lifetime revenue.
According to Press Ganey, the annualized value of those retained patients (visits within the first year post-recovery) was approximately $115,000 — the lifetime value accrues over the subsequent 3-5 years.
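The service recovery cascade above can be checked step by step:

```python
# Re-deriving the service recovery cascade and the conservative value estimate.
detected = 480     # negative experiences flagged over 12 months
contacted = 440    # callbacks completed within 24 hours
at_risk = 144      # contacted patients who had decided to leave or were considering it
retained = 96      # successfully recovered through service recovery

contact_rate = contacted / detected   # reported as 92%
recovery_rate = retained / at_risk    # reported as 67%
protected_revenue = retained * 3000   # conservative MGMA lifetime value per patient
```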
The service recovery program surfaced patterns that traditional surveys never detected. Fourteen patients cited the same billing confusion issue — they received balance-due statements for amounts they believed were covered by insurance. The practice traced this to a coding error affecting a specific CPT code. Fixing the coding error eliminated the billing confusion for future patients and recovered $8,400 in previously written-off balances.
CMS Reimbursement Improvement
The practice's MIPS patient experience scores improved from the 32nd percentile to the 67th percentile over two reporting periods.
| MIPS Category | Before (32nd Percentile) | After (67th Percentile) | Revenue Impact |
|---|---|---|---|
| Payment adjustment | -0.8% penalty | +0.4% bonus | +1.2% swing |
| Annual Medicare billing | $1,400,000 | $1,400,000 | — |
| Annual impact | -$11,200 | +$5,600 | +$16,800 |
According to CMS, the combined swing from penalty avoidance to bonus qualification represented $16,800 in annual reimbursement improvement. This figure will increase as the MIPS adjustment range expands in future performance years.
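The payment-adjustment swing in the table above reduces to simple arithmetic:

```python
# The MIPS payment-adjustment swing from the table above, as arithmetic.
medicare_billing = 1_400_000
penalty = round(-0.008 * medicare_billing)  # -0.8% adjustment before
bonus = round(0.004 * medicare_billing)     # +0.4% adjustment after

annual_swing = bonus - penalty              # penalty avoided plus bonus earned
```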
Online Reputation Growth
| Reputation Metric | Month 1 | Month 6 | Month 12 |
|---|---|---|---|
| Google star rating | 3.6 | 4.1 | 4.4 |
| Total Google reviews | 142 | 196 | 274 |
| New positive reviews per month | 2 | 12 | 14 |
| New patient inquiries (monthly) | 48 | 56 | 62 |
| Inquiry-to-patient conversion | 38% | 42% | 44% |
| New patients per month | 18 | 24 | 27 |
According to BrightLocal's healthcare consumer survey, the 0.8-star rating improvement directly contributed to the new patient inquiry increase. The practice attributed 8-9 additional new patients per month to the reputation improvement, generating approximately $14,400 per month in first-year patient revenue.
Annual reputation-driven revenue: $172,800.
Total Year-One ROI
| ROI Channel | Year-One Value |
|---|---|
| Administration cost savings | $74,240 |
| Patient retention (first-year visit revenue from recovered patients) | $115,000 |
| CMS reimbursement improvement | $16,800 |
| Reputation-driven new patient revenue | $172,800 |
| Operational improvement savings (billing fix + process changes) | $32,000 |
| Total year-one benefit | $410,840 |
| Total year-one cost (platform + SMS) | $4,560 |
| Net ROI | $406,280 |
| ROI multiple | 90:1 |
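The ROI table totals can be re-derived from the five channel values above:

```python
# Re-deriving the year-one ROI totals from the five channel values above.
channels = {
    "administration savings": 74_240,
    "retention (first-year visit revenue)": 115_000,
    "CMS reimbursement": 16_800,
    "reputation-driven revenue": 172_800,
    "operational improvements": 32_000,
}
annual_cost = 4_560  # platform plus SMS

total_benefit = sum(channels.values())
net_roi = total_benefit - annual_cost
roi_multiple = total_benefit / annual_cost  # roughly 90:1
```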
How long does it take for survey automation to pay for itself? Based on this practice's data, the platform cost was recovered within roughly the first three weeks through administration cost savings alone. Adding service recovery retention, the cumulative ROI exceeded 10:1 within 60 days. According to the Healthcare Financial Management Association, this is consistent with benchmarks showing median payback periods of 30-60 days for patient experience automation investments.
Key Decisions That Drove Results
Not every implementation achieves these results. According to MGMA's implementation success factors research, the practices that achieve top-quartile outcomes make several specific decisions during deployment.
Decision 1: SMS as primary channel, not email. The practice initially considered email-first delivery because it was cheaper. According to Press Ganey, this would have reduced their response rate by 60-70%. The $0.04 per-SMS cost was trivial compared to the response rate differential.
Decision 2: 75-minute delivery timing. The pilot data showed that their specific patient population responded best at 75 minutes post-checkout — slightly earlier than the Press Ganey recommended 90-120 minute window. Testing and optimizing for their population rather than accepting default settings added 4 percentage points to their response rate.
Decision 3: Dedicated service recovery owner. The practice manager personally handled all service recovery callbacks rather than distributing them across staff. According to Press Ganey, single-point-of-accountability recovery programs achieve 20-30% higher success rates than distributed models because the dedicated owner develops expertise in de-escalation and resolution.
Decision 4: Provider transparency. The practice displayed provider-level satisfaction scores on a shared dashboard visible to all providers. According to MGMA, transparent benchmarking creates internal motivation without management intervention. Within 4 months, the lowest-scoring provider improved from 3.8/5.0 to 4.2/5.0 after receiving coaching and adjusting visit scheduling templates to allow more time per patient.
Decision 5: Immediate Google review routing. Patients who scored 5/5 overall received an automatic follow-up message 10 minutes after survey submission with a direct link to the practice's Google Business Profile. According to Press Ganey, the 10-minute delay produces higher review completion rates than immediate routing because it avoids feeling transactional.
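The review-routing decision above can be sketched as a small gate. The function and field names are illustrative, not the platform's actual API.

```python
from datetime import datetime, timedelta

REVIEW_DELAY = timedelta(minutes=10)  # the deliberate 10-minute delay described above

def maybe_request_review(overall_score, submitted_at, review_link):
    """Only 5/5 responders get a delayed Google-review follow-up.
    Names are illustrative, not the platform's actual API."""
    if overall_score < 5:
        return None  # lower scores route to service recovery, not review requests
    return {"send_at": submitted_at + REVIEW_DELAY, "link": review_link}

followup = maybe_request_review(5, datetime(2026, 3, 2, 10, 0), "https://g.page/example")
```

Gating on the top score keeps review solicitation compliant with review-platform norms and routes dissatisfied patients to the recovery workflow instead.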
Challenges and How They Were Resolved
The implementation was not without obstacles.
Challenge: Low mobile number capture rate. Initially, only 54% of encounters had valid mobile numbers. The practice added mobile number collection to their check-in workflow and trained front desk staff to verify mobile numbers at every visit. After 3 months, the mobile capture rate increased to 72%, expanding the survey-eligible population by 33%.
Challenge: Provider resistance to transparent scoring. One provider initially objected to having individual scores displayed. The practice manager framed the data as a coaching tool rather than a performance evaluation. According to Press Ganey, this reframing is critical — practices that use satisfaction data punitively see provider disengagement, while those that use it developmentally see improvement.
Challenge: Survey fatigue among frequent patients. Patients with chronic conditions visiting weekly received surveys too often under the initial monthly cap. The practice adjusted the frequency rule to allow surveys every 45 days for patients with 4+ visits per month. According to MGMA, this granular frequency management prevents the patients who provide the most data from burning out on surveys.
The practice manager noted that the biggest surprise was not the response rate improvement — they expected that. The biggest surprise was how quickly service recovery calls revealed systemic issues. "In the first 90 days, we identified three operational problems that had been invisible for years: the billing code error, a front desk scheduling script that was confusing patients, and a waiting room temperature issue that affected morning appointments. None of these had ever surfaced in our vendor surveys because nobody was calling patients fast enough to hear about them."
Frequently Asked Questions
Can these results be replicated at a larger or smaller practice?
According to MGMA's implementation data, practice size does not significantly affect response rates — the primary drivers are channel selection (SMS vs. other), delivery timing, and survey length. Larger practices achieve higher absolute response volumes, enabling more granular analytics. Smaller practices achieve comparable response rates but may need 90-120 days to accumulate enough data for statistical significance at the provider level.
What would have happened if the practice chose a different platform?
Based on the platform comparison data, the practice would have achieved higher response rates with any SMS-based platform compared to their phone vendor. The differentiating outcomes — service recovery retention, operational issue identification, and Google review volume — depended on workflow automation capabilities that not all platforms provide. Platforms without service recovery automation would have captured the data without enabling the action.
How much of the ROI came from service recovery versus other channels?
Service recovery accounted for 28% of year-one financial benefit ($115,000 of $410,840). Administration savings accounted for 18%, reputation-driven acquisition for 42%, and CMS reimbursement for 4%. The reputation channel produced the largest single-year impact, but according to MGMA, the service recovery channel compounds more aggressively in years 2-3 as retained patients generate ongoing visit revenue.
Did the practice cancel their Press Ganey contract?
No. The practice maintained Press Ganey for CAHPS-validated annual benchmarking submitted to CMS. They cancelled the phone-based operational survey vendor (the $68,000 annual contract) and replaced it with the US Tech Automations automated system. According to MGMA, this hybrid approach is common among practices that need national benchmarking for compliance while wanting operational agility for day-to-day improvement.
How did staff workload change after implementation?
According to the practice manager, front desk staff spent zero time on survey administration post-automation (versus 6-8 hours weekly previously). The practice manager spent approximately 5-6 hours per week on service recovery callbacks — a new responsibility that directly generated the $115,000 retention value. Net staff time decreased by 1-2 hours per week while shifting to higher-value activities.
What was the patient reaction to SMS surveys?
According to the practice's opt-out data, 2.1% of patients opted out of survey SMS messages in the first 90 days. No formal complaints were received about survey frequency. In the quarterly patient advisory council meeting, multiple patients specifically praised the ease of providing feedback via text versus the old phone survey approach. According to Press Ganey, opt-out rates below 5% indicate appropriate communication frequency and content.
What would you do differently if starting over?
The practice manager identified one change: capturing mobile numbers more aggressively from day one. The 3-month lag in improving mobile capture rates from 54% to 72% meant the first quarter had a smaller survey-eligible population than necessary. Starting mobile number verification at every check-in from day one would have accelerated results by 4-6 weeks.
Conclusion: From Cost Center to Revenue Engine
This practice transformed patient satisfaction measurement from a $68,000 annual expense producing unreliable data into a $4,560 annual investment generating over $400,000 in measurable financial returns. The transformation required no new staff, no major workflow disruption, and no extended implementation timeline. It required a decision to replace manual processes with automation and a commitment to act on the data.
The technology did exactly what technology should do: it eliminated the low-value tasks (calling patients, compiling data, generating reports) and created the conditions for high-value human work (service recovery conversations, operational improvements, provider coaching).
If your practice is still collecting patient feedback through paper, phone, or batch email — or worse, not collecting it at all — the gap between your current state and this practice's results is a technology deployment away. Request a demo of US Tech Automations to see how automated survey workflows can transform your patient experience program.
Related Resources
Patient Satisfaction Survey Automation ROI — Full financial model for survey automation
Patient Satisfaction Survey Platform Comparison — Side-by-side vendor evaluation
Healthcare Waitlist Automation — Backfill cancelled appointments
Healthcare Referral Tracking Automation — Close the referral loop
About the Author

Helping businesses leverage automation for operational efficiency.