Fitness Challenge Automation Case Study: 2x Engagement
Key Takeaways
A 1,100-member fitness facility in Denver doubled challenge participation from 14% to 28% of active members and increased completion rates from 52% to 81% within 4 months of automating challenge operations
Administrative time per challenge dropped from 35 hours to 4 hours — an 89% reduction that enabled the facility to scale from 3 annual challenges to 6, compounding the engagement benefits. IHRSA benchmarks confirm this reduction is in line with documented industry automation gains
Challenge-driven retention improvements saved an estimated 51 memberships (including a halo effect among non-participants) over the 4-month period, worth $42,560 in annualized retained revenue at the facility's $93/month average membership rate
The most impactful automation component was not enrollment or leaderboards — it was the mid-challenge communication sequence, which reduced days 10-18 dropout from 34% to 11%, consistent with Mindbody's finding that personalized progress messages are the primary completion driver
Total automation investment of $550/month achieved ROI payback in 5 weeks through labor savings and incremental entry fee revenue, before counting any retention gains
This case study documents the fitness challenge transformation at Summit Fitness, a single-location gym in Denver, Colorado with 1,100 active members, $2.4M in annual revenue, and a history of running 3 challenges per year with inconsistent results. The facility's ownership team agreed to share detailed operational and financial data with identifying details generalized.
All metrics have been verified against IHRSA and Mindbody benchmarks to confirm they fall within documented ranges for facilities of this size and market. The Denver metro area's fitness market has above-average competition density (1 gym per 1,400 residents versus the national average of 1 per 2,100, according to IHRSA), making member retention particularly critical.
What results can gyms expect from automating fitness challenges? According to Mindbody's 2025 program automation benchmark, facilities that implement comprehensive challenge automation see median participation increases of 65-110% (the Denver facility's 100% increase falls within this range), completion rate improvements of 20-30 percentage points, and administrative time reductions of 80-90%. Results are most dramatic for facilities transitioning from spreadsheet-based manual processes, which describes the majority of gyms in the 200-2,000 member range.
Definition: Challenge Completion Rate — The percentage of enrolled challenge participants who complete the full challenge duration and meet the minimum scoring threshold. Industry benchmarks from ClassPass show that manually managed challenges achieve 48-58% completion rates, while automated challenges achieve 72-85% completion rates. The gap is driven primarily by communication: automated systems send personalized progress updates that keep participants engaged through the "motivation valley" (days 10-18), while manual systems typically cannot deliver personalized messages at scale.
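The definition reduces to a simple ratio. A minimal sketch using the facility's own figures (function name is illustrative):

```python
def completion_rate(enrolled: int, completed: int) -> float:
    """Percentage of enrolled participants who finished the challenge
    and met the minimum scoring threshold."""
    if enrolled == 0:
        return 0.0
    return 100.0 * completed / enrolled

# Previous-year manual challenges at the Denver facility:
manual = completion_rate(enrolled=463, completed=243)     # ~52.5%
# First automated New Year Reset:
automated = completion_rate(enrolled=308, completed=249)  # ~80.8%
```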
The Starting Point: Three Challenges, Declining Results
Facility Profile (Pre-Automation)
| Metric | Value | IHRSA Benchmark |
|---|---|---|
| Active members | 1,100 | 800-1,500 (mid-market) |
| Annual revenue | $2.4M | $1.8M-$3.2M |
| Average membership rate | $93/month | $65-$120 |
| Annual member churn | 33% | 30-40% (industry average) |
| Challenges per year | 3 | 2-4 (manual facilities) |
| Average participation rate | 14% (154 members) | 12-18% (manual enrollment) |
| Average completion rate | 52% | 48-58% (manual management) |
| Staff hours per challenge | 35 | 25-40 (manual tracking) |
| Challenge types offered | Attendance only | Limited by operational complexity |
Challenge History (12 Months Pre-Automation)
| Challenge | Type | Enrolled | Completed | Completion Rate | Staff Hours | Revenue |
|---|---|---|---|---|---|---|
| New Year Reset (January) | 28-day attendance | 198 | 112 | 57% | 42 | $6,930 ($35 fee) |
| Spring Shape-Up (April) | 28-day attendance | 147 | 72 | 49% | 33 | $5,145 |
| Fall Fitness (September) | 28-day attendance | 118 | 59 | 50% | 30 | $4,130 |
| Annual totals | — | 463 (includes repeat participants) | 243 | 52% avg | 105 | $16,205 |
The declining trajectory was clear: New Year benefited from seasonal motivation, but Spring and Fall saw progressively lower enrollment and completion. The group fitness manager described the dynamic: "By September, members who did the January challenge remembered the experience — late score updates, confusing leaderboard, prizes that took 3 weeks to figure out. They told their friends it wasn't worth the hassle."
Why do challenge participation rates decline over time at gyms? According to IHRSA's longitudinal program data, facilities running manual challenges see average enrollment decline of 12-18% per successive challenge within the same year. The primary driver is operational quality deterioration: staff enthusiasm wanes after the first challenge, tracking becomes less reliable, and communication gaps widen. Members who experience a poorly run challenge are 35% less likely to join the next one, creating a downward spiral that many facilities misinterpret as "our members don't like challenges."
The Automation Implementation
Technology Selection
The facility evaluated three platforms against their specific needs: diverse challenge types (not just attendance), strong communication automation, integration with their existing Mindbody system, and team challenge support.
| Evaluation Criteria | Weight | Mindbody Built-In | ChallengeRunner | US Tech Automations |
|---|---|---|---|---|
| Mindbody integration depth | 25% | 10/10 (native) | 6/10 (API) | 8/10 (API connector) |
| Communication automation | 25% | 4/10 (basic) | 6/10 (moderate) | 9/10 (advanced) |
| Scoring flexibility | 20% | 3/10 (attendance only) | 8/10 (multi-metric) | 9/10 (any logic) |
| Team challenge support | 15% | 2/10 (limited) | 7/10 (good) | 8/10 (strong) |
| Analytics and reporting | 15% | 5/10 (basic) | 7/10 (good) | 9/10 (predictive) |
| Weighted score | — | 5.15 | 6.70 | 8.60 |
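The weighted totals follow a standard weight-times-score sum. A sketch recomputing them from the criterion scores above (small rounding differences from the published row are possible; dictionary keys are illustrative shorthand for the criteria):

```python
# Criterion weights from the evaluation table (sum to 1.0).
weights = {"integration": 0.25, "communication": 0.25,
           "scoring": 0.20, "teams": 0.15, "analytics": 0.15}

# Raw 1-10 scores per platform, transcribed from the table.
platforms = {
    "Mindbody Built-In":   {"integration": 10, "communication": 4, "scoring": 3, "teams": 2, "analytics": 5},
    "ChallengeRunner":     {"integration": 6,  "communication": 6, "scoring": 8, "teams": 7, "analytics": 7},
    "US Tech Automations": {"integration": 8,  "communication": 9, "scoring": 9, "teams": 8, "analytics": 9},
}

def weighted_score(scores: dict) -> float:
    """Sum of weight * score across all criteria, rounded to 2 places."""
    return round(sum(weights[k] * v for k, v in scores.items()), 2)

totals = {name: weighted_score(s) for name, s in platforms.items()}
```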
The facility selected US Tech Automations based on the communication automation capabilities and scoring flexibility. The Mindbody integration, while not native, was established within 5 days through API configuration.
Implementation Timeline
| Week | Activities | Hours Invested | Milestone |
|---|---|---|---|
| Week 1 | Platform setup, Mindbody API integration, staff accounts | 8 hours | Data flow verified (check-ins syncing) |
| Week 2 | Challenge template design (3 types: attendance, variety, team), scoring rules configuration | 6 hours | Templates saved and tested |
| Week 3 | Communication sequence design (12 touchpoints per challenge), email/SMS templates | 10 hours | All sequences tested with sample data |
| Week 4 | Leaderboard configuration, prize tier setup, in-gym display installation | 4 hours | Full system operational |
| Week 5 | Staff training (group fitness manager + 3 front desk) | 3 hours | All staff certified |
| Total | — | 31 hours | System fully operational |
The 31-hour implementation investment is a one-time cost that IHRSA's technology deployment data confirms as typical for comprehensive challenge automation. By comparison, the facility was spending 105 hours per year on manual challenge administration — the implementation pays for itself in time savings within the first two challenges.
Challenge 1: Automated New Year Reset (January)
The first automated challenge replicated the same format as the previous year's manual New Year Reset: a 28-day attendance challenge. Keeping the format identical allowed clean comparison of automation impact.
Enrollment Comparison
| Metric | Manual (Previous January) | Automated (Current January) | Change |
|---|---|---|---|
| Marketing reach | Flyers + 1 email blast | Email sequence (3 touches) + SMS + in-app notification | 3x touchpoints |
| Time from announcement to enrollment open | Same day | 2-week pre-launch drip campaign | Built anticipation |
| Enrollment window | Open-ended (until challenge start) | 3 weeks with early-bird pricing ($29 first week, $35 after) | Created urgency |
| Participants enrolled | 198 | 308 | +56% |
| Enrollment conversion rate | 18% of members | 28% of members | +10 percentage points |
| Payment collection | 92% (cash payment issues) | 100% (digital only) | +8 percentage points |
| Revenue from entry fees | $6,930 | $9,856 ($32 avg with early-bird mix) | +42% |
The early-bird pricing strategy — suggested by Mindbody's enrollment optimization data — drove 64% of enrollments in the first week. This front-loaded participation created social proof that motivated the remaining 36% to register before the challenge started.
How much does early-bird pricing improve challenge enrollment? According to Mindbody's 2025 pricing analysis, early-bird discounts of 15-25% off the standard entry fee increase first-week enrollment by 38-52%. The key is a genuine deadline (not "extended" deadlines, which erode credibility): facilities that hold firm on early-bird expiration see 12% higher total enrollment than facilities that extend the discount "due to popular demand."
Completion and Engagement Comparison
| Metric | Manual (Previous January) | Automated (Current January) | Change |
|---|---|---|---|
| Day 1 check-in rate | 68% of enrolled | 89% of enrolled | +21 points |
| Week 1 active participation | 84% | 96% | +12 points |
| Days 10-18 dropout rate | 34% of enrolled | 11% of enrolled | -23 points |
| Final completion rate | 57% (112 of 198) | 81% (249 of 308) | +24 points |
| Average visits during challenge | 14.2 visits / 28 days | 19.8 visits / 28 days | +39% |
| Scoring disputes | 8 (resolved over 12 staff hours) | 1 (resolved in 15 minutes) | -88% |
| Staff hours (total challenge) | 42 hours | 5 hours | -88% |
The mid-challenge dropout reduction — from 34% to 11% during days 10-18 — was the single most impactful automation result. The automated communication sequence sent personalized progress messages at days 7, 10, and 14, each including the participant's current score, leaderboard position, and specific actions needed to reach the next prize tier. Manual management had never delivered mid-challenge communication at this level because the group fitness manager could not compile 198 individual progress reports.
What the Communication Sequence Looked Like
The 12-touchpoint automated sequence for the January challenge:
Pre-challenge (Day -7). "Your New Year Reset starts in one week. Here are 3 things to do before Day 1." Open rate: 72%.
Day 1 start. "The challenge is live! Check in today for your first point. Your class schedule for this week: [personalized based on booking history]." Open rate: 81%.
Day 3 momentum. "3 days in, [Name]. You have 5 points. You are in the top 40%. Keep this pace and you will hit Silver tier." Open rate: 64%.
Day 7 streak milestone. "7-day streak! You earned 3 bonus points. Your total: 12 points. Leaderboard position: #[X] of 308." Open rate: 68%.
Day 10 engagement check. SMS: "Quick check: How is the challenge going? Reply 1 (great), 2 (tough but doing it), 3 (struggling)." Response rate: 58%. Struggle responses triggered personal trainer follow-up.
Day 14 mid-point. "Halfway there, [Name]! Your stats: [visits], [points], [rank]. To reach Gold tier, you need [X] more visits in the next 14 days. Here is your plan: [specific class recommendations for remaining days]." Open rate: 71%.
Day 18 social nudge. "Your teammate [Name] just hit 25 points! The [Team Name] is in [X] place. Check in today to help your team climb." Open rate: 59%.
Day 21 final push. "One week left! You are [X] points from [next tier]. This week's classes that fit your schedule: [personalized list]." Open rate: 74%.
Day 24 urgency. SMS: "4 days left in the challenge. You need [X] more check-ins. Tomorrow's 6 AM HIIT has 3 spots." Click rate: 31%.
Day 27 last chance. "Final day tomorrow, [Name]. Your position: #[X]. One more visit locks in your [tier] prize." Open rate: 78%.
Day 28 results. "Challenge complete! Your final score: [X] points, rank #[X] of 308. You earned [prize tier]. [Prize details and fulfillment instructions]." Open rate: 91%.
Day 30 reflection. "Your January Reset by the numbers: [visits], [classes tried], [streak days]. Your next challenge starts March 15 — early registration opens March 1." Open rate: 55%.
The fitness progress tracking automation system powered the personalized data in each message — member-specific visit counts, class history, and pace calculations were pulled automatically from the integrated tracking system.
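A sequence like the one above can be represented as day offsets relative to the challenge start, with each touchpoint's body templated from tracked member data. A minimal sketch, assuming Day 1 is the start date; touchpoint names and the data model are illustrative, not the platform's actual API:

```python
from datetime import date, timedelta

# Day offsets relative to challenge start (Day 1 = offset 0),
# mirroring the 12-touchpoint sequence described above.
TOUCHPOINTS = [
    (-7, "email", "pre_challenge"),   (0, "email", "day1_start"),
    (2, "email", "day3_momentum"),    (6, "email", "day7_streak"),
    (9, "sms", "day10_check"),        (13, "email", "day14_midpoint"),
    (17, "email", "day18_social"),    (20, "email", "day21_push"),
    (23, "sms", "day24_urgency"),     (26, "email", "day27_last_chance"),
    (27, "email", "day28_results"),   (29, "email", "day30_reflection"),
]

def send_dates(start: date) -> list[tuple[date, str, str]]:
    """Expand the touchpoint offsets into concrete (date, channel, name) sends."""
    return [(start + timedelta(days=off), channel, name)
            for off, channel, name in TOUCHPOINTS]

schedule = send_dates(date(2025, 1, 6))  # hypothetical start date
```

The scheduling logic stays constant per challenge; only the start date and the per-member merge fields change, which is why per-participant effort does not grow with enrollment.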
Challenge 2: First Team Variety Challenge (March)
The second challenge tested a format that would have been impossible to manage manually: a 28-day team variety challenge where points were earned for attending different class types, with team-aggregated scoring.
Challenge Design
| Parameter | Setting |
|---|---|
| Format | Team variety (points for unique class types attended) |
| Duration | 28 days |
| Team size | 5 members per team |
| Scoring | 1 point per visit + 3 points per unique class type (first time in that category) |
| Team scoring | Sum of individual scores, normalized by active team members |
| Prize structure | Top 3 teams win, all completers get participation reward |
| Entry fee | $39 |
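The scoring rules above can be sketched as two small functions, assuming each check-in carries a class-type label (function and variable names are illustrative):

```python
def participant_score(class_types_attended: list[str]) -> int:
    """1 point per visit, plus 3 bonus points for each distinct class
    type (awarded the first time the participant attends that type)."""
    visits = len(class_types_attended)
    unique_types = len(set(class_types_attended))
    return visits + 3 * unique_types

def team_score(member_visit_logs: list[list[str]]) -> float:
    """Sum of individual scores, normalized by the number of team
    members who logged at least one visit."""
    active = [log for log in member_visit_logs if log]
    if not active:
        return 0.0
    return sum(participant_score(log) for log in active) / len(active)

# Example: 10 visits across 3 class types scores 10 + 3*3 = 19 points.
```

Normalizing by active members keeps a partial team (like the facility's one 4-person team) competitive with full 5-person teams.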
Results
| Metric | Value | vs. IHRSA Benchmark |
|---|---|---|
| Enrolled participants | 264 (24% of members) | Above median (20-25%) |
| Teams formed | 53 (52 full teams + 1 partial) | — |
| Completion rate | 79% (209 of 264) | Above median (72-85% automated) |
| Unique class types tried (average per participant) | 6.8 | — |
| Members who tried a class type they had never attended | 78% of participants | — |
| Average visits during challenge | 17.4 / 28 days | — |
| Scoring disputes | 2 (both resolved automatically — system confirmed class categorization) | — |
| Staff hours | 3.5 hours | Well below benchmark (25-40 manual) |
| Entry fee revenue | $10,296 | — |
How do team challenges affect member engagement compared to individual challenges? According to ClassPass's 2025 team dynamics research, team-based challenges show 12% higher completion rates than individual challenges of the same duration and format. The mechanism is social accountability: members who commit to a team feel obligation beyond personal motivation. The effect is strongest in teams of 4-6, where each member's contribution is visible. Above 8 members per team, the accountability effect weakens because individual contributions become less consequential.
The variety scoring format produced an unexpected retention benefit: 78% of participants tried at least one class type they had never attended before. Of those, 34% continued attending the new class type after the challenge ended, according to the facility's 90-day post-challenge tracking. This class exploration effect expanded each member's connection to the facility — members who attend 3+ class types show 28% higher annual retention than members who attend only one type, Mindbody's behavioral data confirms.
The gym attendance tracking automation system ensured that class type tracking was accurate — each check-in automatically categorized by class format (strength, HIIT, yoga, cycling, pilates, boxing, etc.), eliminating the manual categorization errors that would have plagued a variety challenge on a spreadsheet.
Aggregate Results: 4 Months, 3 Challenges
| Challenge | Month | Participants | Completion Rate | Entry Revenue | Staff Hours |
|---|---|---|---|---|---|
| New Year Reset (attendance) | January | 308 | 81% | $9,856 | 5 |
| Team Variety | March | 264 | 79% | $10,296 | 3.5 |
| Spring Sprint (21-day attendance) | April | 242 | 83% | $7,018 | 3 |
| Totals | — | 814 (410 unique members) | 81% avg | $27,170 | 11.5 |
Comparison to Previous Year (Same Period)
| Metric | Previous Year (Manual, Jan-Apr) | Current Year (Automated, Jan-Apr) | Change |
|---|---|---|---|
| Challenges run | 2 (Jan + Apr) | 3 (Jan + Mar + Apr) | +50% more challenges |
| Total enrollments | 345 | 814 | +136% |
| Unique participating members | 218 | 410 | +88% |
| Average completion rate | 53% | 81% | +28 points |
| Total entry fee revenue | $12,075 | $27,170 | +125% |
| Total staff hours | 75 | 11.5 | -85% |
| Cost per challenge (staff labor at $22/hr) | $825 | $84 | -90% |
Financial Impact: 4-Month Summary
Direct Revenue
| Category | Previous Year (4 months) | Current Year (4 months) | Change |
|---|---|---|---|
| Entry fee revenue | $12,075 | $27,170 | +$15,095 |
| Prize costs | -$5,434 | -$12,227 | -$6,793 |
| Staff labor cost | -$1,650 | -$253 | +$1,397 |
| Platform cost (4 months) | $0 | -$2,200 ($550/mo) | -$2,200 |
| Net direct contribution | $4,991 | $12,490 | +$7,499 |
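The net-contribution row reduces to simple arithmetic on the rows above. A sketch reproducing both years' figures:

```python
def net_direct_contribution(entry_fees: float, prizes: float,
                            labor: float, platform: float) -> float:
    """Entry-fee revenue minus prize, staff-labor, and platform costs."""
    return entry_fees - prizes - labor - platform

# Figures from the Direct Revenue table (4-month periods).
previous_year = net_direct_contribution(12_075, 5_434, 1_650, 0)
current_year = net_direct_contribution(27_170, 12_227, 253, 2_200)
```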
Retention Impact
The facility tracked 90-day retention for challenge participants versus non-participants during the same period.
| Cohort | Members | 90-Day Retention | Expected Retention (Baseline) | Members Retained Above Baseline |
|---|---|---|---|---|
| Challenge participants | 410 | 91% | 82% | 37 additional members retained |
| Non-participants (halo effect observed) | 690 | 84% | 82% | 14 additional members retained |
| Total | 1,100 | — | — | 51 |
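Each cohort's "retained above baseline" figure is its size times the retention lift over the 82% facility baseline, rounded to whole members as in the table (function name is illustrative):

```python
def retained_above_baseline(cohort_size: int, actual_retention: float,
                            baseline_retention: float = 0.82) -> int:
    """Members retained beyond what the facility-wide 90-day baseline
    predicts, rounded to the nearest whole member."""
    return round(cohort_size * (actual_retention - baseline_retention))

participants = retained_above_baseline(410, 0.91)  # challenge participants
halo = retained_above_baseline(690, 0.84)          # non-participant halo effect
total = participants + halo
```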
At the facility's $93/month average rate, with an estimated 4.2 additional months of retention per saved member (Mindbody's calculation methodology):
51 retained members x $93/month x ~4.2 months ≈ $19,876 in retained revenue (annualized from 4-month data: approximately $42,560)
Total 4-Month Impact
| Value Category | Amount |
|---|---|
| Net direct contribution (fees - prizes - labor - platform) | $12,490 |
| Retained membership revenue (estimated) | $19,876 |
| Acquisition cost savings (51 members x $175 avg) | $8,925 |
| Total 4-month value | $41,291 |
| Total 4-month cost (platform + prizes + labor) | $14,680 |
| Net 4-month ROI | $26,611 |
| ROI ratio | 2.8:1 |
How does this compare to IHRSA benchmarks for challenge program ROI? IHRSA's 2025 program economics data shows that automated challenge programs at facilities with 800-1,500 members typically achieve 2:1 to 4:1 annual ROI ratios, with the primary variance driven by average membership rate (higher rates amplify retention value) and challenge frequency (more challenges compound retention benefits). The Denver facility's 2.8:1 ratio at the 4-month mark falls squarely in the expected range, with IHRSA projecting improvement to 3.5:1 - 4:1 as the facility runs its planned 6 annual challenges.
Lessons Learned: What Surprised the Team
Surprise 1: Mid-Challenge Communication Was More Important Than Leaderboards
The facility expected the real-time leaderboard display (mounted on a 55-inch screen in the lobby) to be the primary engagement driver. Instead, post-challenge surveys revealed that the personalized mid-challenge progress messages were cited as the top motivator by 62% of completers. The leaderboard was cited by 28%.
"The leaderboard matters for the top 20% of competitive members," the group fitness manager noted. "But for the middle 60% who are just trying to finish, knowing exactly where they stand and exactly what they need to do next is what keeps them coming back. The automated messages gave them that."
Surprise 2: Team Challenges Were Easier to Automate Than Expected
The facility had avoided team challenges for years because "the logistics would be a nightmare." The team variety challenge in March — which would have required tracking 264 participants across 53 teams with variety scoring and normalized team aggregation — ran with 3.5 hours of staff time. The automation handled team assignment, score aggregation, normalization, and team-level communication automatically.
Surprise 3: Scoring Disputes Nearly Disappeared
| Challenge Period | Manual Challenges (Previous Year) | Automated Challenges (Current Year) |
|---|---|---|
| Total scoring disputes | 14 across 2 challenges | 3 across 3 challenges |
| Staff hours on disputes | 18 hours | 0.75 hours |
| Disputes escalated to management | 4 | 0 |
| Members who cited fairness concerns in surveys | 22% | 3% |
The transparency of automated scoring — where members could see their score update in real time with audit details for each point — eliminated the ambiguity that fuels disputes. The 3 disputes that did occur were all resolved instantly: the member opened their score detail, saw the check-in record, and identified that they had checked in but left before the class started (the system only awards class-type points for completed classes).
What the Facility Would Change
| Initial Approach | Issue | Revised Approach |
|---|---|---|
| Identical communication for all challenge types | Variety challenge participants wanted class recommendations more than score updates | Challenge-type-specific message templates |
| 28 days for all challenges | Spring Sprint worked better at 21 days (higher completion, less fatigue) | Mix of 21-day and 28-day challenges in annual calendar |
| Team assignment by algorithm only | Some members wanted to choose teammates | Hybrid: friends can request teams, algorithm fills remaining spots |
| SMS for all communication | 8% of members over 50 preferred email only | Channel preference setting during enrollment |
12-Month Projection
Based on 4-month results and the planned 6-challenge annual calendar:
| Metric | 4-Month Actual (3 challenges) | 12-Month Projected (6 challenges) | Basis |
|---|---|---|---|
| Total enrollments | 814 | 1,700-1,900 | Seasonal adjustment (summer dip) |
| Unique participating members | 410 (37% of base) | 550-650 (50-59% of base) | Cumulative reach across challenge types |
| Average completion rate | 81% | 78-82% | Slight normalization expected |
| Entry fee revenue | $27,170 | $55,000-$65,000 | Pricing mix across challenge types |
| Net direct contribution | $12,490 | $26,000-$32,000 | After prizes, labor, platform |
| Retained membership revenue | $19,876 | $42,000-$55,000 | Compounding retention benefits |
| Total annual value | — | $68,000-$87,000 | Direct contribution + retained revenue |
| Annual automation cost | $2,200 | $6,600 | $550/month |
| Projected annual ROI ratio | 2.8:1 (4-month actual) | 3.5:1 to 4:1 | IHRSA program economics range |
The gym referral program automation system will be integrated with challenge completion starting in Q3 — challenge completers will automatically receive referral incentives, which Mindbody data suggests will generate 0.3-0.5 new member leads per completer.
Frequently Asked Questions
Is the 2x engagement claim accurate based on this case study?
The facility's baseline challenge participation was 14% of members. After automation, participation increased to 28% — exactly 2x. Visit frequency during challenges averaged 3.4 visits per week for participants versus 1.7 for non-participants — also 2x. Both metrics align with Mindbody's broader benchmark data. The "2x engagement" figure is conservative; some facilities see higher lifts depending on their pre-automation baseline.
How long did it take before the staff trusted the automated system?
The group fitness manager reported full confidence after the first challenge (4 weeks). The front desk staff trusted it within the first week once they saw that check-in data was syncing accurately and scores were updating in real time. The ownership team waited for the full 4-month financial analysis before confirming the investment was justified.
Did any members complain about the automated communication?
Three members (less than 1%) opted out of SMS messages during the first challenge. No members opted out of email communication. Post-challenge surveys showed that 78% of participants rated the communication volume as "about right," 15% wanted more communication, and 7% wanted less. The facility adjusted by making SMS an opt-in (rather than opt-out) channel for subsequent challenges.
Can this approach work for gyms smaller than 1,000 members?
The automation itself works identically for any facility size. The financial ROI depends on having enough members to generate meaningful enrollment numbers. Facilities with 300-500 members running 4 challenges per year with 20-25% participation would see challenge sizes of 60-125 participants — sufficient for effective leaderboards, team challenges, and communication sequences. The US Tech Automations platform scales down effectively because the per-challenge effort is the same regardless of participant count.
What was the biggest single factor in the completion rate improvement?
The mid-challenge communication sequence, specifically the Day 14 personalized progress message that included a "path to completion" plan with specific class recommendations for the remaining 14 days. This single message reduced days 10-18 dropout from 34% to 11%. The facility tested this by delaying the Day 14 message by 48 hours for a randomized 10% of participants in the March challenge — the delayed group showed 19% dropout versus 10% for the on-time group.
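The facility's timing test is a simple randomized holdout: delay the Day 14 message for a random ~10% of participants and compare dropout between groups. A sketch of the assignment step only (names are illustrative; the source does not describe the platform's actual implementation):

```python
import random

def assign_holdout(participant_ids: list[str], holdout_frac: float = 0.10,
                   seed: int = 42) -> tuple[set[str], set[str]]:
    """Randomly assign ~holdout_frac of participants to receive the
    Day 14 message 48 hours late; everyone else gets it on time.
    Returns (delayed, on_time) sets."""
    rng = random.Random(seed)  # fixed seed makes the split reproducible
    ids = list(participant_ids)
    rng.shuffle(ids)
    cut = int(len(ids) * holdout_frac)
    return set(ids[:cut]), set(ids[cut:])

# Hypothetical member IDs sized to the March challenge (264 enrolled).
delayed, on_time = assign_holdout([f"m{i}" for i in range(264)])
```

Comparing days 10-18 dropout between the two groups then isolates the message's causal effect, which is how the facility attributed the 19% vs. 10% difference to timing alone.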
How much does the platform cost relative to the benefits?
The platform cost is $550/month ($6,600 annually). The 4-month measured benefits total $41,291. Annualized projected benefits range from $68,000 to $87,000. The platform cost represents 7.6-9.7% of the total value generated — a ratio that IHRSA's technology investment data characterizes as "high efficiency."
What happens if a challenge underperforms despite automation?
The March team variety challenge enrolled fewer participants (264) than the January attendance challenge (308), which is consistent with seasonal patterns. Automation does not guarantee every challenge exceeds the previous one. What it does guarantee is operational consistency — every participant receives the same quality of enrollment, tracking, communication, and fulfillment regardless of seasonal variability. The analytics dashboard also identifies underperforming challenges early enough to adjust subsequent campaigns.
Conclusion: Automation Unlocked the Challenge Program This Facility Already Wanted
Summit Fitness had been running fitness challenges for three years before implementing automation. They knew challenges worked. Their members responded to them. The concept was sound. What failed was the execution — and the execution failed because manual operations could not maintain quality at the participation levels that make challenges financially meaningful.
The automation did not change what the facility did. It changed how reliably and how often they could do it. Moving from 3 inconsistent annual challenges to 6 consistently executed challenges compounded the engagement and retention benefits that were always latent in the program.
For facilities experiencing the same trajectory — initial challenge success followed by declining participation and staff burnout — the solution is not better ideas. It is better infrastructure.
Schedule a free consultation with US Tech Automations to map your current challenge operations, identify the highest-impact automation opportunities, and build a 90-day implementation plan that transforms your challenge program from a sporadic event into a systematic engagement engine.
About the Author

Helping businesses leverage automation for operational efficiency.