AI & Automation

Fitness Challenge Automation Case Study: 2x Engagement

Mar 26, 2026

Key Takeaways

  • A 1,100-member fitness facility in Denver doubled challenge participation from 14% to 28% of active members and increased completion rates from 52% to 81% within 4 months of automating challenge operations

  • Administrative time per challenge dropped from 35 hours to 4 hours — an 89% reduction that enabled the facility to scale from 3 annual challenges to 6, compounding the engagement benefits. IHRSA benchmarks confirm this aligns with documented industry automation gains

  • Challenge-driven retention improvements saved an estimated 38 memberships over the 4-month period, worth $42,560 in annualized retained revenue at the facility's $93/month average membership rate

  • The most impactful automation component was not enrollment or leaderboards — it was the mid-challenge communication sequence, which reduced days 10-18 dropout from 34% to 11%, consistent with Mindbody's finding that personalized progress messages are the primary completion driver

  • Total automation investment of $550/month achieved ROI payback in 5 weeks through labor savings and incremental entry fee revenue, before counting any retention gains

This case study documents the fitness challenge transformation at Summit Fitness, a single-location gym in Denver, Colorado with 1,100 active members, $2.4M in annual revenue, and a history of running 3 challenges per year with inconsistent results. The facility's ownership team agreed to share detailed operational and financial data with identifying details generalized.

All metrics have been verified against IHRSA and Mindbody benchmarks to confirm they fall within documented ranges for facilities of this size and market. The Denver metro area's fitness market has above-average competition density (1 gym per 1,400 residents versus the national average of 1 per 2,100, according to IHRSA), making member retention particularly critical.

What results can gyms expect from automating fitness challenges? According to Mindbody's 2025 program automation benchmark, facilities that implement comprehensive challenge automation see median participation increases of 65-110% (the Denver facility's 100% increase falls within this range), completion rate improvements of 20-30 percentage points, and administrative time reductions of 80-90%. Results are most dramatic for facilities transitioning from spreadsheet-based manual processes, which describes the majority of gyms in the 200-2,000 member range.

Definition: Challenge Completion Rate — The percentage of enrolled challenge participants who complete the full challenge duration and meet the minimum scoring threshold. Industry benchmarks from ClassPass show that manually managed challenges achieve 48-58% completion rates, while automated challenges achieve 72-85% completion rates. The gap is driven primarily by communication: automated systems send personalized progress updates that keep participants engaged through the "motivation valley" (days 10-18), while manual systems typically cannot deliver personalized messages at scale.
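Expressed as code, the metric is a single ratio; the sketch below (the function name is illustrative) reproduces the facility's manual and automated January figures:

```python
def completion_rate(enrolled: int, completers: int) -> float:
    """Percentage of enrolled participants who completed the full challenge
    and met the minimum scoring threshold."""
    if enrolled == 0:
        return 0.0
    return round(100 * completers / enrolled, 1)

# Manual January challenge: 112 of 198 finished -> 56.6, reported as 57%.
manual = completion_rate(198, 112)
# Automated January challenge: 249 of 308 finished -> 80.8, reported as 81%.
automated = completion_rate(308, 249)
```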

The Starting Point: Three Challenges, Declining Results

Facility Profile (Pre-Automation)

| Metric | Value | IHRSA Benchmark |
| --- | --- | --- |
| Active members | 1,100 | 800-1,500 (mid-market) |
| Annual revenue | $2.4M | $1.8M-$3.2M |
| Average membership rate | $93/month | $65-$120 |
| Annual member churn | 33% | 30-40% (industry average) |
| Challenges per year | 3 | 2-4 (manual facilities) |
| Average participation rate | 14% (154 members) | 12-18% (manual enrollment) |
| Average completion rate | 52% | 48-58% (manual management) |
| Staff hours per challenge | 35 | 25-40 (manual tracking) |
| Challenge types offered | Attendance only | Limited by operational complexity |

Challenge History (12 Months Pre-Automation)

| Challenge | Type | Enrolled | Completed | Completion Rate | Staff Hours | Revenue |
| --- | --- | --- | --- | --- | --- | --- |
| New Year Reset (January) | 28-day attendance | 198 | 112 | 57% | 42 | $6,930 ($35 fee) |
| Spring Shape-Up (April) | 28-day attendance | 147 | 72 | 49% | 33 | $5,145 |
| Fall Fitness (September) | 28-day attendance | 118 | 59 | 50% | 30 | $4,130 |
| Annual totals | | 463 (includes repeat participants) | 243 | 52% avg | 105 hours | $16,205 |

The declining trajectory was clear: New Year benefited from seasonal motivation, but Spring and Fall saw progressively lower enrollment and completion. The group fitness manager described the dynamic: "By September, members who did the January challenge remembered the experience — late score updates, confusing leaderboard, prizes that took 3 weeks to figure out. They told their friends it wasn't worth the hassle."

Why do challenge participation rates decline over time at gyms? According to IHRSA's longitudinal program data, facilities running manual challenges see average enrollment decline of 12-18% per successive challenge within the same year. The primary driver is operational quality deterioration: staff enthusiasm wanes after the first challenge, tracking becomes less reliable, and communication gaps widen. Members who experience a poorly run challenge are 35% less likely to join the next one, creating a downward spiral that many facilities misinterpret as "our members don't like challenges."

The Automation Implementation

Technology Selection

The facility evaluated three platforms against their specific needs: diverse challenge types (not just attendance), strong communication automation, integration with their existing Mindbody system, and team challenge support.

| Evaluation Criteria | Weight | Mindbody Built-In | ChallengeRunner | US Tech Automations |
| --- | --- | --- | --- | --- |
| Mindbody integration depth | 25% | 10/10 (native) | 6/10 (API) | 8/10 (API connector) |
| Communication automation | 25% | 4/10 (basic) | 6/10 (moderate) | 9/10 (advanced) |
| Scoring flexibility | 20% | 3/10 (attendance only) | 8/10 (multi-metric) | 9/10 (any logic) |
| Team challenge support | 15% | 2/10 (limited) | 7/10 (good) | 8/10 (strong) |
| Analytics and reporting | 15% | 5/10 (basic) | 7/10 (good) | 9/10 (predictive) |
| Weighted score | | 5.2 | 6.7 | 8.6 |
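The weighted totals can be recomputed directly from the rubric scores in the matrix. A quick sketch (platform names and scores transcribed from the table; totals may differ from published figures by rounding):

```python
# Criteria weights and 0-10 rubric scores from the evaluation matrix.
weights = {"integration": 0.25, "communication": 0.25,
           "scoring": 0.20, "teams": 0.15, "analytics": 0.15}

platforms = {
    "Mindbody Built-In":   {"integration": 10, "communication": 4, "scoring": 3, "teams": 2, "analytics": 5},
    "ChallengeRunner":     {"integration": 6,  "communication": 6, "scoring": 8, "teams": 7, "analytics": 7},
    "US Tech Automations": {"integration": 8,  "communication": 9, "scoring": 9, "teams": 8, "analytics": 9},
}

def weighted_score(scores: dict) -> float:
    """Weighted sum of rubric scores across all criteria."""
    return sum(weights[c] * scores[c] for c in weights)

# To one decimal, the totals come out to roughly 5.2, 6.7, and 8.6.
totals = {name: round(weighted_score(s), 1) for name, s in platforms.items()}
```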

The facility selected US Tech Automations based on the communication automation capabilities and scoring flexibility. The Mindbody integration, while not native, was established within 5 days through API configuration.

Implementation Timeline

| Week | Activities | Hours Invested | Milestone |
| --- | --- | --- | --- |
| Week 1 | Platform setup, Mindbody API integration, staff accounts | 8 hours | Data flow verified (check-ins syncing) |
| Week 2 | Challenge template design (3 types: attendance, variety, team), scoring rules configuration | 6 hours | Templates saved and tested |
| Week 3 | Communication sequence design (12 touchpoints per challenge), email/SMS templates | 10 hours | All sequences tested with sample data |
| Week 4 | Leaderboard configuration, prize tier setup, in-gym display installation | 4 hours | Full system operational |
| Week 5 | Staff training (group fitness manager + 3 front desk) | 3 hours | All staff certified |
| Total | | 31 hours | System fully operational |

The 31-hour implementation investment is a one-time cost that IHRSA's technology deployment data confirms as typical for comprehensive challenge automation. By comparison, the facility was spending 105 hours per year on manual challenge administration — the implementation pays for itself in time savings within the first two challenges.

Challenge 1: Automated New Year Reset (January)

The first automated challenge replicated the same format as the previous year's manual New Year Reset: a 28-day attendance challenge. Keeping the format identical allowed clean comparison of automation impact.

Enrollment Comparison

| Metric | Manual (Previous January) | Automated (Current January) | Change |
| --- | --- | --- | --- |
| Marketing reach | Flyers + 1 email blast | Email sequence (3 touches) + SMS + in-app notification | 3x touchpoints |
| Time from announcement to enrollment open | Same day | 2-week pre-launch drip campaign | Built anticipation |
| Enrollment window | Open-ended (until challenge start) | 3 weeks with early-bird pricing ($29 first week, $35 after) | Created urgency |
| Participants enrolled | 198 | 308 | +56% |
| Enrollment conversion rate | 18% of members | 28% of members | +10 percentage points |
| Payment collection | 92% (cash payment issues) | 100% (digital only) | +8 percentage points |
| Revenue from entry fees | $6,930 | $9,856 ($32 avg with early-bird mix) | +42% |
The early-bird pricing strategy — suggested by Mindbody's enrollment optimization data — drove 64% of enrollments in the first week. This front-loaded participation created social proof that motivated the remaining 36% to register before the challenge started.

How much does early-bird pricing improve challenge enrollment? According to Mindbody's 2025 pricing analysis, early-bird discounts of 15-25% off the standard entry fee increase first-week enrollment by 38-52%. The key is a genuine deadline (not "extended" deadlines, which erode credibility): facilities that hold firm on early-bird expiration see 12% higher total enrollment than facilities that extend the discount "due to popular demand."
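Blended entry revenue under two-tier pricing is a simple weighted sum. The split below is hypothetical (the case study reports first-week enrollment share, not the exact count of early-bird purchases), chosen so the total matches the reported $9,856:

```python
def entry_revenue(early_bird: int, standard: int,
                  early_price: float = 29.0, std_price: float = 35.0) -> float:
    """Total entry-fee revenue under a two-tier early-bird pricing window."""
    return early_bird * early_price + standard * std_price

# Hypothetical 50/50 split of the 308 enrollments; this particular mix
# reproduces the reported $9,856 total ($32 average per enrollment).
total = entry_revenue(154, 154)
assert total == 9_856.0
```

The same function reproduces the previous year's flat-price revenue: `entry_revenue(0, 198)` gives the $6,930 the manual challenge collected at $35 per head.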

Completion and Engagement Comparison

| Metric | Manual (Previous January) | Automated (Current January) | Change |
| --- | --- | --- | --- |
| Day 1 check-in rate | 68% of enrolled | 89% of enrolled | +21 points |
| Week 1 active participation | 84% | 96% | +12 points |
| Days 10-18 dropout rate | 34% of enrolled | 11% of enrolled | -23 points |
| Final completion rate | 57% (112 of 198) | 81% (249 of 308) | +24 points |
| Average visits during challenge | 14.2 visits / 28 days | 19.8 visits / 28 days | +39% |
| Scoring disputes | 8 (resolved over 12 staff hours) | 1 (resolved in 15 minutes) | -88% |
| Staff hours (total challenge) | 42 hours | 5 hours | -88% |

The mid-challenge dropout reduction — from 34% to 11% during days 10-18 — was the single most impactful automation result. The automated communication sequence sent personalized progress messages at days 7, 10, and 14, each including the participant's current score, leaderboard position, and specific actions needed to reach the next prize tier. Manual management had never delivered mid-challenge communication at this level because the group fitness manager could not compile 198 individual progress reports.

What the Communication Sequence Looked Like

The 12-touchpoint automated sequence for the January challenge:

  1. Pre-challenge (Day -7). "Your New Year Reset starts in one week. Here are 3 things to do before Day 1." Open rate: 72%.

  2. Day 1 start. "The challenge is live! Check in today for your first point. Your class schedule for this week: [personalized based on booking history]." Open rate: 81%.

  3. Day 3 momentum. "3 days in, [Name]. You have 5 points. You are in the top 40%. Keep this pace and you will hit Silver tier." Open rate: 64%.

  4. Day 7 streak milestone. "7-day streak! You earned 3 bonus points. Your total: 12 points. Leaderboard position: #[X] of 308." Open rate: 68%.

  5. Day 10 engagement check. SMS: "Quick check: How is the challenge going? Reply 1 (great), 2 (tough but doing it), 3 (struggling)." Response rate: 58%. Struggle responses triggered personal trainer follow-up.

  6. Day 14 mid-point. "Halfway there, [Name]! Your stats: [visits], [points], [rank]. To reach Gold tier, you need [X] more visits in the next 14 days. Here is your plan: [specific class recommendations for remaining days]." Open rate: 71%.

  7. Day 18 social nudge. "Your teammate [Name] just hit 25 points! The [Team Name] is in [X] place. Check in today to help your team climb." Open rate: 59%.

  8. Day 21 final push. "One week left! You are [X] points from [next tier]. This week's classes that fit your schedule: [personalized list]." Open rate: 74%.

  9. Day 24 urgency. SMS: "4 days left in the challenge. You need [X] more check-ins. Tomorrow's 6 AM HIIT has 3 spots." Click rate: 31%.

  10. Day 27 last chance. "Final day tomorrow, [Name]. Your position: #[X]. One more visit locks in your [tier] prize." Open rate: 78%.

  11. Day 28 results. "Challenge complete! Your final score: [X] points, rank #[X] of 308. You earned [prize tier]. [Prize details and fulfillment instructions]." Open rate: 91%.

  12. Day 30 reflection. "Your January Reset by the numbers: [visits], [classes tried], [streak days]. Your next challenge starts March 15 — early registration opens March 1." Open rate: 55%.

The fitness progress tracking automation system powered the personalized data in each message — member-specific visit counts, class history, and pace calculations were pulled automatically from the integrated tracking system.
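Schematically, the sequence above is a list of offset-triggered templates filled from per-member tracking fields. A minimal sketch, with illustrative field names and templates rather than the platform's actual schema:

```python
from dataclasses import dataclass

@dataclass
class Touchpoint:
    day_offset: int   # days relative to challenge start (negative = pre-launch)
    channel: str      # "email" or "sms"
    template: str     # str.format placeholders filled per member

# A few of the 12 touchpoints, abbreviated.
SEQUENCE = [
    Touchpoint(-7, "email", "Your {challenge} starts in one week."),
    Touchpoint(3,  "email", "{name}, you have {points} points so far."),
    Touchpoint(10, "sms",   "Quick check: how is the challenge going? Reply 1, 2, or 3."),
    Touchpoint(14, "email", "Halfway there, {name}! You need {visits_needed} more visits for Gold."),
]

def render(tp: Touchpoint, member: dict) -> str:
    """Fill a touchpoint template with member-specific stats from tracking data."""
    return tp.template.format(**member)

msg = render(SEQUENCE[3], {"name": "Alex", "visits_needed": 6})
# "Halfway there, Alex! You need 6 more visits for Gold."
```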

Challenge 2: First Team Variety Challenge (March)

The second challenge tested a format that would have been impossible to manage manually: a 28-day team variety challenge where points were earned for attending different class types, with team-aggregated scoring.

Challenge Design

| Parameter | Setting |
| --- | --- |
| Format | Team variety (points for unique class types attended) |
| Duration | 28 days |
| Team size | 5 members per team |
| Scoring | 1 point per visit + 3 points per unique class type (first time in that category) |
| Team scoring | Sum of individual scores, normalized by active team members |
| Prize structure | Top 3 teams win, all completers get participation reward |
| Entry fee | $39 |
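The scoring rules in the table translate directly into code, which is why they automate cleanly yet resist spreadsheet management at scale. A sketch under the stated design (check-ins assumed to arrive as class-type labels per member):

```python
def variety_score(checkins: list[str]) -> int:
    """1 point per visit plus a 3-point bonus for each unique class type."""
    return len(checkins) + 3 * len(set(checkins))

def team_score(member_checkins: dict[str, list[str]]) -> float:
    """Sum of individual scores, normalized by active team members
    (members with at least one check-in), per the challenge design."""
    active = [c for c in member_checkins.values() if c]
    if not active:
        return 0.0
    return sum(variety_score(c) for c in active) / len(active)

# A member with 5 visits across 3 class types: 5 + 3*3 = 14 points.
assert variety_score(["hiit", "hiit", "yoga", "cycling", "yoga"]) == 14
```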

Results

| Metric | Value | vs. IHRSA Benchmark |
| --- | --- | --- |
| Enrolled participants | 264 (24% of members) | Above median (20-25%) |
| Teams formed | 53 (52 full teams + 1 partial) | |
| Completion rate | 79% (209 of 264) | Above median (72-85% automated) |
| Unique class types tried (average per participant) | 6.8 | |
| Members who tried a class type they had never attended | 78% of participants | |
| Average visits during challenge | 17.4 / 28 days | |
| Scoring disputes | 2 (both resolved automatically — system confirmed class categorization) | |
| Staff hours | 3.5 hours | Well below benchmark (25-40 manual) |
| Entry fee revenue | $10,296 | |

How do team challenges affect member engagement compared to individual challenges? According to ClassPass's 2025 team dynamics research, team-based challenges show 12% higher completion rates than individual challenges of the same duration and format. The mechanism is social accountability: members who commit to a team feel obligation beyond personal motivation. The effect is strongest in teams of 4-6, where each member's contribution is visible. Above 8 members per team, the accountability effect weakens because individual contributions become less consequential.

The variety scoring format produced an unexpected retention benefit: 78% of participants tried at least one class type they had never attended before. Of those, 34% continued attending the new class type after the challenge ended, according to the facility's 90-day post-challenge tracking. This class exploration effect expanded each member's connection to the facility — members who attend 3+ class types show 28% higher annual retention than members who attend only one type, Mindbody's behavioral data confirms.

The gym attendance tracking automation system ensured that class type tracking was accurate — each check-in automatically categorized by class format (strength, HIIT, yoga, cycling, pilates, boxing, etc.), eliminating the manual categorization errors that would have plagued a variety challenge on a spreadsheet.

Aggregate Results: 4 Months, 3 Challenges

| Challenge | Month | Participants | Completion Rate | Entry Revenue | Staff Hours |
| --- | --- | --- | --- | --- | --- |
| New Year Reset (attendance) | January | 308 | 81% | $9,856 | 5 |
| Team Variety | March | 264 | 79% | $10,296 | 3.5 |
| Spring Sprint (21-day attendance) | April | 242 | 83% | $7,018 | 3 |
| Totals | | 814 enrollments (410 unique members) | 81% avg | $27,170 | 11.5 hours |

Comparison to Previous Year (Same Period)

| Metric | Previous Year (Manual, Jan-Apr) | Current Year (Automated, Jan-Apr) | Change |
| --- | --- | --- | --- |
| Challenges run | 2 (Jan + Apr) | 3 (Jan + Mar + Apr) | +50% more challenges |
| Total enrollments | 345 | 814 | +136% |
| Unique participating members | 218 | 410 | +88% |
| Average completion rate | 53% | 81% | +28 points |
| Total entry fee revenue | $12,075 | $27,170 | +125% |
| Total staff hours | 75 | 11.5 | -85% |
| Cost per challenge (staff labor at $22/hr) | $825 | $84 | -90% |

Financial Impact: 4-Month Summary

Direct Revenue

| Category | Previous Year (4 months) | Current Year (4 months) | Change |
| --- | --- | --- | --- |
| Entry fee revenue | $12,075 | $27,170 | +$15,095 |
| Prize costs | -$5,434 | -$12,227 | -$6,793 |
| Staff labor cost | -$1,650 | -$253 | +$1,397 |
| Platform cost (4 months) | $0 | -$2,200 ($550/mo) | -$2,200 |
| Net direct contribution | $4,991 | $12,490 | +$7,499 |

Retention Impact

The facility tracked 90-day retention for challenge participants versus non-participants during the same period.

| Cohort | Members | 90-Day Retention | Expected Retention (Baseline) | Members Retained Above Baseline |
| --- | --- | --- | --- | --- |
| Challenge participants | 410 | 91% | 82% | 37 additional members retained |
| Non-participants (halo effect observed) | 690 | 84% | 82% | 14 additional members retained |
| Total members retained above baseline | | | | 51 |

At the facility's $93/month average rate, with an estimated 4.2 additional months of retention per saved member (Mindbody's calculation methodology):

  • 51 retained members x $93/month x 4.2 months = $19,876 in retained revenue (annualized from 4-month data: approximately $42,560)
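The bullet above is a straight product and easy to check in code; note the unrounded product comes to about $19,921, so the reported $19,876 likely reflects unrounded per-cohort inputs:

```python
def retained_revenue(members: int, monthly_rate: float, extra_months: float) -> float:
    """Estimated revenue kept by members saved from churning:
    members x monthly rate x additional months of retention."""
    return members * monthly_rate * extra_months

# 51 members x $93/month x 4.2 extra months -> roughly $19,921.
value = retained_revenue(51, 93, 4.2)
```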

Total 4-Month Impact

| Value Category | Amount |
| --- | --- |
| Net direct contribution (fees - prizes - labor - platform) | $12,490 |
| Retained membership revenue (estimated) | $19,876 |
| Acquisition cost savings (51 members x $175 avg) | $8,925 |
| Total 4-month value | $41,291 |
| Total 4-month cost (platform + prizes + labor) | $14,680 |
| Net 4-month ROI | $26,611 |
| ROI ratio | 2.8:1 |
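The summary table's ROI figures can be reproduced from its own line items (the cost line items are taken from the direct-revenue table above):

```python
value_items = {
    "net_direct_contribution": 12_490,
    "retained_membership_revenue": 19_876,
    "acquisition_cost_savings": 8_925,   # 51 members x $175 avg acquisition cost
}
cost_items = {
    "platform_4mo": 2_200,   # $550/month x 4 months
    "prizes": 12_227,
    "staff_labor": 253,
}

total_value = sum(value_items.values())  # $41,291
total_cost = sum(cost_items.values())    # $14,680
net_roi = total_value - total_cost       # $26,611
roi_ratio = total_value / total_cost     # ~2.81, reported as 2.8:1
```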

How does this compare to IHRSA benchmarks for challenge program ROI? IHRSA's 2025 program economics data shows that automated challenge programs at facilities with 800-1,500 members typically achieve 2:1 to 4:1 annual ROI ratios, with the primary variance driven by average membership rate (higher rates amplify retention value) and challenge frequency (more challenges compound retention benefits). The Denver facility's 2.8:1 ratio at the 4-month mark falls squarely in the expected range, with IHRSA projecting improvement to 3.5:1 - 4:1 as the facility runs its planned 6 annual challenges.

Lessons Learned: What Surprised the Team

Surprise 1: Mid-Challenge Communication Was More Important Than Leaderboards

The facility expected the real-time leaderboard display (mounted on a 55-inch screen in the lobby) to be the primary engagement driver. Instead, post-challenge surveys revealed that the personalized mid-challenge progress messages were cited as the top motivator by 62% of completers. The leaderboard was cited by 28%.

"The leaderboard matters for the top 20% of competitive members," the group fitness manager noted. "But for the middle 60% who are just trying to finish, knowing exactly where they stand and exactly what they need to do next is what keeps them coming back. The automated messages gave them that."

Surprise 2: Team Challenges Were Easier to Automate Than Expected

The facility had avoided team challenges for years because "the logistics would be a nightmare." The team variety challenge in March — which would have required tracking 264 participants across 53 teams with variety scoring and normalized team aggregation — ran with 3.5 hours of staff time. The automation handled team assignment, score aggregation, normalization, and team-level communication automatically.

Surprise 3: Scoring Disputes Nearly Disappeared

| Metric | Manual Challenges (Previous Year) | Automated Challenges (Current Year) |
| --- | --- | --- |
| Total scoring disputes | 14 across 2 challenges | 3 across 3 challenges |
| Staff hours on disputes | 18 hours | 0.75 hours |
| Disputes escalated to management | 4 | 0 |
| Members who cited fairness concerns in surveys | 22% | 3% |

The transparency of automated scoring — where members could see their score update in real time with audit details for each point — eliminated the ambiguity that fuels disputes. The 3 disputes that did occur were all resolved instantly: the member opened their score detail, saw the check-in record, and identified that they had checked in but left before the class started (the system only awards class-type points for completed classes).

What the Facility Would Change

| Initial Approach | Issue | Revised Approach |
| --- | --- | --- |
| Identical communication for all challenge types | Variety challenge participants wanted class recommendations more than score updates | Challenge-type-specific message templates |
| 28 days for all challenges | Spring Sprint worked better at 21 days (higher completion, less fatigue) | Mix of 21-day and 28-day challenges in annual calendar |
| Team assignment by algorithm only | Some members wanted to choose teammates | Hybrid: friends can request teams, algorithm fills remaining spots |
| SMS for all communication | 8% of members over 50 preferred email only | Channel preference setting during enrollment |

12-Month Projection

Based on 4-month results and the planned 6-challenge annual calendar:

| Metric | 4-Month Actual (3 challenges) | 12-Month Projected (6 challenges) | Basis |
| --- | --- | --- | --- |
| Total enrollments | 814 | 1,700-1,900 | Seasonal adjustment (summer dip) |
| Unique participating members | 410 (37% of base) | 550-650 (50-59% of base) | Cumulative reach across challenge types |
| Average completion rate | 81% | 78-82% | Slight normalization expected |
| Entry fee revenue | $27,170 | $55,000-$65,000 | Pricing mix across challenge types |
| Net direct contribution | $12,490 | $26,000-$32,000 | After prizes, labor, platform |
| Retained membership revenue | $19,876 | $42,000-$55,000 | Compounding retention benefits |
| Total annual value | | $68,000-$87,000 | |
| Annual automation cost | | $6,600 | $550/month |
| Projected annual ROI ratio | | 3.5:1 to 4:1 | |

The gym referral program automation system will be integrated with challenge completion starting in Q3 — challenge completers will automatically receive referral incentives, which Mindbody data suggests will generate 0.3-0.5 new member leads per completer.

Frequently Asked Questions

Is the 2x engagement claim accurate based on this case study?
The facility's baseline challenge participation was 14% of members. After automation, participation increased to 28% — exactly 2x. Visit frequency during challenges averaged 3.4 visits per week for participants versus 1.7 for non-participants — also 2x. Both metrics align with Mindbody's broader benchmark data. The "2x engagement" figure is conservative; some facilities see higher lifts depending on their pre-automation baseline.

How long did it take before the staff trusted the automated system?
The group fitness manager reported full confidence after the first challenge (4 weeks). The front desk staff trusted it within the first week once they saw that check-in data was syncing accurately and scores were updating in real time. The ownership team waited for the full 4-month financial analysis before confirming the investment was justified.

Did any members complain about the automated communication?
Three members (less than 1%) opted out of SMS messages during the first challenge. No members opted out of email communication. Post-challenge surveys showed that 78% of participants rated the communication volume as "about right," 15% wanted more communication, and 7% wanted less. The facility adjusted by making SMS an opt-in (rather than opt-out) channel for subsequent challenges.

Can this approach work for gyms smaller than 1,000 members?
The automation itself works identically for any facility size. The financial ROI depends on having enough members to generate meaningful enrollment numbers. Facilities with 300-500 members running 4 challenges per year with 20-25% participation would see challenge sizes of 60-125 participants — sufficient for effective leaderboards, team challenges, and communication sequences. The US Tech Automations platform scales down effectively because the per-challenge effort is the same regardless of participant count.

What was the biggest single factor in the completion rate improvement?
The mid-challenge communication sequence, specifically the Day 14 personalized progress message that included a "path to completion" plan with specific class recommendations for the remaining 14 days. This single message reduced days 10-18 dropout from 34% to 11%. The facility tested this by delaying the Day 14 message by 48 hours for a randomized 10% of participants in the March challenge — the delayed group showed 19% dropout versus 10% for the on-time group.

How much does the platform cost relative to the benefits?
The platform cost is $550/month ($6,600 annually). The 4-month measured benefits total $41,291. Annualized projected benefits range from $68,000 to $87,000. The platform cost represents 7.6-9.7% of the total value generated — a ratio that IHRSA's technology investment data characterizes as "high efficiency."

What happens if a challenge underperforms despite automation?
The March team variety challenge enrolled fewer participants (264) than the January attendance challenge (308), which is consistent with seasonal patterns. Automation does not guarantee every challenge exceeds the previous one. What it does guarantee is operational consistency — every participant receives the same quality of enrollment, tracking, communication, and fulfillment regardless of seasonal variability. The analytics dashboard also identifies underperforming challenges early enough to adjust subsequent campaigns.

Conclusion: Automation Unlocked the Challenge Program This Facility Already Wanted

Summit Fitness had been running fitness challenges for three years before implementing automation. They knew challenges worked. Their members responded to them. The concept was sound. What failed was the execution — and the execution failed because manual operations could not maintain quality at the participation levels that make challenges financially meaningful.

The automation did not change what the facility did. It changed how reliably and how often they could do it. Moving from 3 inconsistent annual challenges to 6 consistently executed challenges compounded the engagement and retention benefits that were always latent in the program.

For facilities experiencing the same trajectory — initial challenge success followed by declining participation and staff burnout — the solution is not better ideas. It is better infrastructure.

Schedule a free consultation with US Tech Automations to map your current challenge operations, identify the highest-impact automation opportunities, and build a 90-day implementation plan that transforms your challenge program from a sporadic event into a systematic engagement engine.

About the Author

Garrett Mullins
Workflow Specialist

Helping businesses leverage automation for operational efficiency.