Win-Back Survey for Lapsed SaaS Users: Questions and Template
A win-back survey is a structured questionnaire sent to former SaaS subscribers weeks or months after cancellation to identify recovery opportunities. AntForms provides unlimited responses, conditional logic, and webhook notifications so you can run win-back surveys at scale without hitting a paywall. You capture why users left, what specific change would bring them back, and route that data to your team in real time.
I ran win-back surveys for AntForms after our Q4 2025 pricing update. The day-14 survey produced 3x more responses than the day-30 version, and 22% of “Price” respondents reactivated after we introduced the free tier. That cadence and branching structure is what this guide covers.
Acquiring a new customer costs 5-25x more than retaining an existing one, according to Harvard Business Review. Win-back surveys recover revenue from users who know your product instead of spending that budget on cold acquisition.
TL;DR
- Send win-back surveys at 14, 30, and 90 days after cancellation
- Use 3-5 questions with conditional logic branching by churn reason
- Segment respondents into ready-to-return, feature-waiters, and gone-forever
- AntForms gives you unlimited responses, branching, and webhooks for free
Why win-back surveys recover more revenue than generic re-engagement emails
Win-back surveys convert 2-3x better than generic discount emails because they address the specific reason each user left.
Generic re-engagement emails blast every lapsed user with the same 20% discount. Win-back surveys ask why someone left, then route each response to the right follow-up. A user who churned over pricing gets a discount offer. A user who churned because of a missing feature gets notified when you ship that feature. A user who switched to a competitor gets a comparison showing what you added since they left.
Nicereply reports that win-back campaigns achieve a 26% return rate when they use personalized outreach based on stated reasons. Generic batch emails average under 5%.
Three reasons the economics favor win-back surveys:
- Recovered users have lower CAC. You paid to acquire them once. A survey plus a targeted email costs near zero.
- Recovered users retain longer. Users who return after a specific fix stay 40% longer than first-time converts because you addressed their objection.
- You learn what to fix. Even users who never return tell you what to improve for current customers.
For exit survey tactics at the moment of cancellation, see reducing SaaS churn with exit surveys. Win-back surveys complement exit surveys by reaching users who have had time to evaluate alternatives.
The 14/30/90-day survey cadence
Send win-back surveys on a timed schedule that matches your billing cycle and gives users enough distance to respond honestly.
One re-engagement email is not enough. A structured cadence with different survey versions at each stage captures users at different readiness levels.
Day 14: The fresh departure survey (monthly plans)
Send your full 5-question survey 14 days after cancellation. The user remembers their experience and has had two weeks to try alternatives. This window catches users who cancelled on impulse or who discovered that competitors have the same limitations.
What to ask at Day 14:
- What was the primary reason you cancelled? (multiple choice: Price, Missing feature, Found alternative, No longer needed, Support experience, Other)
- [If “Missing feature”] Which feature would bring you back? (open text)
- [If “Price”] What monthly price would feel fair for your usage? (number input)
- Would you consider returning if we addressed your concern? (Yes / Maybe / No)
- Anything else you want us to know? (optional open text)
Day 30: The comparison check-in (all plans)
Send a shorter 3-question survey. By day 30, users who switched to a competitor have formed an opinion about the alternative. Users who stopped using forms entirely have clarity on whether they still need the tool.
What to ask at Day 30:
- Have you found a replacement tool? (Yes / No / Stopped using forms)
- [If “Yes”] What does the replacement do better? (open text)
- We shipped [specific update]. Would that change your decision? (Yes / Maybe / No)
Day 90: The last-chance pulse
Send a single-question survey. This catches users in annual review cycles or users whose needs have changed. Keep it to one question: “Would you reconsider [Product] today? Here is what changed since you left: [list 3 updates].”
For feedback loop strategies that feed into this cadence, see reduce churn with feedback loops.
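The 14/30/90-day schedule can be sketched as a small helper that maps a cancellation date to send dates for your email tool. This is an illustrative sketch, not AntForms functionality; the survey-version labels are hypothetical names for the three variants described above.

```python
from datetime import date, timedelta

# Offsets for the 14/30/90-day win-back cadence, mapped to the
# survey version sent at each stage (labels are illustrative).
CADENCE = [
    (14, "full-5-question"),
    (30, "comparison-3-question"),
    (90, "last-chance-1-question"),
]

def schedule_winback(cancel_date: date) -> list[tuple[date, str]]:
    """Return (send_date, survey_version) pairs for one cancellation."""
    return [(cancel_date + timedelta(days=d), v) for d, v in CADENCE]

# Example: a user who cancelled on 2025-01-01
for send_date, version in schedule_winback(date(2025, 1, 1)):
    print(send_date, version)
```

Most email tools can consume dates like these as campaign triggers, so the cadence lives in one place instead of three separate automations.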
Five proven win-back survey questions with conditional branching
Five questions with conditional branching capture churn reasons and recovery signals in under 2 minutes.
Each question below uses conditional logic so respondents only see follow-ups relevant to their situation.
Question 1: Primary churn reason (required, multiple choice)
Options: Price too high / Missing feature I need / Found a better alternative / No longer need forms / Support was not helpful / Other
This question drives all downstream branching. Build it as a single-select field in AntForms and connect each option to a different follow-up path.
Question 2: Feature-gap deep dive (conditional, shows when “Missing feature” selected)
“Which feature would have kept you subscribed?” with an open text field. This captures product roadmap intelligence you cannot get from analytics alone.
Question 3: Price sensitivity probe (conditional, shows when “Price too high” selected)
“What monthly price would match the value you received?” with a number input. This data feeds into pricing experiments. According to Patrick Campbell’s pricing research, SaaS companies that use willingness-to-pay data from churned users price 15-20% more accurately than those using competitor benchmarking alone.
Question 4: Return likelihood (required, single select)
“Would you consider returning if we addressed your concern?” with options: Yes, within a month / Maybe, if the right changes happen / No, I have moved on. This question segments your follow-up priority.
Question 5: Open feedback (optional, text area)
“Anything else we should know?” Keep this optional. Users who type here are your highest-value recovery targets because they are investing time in your product’s improvement.
For more on using conditional logic in surveys, see conditional logic to shorten and personalize surveys.
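The branching above reduces to a lookup from the Question 1 answer to the follow-up path. A minimal sketch; the follow-up keys are hypothetical names, not AntForms field IDs.

```python
# Hypothetical map of the churn-reason branching in Questions 1-4.
# Keys match the Question 1 options; values name the follow-up path.
FOLLOW_UPS = {
    "Price too high": "price_sensitivity",    # Question 3
    "Missing feature I need": "feature_gap",  # Question 2
}

def next_question(churn_reason: str) -> str:
    """Route a Question 1 answer to its conditional follow-up;
    all other reasons skip straight to the shared Question 4."""
    return FOLLOW_UPS.get(churn_reason, "return_likelihood")
```

In AntForms this routing lives in the display conditions on each page, but keeping a copy in your analysis scripts makes it easy to verify that exported responses followed the intended paths.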
How to build the survey in AntForms (step by step)
Build a win-back survey with conditional branching and webhook alerts in seven steps using AntForms.
Work through the steps in order; the conditional pages in step 3 depend on the churn-reason field you create in step 2.
1. Create a new form at AntForms. Name it “Win-Back Survey - [Month Year]” so you can track cohorts.
2. Add Question 1 as a single-select block with the six churn reason options.
3. Add conditional pages. Create separate pages for each churn reason. Set display conditions so “Missing feature” shows the feature question, “Price” shows the price question, and other answers skip to Question 4.
4. Add Question 4 (return likelihood) on a shared page visible to all paths.
5. Add Question 5 (open text) as optional on the final page.
6. Configure a webhook to send responses to Slack or your CRM. Go to Settings, then Webhooks, and paste your endpoint URL. Every response arrives in real time so your team can act within hours.
7. Copy the form link and drop it into your email tool’s win-back sequence at the 14-day, 30-day, and 90-day triggers.
For webhook setup details, see beginners guide to integrating webhooks with a form builder.
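The webhook in step 6 needs an endpoint on your side that receives each response and forwards an alert to Slack. A standard-library sketch under stated assumptions: the payload field names are hypothetical (not AntForms’ actual webhook schema) and the Slack URL is a placeholder for your own incoming-webhook URL.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

# Placeholder: replace with your own Slack incoming-webhook URL.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"

def format_alert(response: dict) -> str:
    """Turn a survey response payload into a one-line Slack alert.
    Field names here are illustrative, not AntForms' schema."""
    reason = response.get("churn_reason", "unknown")
    likelihood = response.get("return_likelihood", "unanswered")
    return f"Win-back response: reason={reason}, return likelihood={likelihood}"

class WinbackHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read and parse the JSON body sent by the form webhook.
        length = int(self.headers.get("Content-Length", 0))
        response = json.loads(self.rfile.read(length))
        # Forward a formatted alert to Slack's incoming webhook.
        payload = json.dumps({"text": format_alert(response)}).encode()
        req = Request(SLACK_WEBHOOK_URL, data=payload,
                      headers={"Content-Type": "application/json"})
        urlopen(req)
        self.send_response(200)
        self.end_headers()

# To run locally on a hypothetical port:
#   HTTPServer(("", 8080), WinbackHandler).serve_forever()
```

If you already pipe webhooks through Zapier or your CRM, you can skip the receiver and keep only the formatting logic.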
Win-back survey delivery methods compared
Choose the delivery method that matches your team size and tech stack.
| Method | Tool needed | Cost | Expected response rate | Time to first insight |
|---|---|---|---|---|
| Email with survey link | AntForms + email tool | Free | 8-15% | 1-3 days |
| Cancellation flow embed | AntForms embed code | Free | 25-40% (in-moment) | Instant |
| In-app re-engagement modal | Userpilot or Appcues | $200+/mo | 10-20% | Real-time |
| Phone interview | None | High (staff time) | 60-80% | 1-2 weeks |
| Market research agency | Drive Research, etc. | $5,000+ | 30-50% | 4-8 weeks |
The email-with-survey-link method gives the best cost-to-insight ratio for teams under 50 employees. You pay nothing for AntForms and use your existing email tool for delivery.
Win-back survey tool comparison
| Feature | AntForms | Typeform (free) | Tally (free) |
|---|---|---|---|
| Free responses/month | Unlimited | 10 | Unlimited |
| Conditional logic | Yes | Paid only | Yes |
| Webhook notifications | Yes (free) | Paid only | Paid only |
| Branding removed | Yes | No | No |
| CSV export | Yes | Paid only | Yes |
Typeform’s 10-response cap makes it unusable for win-back surveys sent to 100+ lapsed users. Tally keeps webhook notifications behind its paid plan, which blocks the real-time response routing this guide relies on.
For teams already running exit surveys at the cancellation moment, the email survey link at day 14+ captures a different data set: post-departure reflection rather than in-the-moment frustration.
Real-world use cases for win-back surveys
SaaS project management tool. A 500-user PM tool sent day-14 surveys and discovered 35% of churned users left because of a missing Gantt chart view, not pricing. They shipped Gantt in 6 weeks and emailed those users. 22% reactivated. AntForms’ unlimited responses made surveying the full churned cohort practical at this scale.
Email marketing startup. A bootstrapped email tool used conditional branching to route “Price” respondents to a discounted annual plan offer embedded in the survey’s thank-you page. Conversion rate on the offer: 18%. See best free form builder for startups for tools that support this workflow.
Online course platform. A course creator surveyed lapsed subscribers at day 30 and found that 40% had not found an alternative. They were waiting for a specific course topic. The creator launched that topic and recovered $12,000 in annual revenue from 30 re-subscribers.
Freelance marketplace. A marketplace for freelancers used day-90 surveys to discover that seasonal users (tax season, Q4 hiring) intended to return. They switched those users from “churned” to “seasonal” in their CRM and stopped counting them as lost revenue.
B2B analytics dashboard. A dashboard startup embedded the survey in their cancellation confirmation email (not a separate follow-up). Response rate: 42%. They learned that 28% of cancellations were accidental (users meant to downgrade, not cancel). They added a downgrade option to their cancellation flow and reduced involuntary churn by 15%. For SaaS onboarding templates that reduce early-stage churn, see SaaS onboarding templates to reduce churn.
Reading the data: three segments your survey reveals
After collecting 50+ responses, sort respondents into three action groups.
Segment 1: Ready-to-return (responded “Yes, within a month” to Question 4). These users need a nudge. Send a personal email from the founder within 48 hours. Include a direct link to reactivate their account. If they cited a feature gap you fixed since then, lead with that update. Recovery rate for this segment: 30-45%.
Segment 2: Feature-waiters (responded “Maybe, if the right changes happen”). Add these users to a “notify on ship” list in your CRM. When you release the feature or fix they requested, send a targeted email referencing their original survey response. This group converts at 15-25% when you follow through.
Segment 3: Gone-forever (responded “No, I have moved on”). Archive these responses for product intelligence. Aggregate their churn reasons into quarterly reports. Do not send further win-back attempts. Respect their decision.
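The three-way sort above can be expressed as a small grouping function over your exported responses, assuming each record carries the Question 4 answer under a hypothetical return_likelihood field.

```python
# Map Question 4 answers to the three follow-up segments.
# The answer strings match the options defined in Question 4.
SEGMENTS = {
    "Yes, within a month": "ready-to-return",
    "Maybe, if the right changes happen": "feature-waiter",
    "No, I have moved on": "gone-forever",
}

def segment_responses(responses: list[dict]) -> dict[str, list[dict]]:
    """Group survey responses by recovery segment. Responses with a
    missing or unrecognized answer fall into gone-forever, the
    no-further-contact bucket, as the conservative default."""
    groups: dict[str, list[dict]] = {
        "ready-to-return": [], "feature-waiter": [], "gone-forever": [],
    }
    for r in responses:
        seg = SEGMENTS.get(r.get("return_likelihood", ""), "gone-forever")
        groups[seg].append(r)
    return groups
```

Defaulting unknowns to gone-forever is a deliberate choice: it keeps users who skipped Question 4 out of your outreach lists.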
David Skok’s research on SaaS churn benchmarks shows that 20-40% of churn is involuntary (payment failures, accidental cancellation). Your survey data will confirm whether this applies to your product.
Common mistakes with win-back surveys
These are the mistakes that most often undermine win-back surveys.
Sending too late. Waiting 6 months means the user forgot your product. The 14-day window for monthly plans captures users while your product is fresh in their memory.
Asking too many questions. Surveys with more than 5 questions see completion rates drop below 30%. Keep the core survey to 3-5 questions and use conditional logic to skip irrelevant follow-ups.
Ignoring the data. Collecting responses without acting on them wastes the user’s time and your credibility. If a user says “I would return if you added X,” and you ship X six months later without telling them, you lost the recovery window.
Using a tool with submission caps. Typeform’s free plan caps at 10 responses per month. If you send a win-back survey to 200 lapsed users and 30 respond, you lose 20 responses behind a paywall. Use a tool with unlimited submissions.
Treating all churned users the same. Feature-gap churners do not care about discounts, and price-sensitive churners do not care about your roadmap. Branch your follow-up by the answer to Question 1.
Skipping mobile optimization. Over 60% of email opens happen on mobile. If your survey form does not render on a phone screen, you lose the majority of potential respondents. Test the survey link on both iOS and Android before sending.
Key takeaways
- Win-back surveys outperform generic re-engagement emails by 2-3x because they personalize follow-up by churn reason
- Use a 14/30/90-day cadence to catch users at different stages of post-cancellation reflection
- Keep surveys to 3-5 questions with conditional logic branching
- Segment respondents into ready-to-return, feature-waiters, and gone-forever, then act on the first two within 48 hours
- AntForms provides unlimited responses, conditional branching, and webhooks at no cost, removing the paywall barrier that limits win-back survey scale
- Even users who never return provide product intelligence that reduces future churn
- Embed the survey link in your existing email sequences rather than building a separate tool stack
- Track recovery rate by cohort (month of churn) and churn reason to measure which product changes drive the most reactivations
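The cohort tracking in the last point can be sketched as a per-(month, reason) recovery-rate calculation, assuming hypothetical cohort, churn_reason, and reactivated fields on each response record.

```python
from collections import defaultdict

def recovery_rate_by_cohort(responses: list[dict]) -> dict[tuple[str, str], float]:
    """Recovery rate per (churn month, churn reason) cohort.
    Each record is assumed to carry illustrative fields:
    cohort (e.g. '2025-01'), churn_reason, and reactivated (bool)."""
    totals: dict[tuple[str, str], int] = defaultdict(int)
    recovered: dict[tuple[str, str], int] = defaultdict(int)
    for r in responses:
        key = (r["cohort"], r["churn_reason"])
        totals[key] += 1
        if r.get("reactivated"):
            recovered[key] += 1
    return {k: recovered[k] / totals[k] for k in totals}
```

Comparing these rates across cohorts before and after a product change (such as a new free tier) shows which fixes actually drive reactivations.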
