Analytics

Alumni NPS: the 3 surveys you need every year

A single annual alumni NPS gives you an aggregate number that's already out of date the day you get it. To really run your network, you need three different surveys, placed at the right moments of a member's life cycle. Here is the exact cadence, the precise questions, and the minimal dashboard to turn these measurements into decisions — not into board slides no one reads.

April 8, 2026 ~6 min read By Thibault Sabathier

Why a single annual NPS is useless

Classic alumni-leadership reflex: launch a big annual survey in October or November, ask 15 to 25 questions, present results at the board, file the report. The problem isn't the principle of measuring — it's the granularity.

Aggregating NPS over the whole base masks everything that matters. A new graduate struggling in their job search and a 45-year-old senior who only comes to the yearly afterwork do not have the same experience. Neither do the €40 payer and the free member. The single output — "our NPS is 22" — says nothing actionable.
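As a reminder of the arithmetic behind that "22": NPS is the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6), passives (7-8) counting only in the denominator. A minimal sketch:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# 5 promoters, 3 detractors, 2 passives out of 10 respondents
print(nps([9, 10, 9, 2, 3, 8, 7, 9, 10, 4]))  # → 20
```

Note that wildly different score distributions can produce the same aggregate, which is exactly why segmentation matters.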

Second problem: frequency. One measurement a year is a blurry photo taken on a random day. A failed event in September or a platform bug in June will impact the number, but you'll never know which. And above all, between two measurements you can't fix anything.

A good alumni NPS program isn't one NPS, it's three. Each targeted at a precise moment of the life cycle. Each triggered automatically. Each read every week, not once a year.

Survey #1 — Membership NPS (D+7 after dues, 1 question)

The first survey fires 7 days after the payment of annual dues. Not 24h (too early, the member has tested nothing), not 30 days (too late, they've forgotten the transaction). Seven days is when they've had time to explore — or to do nothing — which is information in itself.

A single question: "On a scale from 0 to 10, would you recommend the alumni network to a classmate?"

If the score is 0 to 6 (detractor), an automated follow-up question: "What disappointed you?" with a free-text field. Nothing more. You're hunting for the weak signal, not running a long questionnaire.

Observed benchmark on this survey: expected NPS between +20 and +40 after dues payment. Below +10, it's a warning signal: the promise made before payment doesn't match what's delivered. Below 0, it's serious — payers feel short-changed, and they won't renew.

Typical response rate on this 1-question format: 30 to 45%. That's 5 to 10 times more than a long questionnaire. For more on dues revenue, see our alumni dues 2026 benchmark.
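The trigger logic above (fire at D+7, follow up only with detractors) can be sketched in a few lines. Function and field names here are illustrative, not a specific platform's API:

```python
from datetime import date, timedelta

SEND_DELAY = timedelta(days=7)  # D+7: long enough to explore, short enough to remember

def survey_send_date(payment_date: date) -> date:
    """Membership NPS fires exactly 7 days after dues payment."""
    return payment_date + SEND_DELAY

def follow_up_question(score: int):
    """Detractors (0-6) get one free-text follow-up; everyone else gets nothing."""
    return "What disappointed you?" if score <= 6 else None
```

For example, dues paid on April 1 trigger the survey on April 8, and a score of 5 triggers the single follow-up while a 9 ends the exchange.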

Survey #2 — Event NPS (D+2 after event, 3 questions)

The second survey fires 48 hours after each event (in person or online). Not 7 days — the memory of an event is short and details fade fast.

Three questions, in this precise order:

  1. Q1 (mandatory, 0-10): "Would you recommend this event to another alumnus?"
  2. Q2 (free text, optional): "What did you enjoy most?"
  3. Q3 (free text, optional): "What can we improve next time?"

Order matters. Starting with the score forces a global evaluation before the details. Asking what they liked BEFORE what they didn't like orients positively, which improves completion rate.

Benchmark: event NPS above +40 for a quality event, above +50 for a signature event. Below +20, you need to seriously look at the format.

The most useful application of these verbatims is preparing the next event. Rather than guessing, you read 30 comments and adjust the format. Combined with the practices in our article alumni events that fill up, we typically see a 15 to 25% increase in attendance after 6 months.

Survey #3 — Employability NPS (M+6 after graduation, 5 questions)

The third survey targets young graduates, 6 months after the degree ceremony. That's when most are in a job or actively searching — not before.

Five questions:

  1. School NPS (0-10): "Would you recommend the school to a prospective student?"
  2. Alumni network NPS for job search (0-10): "How much did the alumni network help in your job search?"
  3. Mentorship NPS (0-10, if the member had a mentor): "Would you recommend the mentorship program?"
  4. Current situation (list): full-time / fixed-term / long internship / apprenticeship / actively looking / other
  5. Spontaneous recommendation (yes / no): "Have you already recommended the network to a classmate since graduation?"

This survey is complementary to the CGE placement survey, not redundant. The CGE survey answers "what" (salary, contract type, sector). This NPS answers "how" (feeling, perceived quality, future intent). For the CGE part, see the CGE survey response rate benchmark.

Benchmark: employability NPS above +20 is acceptable, above +40 is excellent. A low employability NPS is a powerful predictor: field studies show that a 6-month employability NPS below 0 is followed by a dues renewal rate cut in half within the next 3 years. In other words, it's the indicator that foretells the network's financial trajectory.

Minimal dashboard to share with the board / leadership

Three surveys, three curves. The dashboard fits on one A4 page:

  • Monthly chart of the 3 NPS over the trailing 12 months
  • Automatic alert whenever any of the three NPS drops 5 points or more within a quarter
  • Weekly extraction of detractor verbatims, read by 1 dedicated person (30 minutes / week is enough)
  • Link to other network-health indicators — see our 7 KPIs for alumni leadership for the complete frame

Golden rule: this dashboard is read every week, not every quarter. An NPS dropping 5 points in 4 weeks doesn't wait for the next board — it demands immediate action. To structure re-engagement of disengaged members following detractor signals, see also our method to revive a dormant alumni network.
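The quarterly-drop alert is a simple comparison of each monthly value against the value three months earlier. A minimal sketch, assuming one NPS reading per month in chronological order:

```python
def quarterly_drop_alerts(monthly_nps, threshold=5.0):
    """Return the indices of months whose NPS sits at least `threshold`
    points below the value one quarter (3 months) earlier."""
    return [
        i for i in range(3, len(monthly_nps))
        if monthly_nps[i - 3] - monthly_nps[i] >= threshold
    ]

# June (index 5) is 7 points below March (index 2): alert fires
print(quarterly_drop_alerts([32, 31, 33, 30, 27, 26]))  # → [5]
```

Run weekly against the trailing 12 months of each of the three series; an empty list means no alert.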

The mistakes that invalidate your measurements

Four classic mistakes are enough to render any NPS program unusable:

  • Survey too long: beyond 5 questions, the abandonment rate exceeds 30%. The answers that remain are biased toward the extremes (strong detractors or strong promoters).
  • Wrong timing: an event survey sent 30 days after rather than 2 days measures overall memory, not experience. Guaranteed memory bias.
  • No verbatim follow-up: a score without comments is like a fever without a diagnosis — you know something is wrong, but not where. Verbatims cost 1 hour a week and are worth 10 times more than the number alone.
  • No segmentation: an aggregated NPS masks cohort-specific problems. Segment at a minimum by class (−5, −10, −20 years) and by dues tier.
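The segmentation point is the easiest to operationalize: compute one NPS per cohort instead of one overall. A minimal sketch, where responses are (segment label, score) pairs and the labels (class year, dues tier) are whatever your CRM exports:

```python
from collections import defaultdict

def nps_by_segment(responses):
    """Per-segment NPS from (segment, score) pairs. Two segments can
    diverge sharply while the aggregate number barely moves."""
    buckets = defaultdict(list)
    for segment, score in responses:
        buckets[segment].append(score)
    result = {}
    for segment, scores in buckets.items():
        promoters = sum(1 for s in scores if s >= 9)
        detractors = sum(1 for s in scores if s <= 6)
        result[segment] = round(100 * (promoters - detractors) / len(scores))
    return result

print(nps_by_segment([
    ("class -5", 9), ("class -5", 10), ("class -5", 8),
    ("class -20", 3), ("class -20", 4), ("class -20", 7),
]))  # → {'class -5': 67, 'class -20': -67}
```

Here the aggregate would read as a mediocre 0 while one cohort is enthusiastic and the other is leaving.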

Set up properly, this triptych completely replaces the big annual survey. And above all, it turns the alumni office into a team that runs things rather than one that comments on results six months after the fact.

Measure alumni satisfaction continuously

Programmable surveys, real-time dashboard, automatic extraction of detractor verbatims. 14-day trial.