The 3 numbers to watch (and not to confuse)
"Response rate" is a catch-all. In practice, three indicators coexist — and confusing them gives you a biased read of your performance, in one direction or the other.
Gross response rate: number of respondents / number of graduates solicited. It's the most flattering number, and the least useful. It doesn't distinguish those who just clicked the link from those who actually filled out the questionnaire.
Net respondent rate: number of respondents / (number of graduates solicited − number of "not applicable"). "Not applicable" includes unreachable alumni (email obsolete for more than 18 months, no phone), deceased, those on a gap year abroad with no planned return. In a class of 1,000 graduates, we typically count 5 to 10% legitimate "not applicable".
Completion rate: number of completed questionnaires / number of graduates solicited. It's the only rate that feeds the official CGE tables and counts toward your RNCP data sheet. A questionnaire started but abandoned at question 8 doesn't count.
Worked example on a class of 1,000 graduates: 600 gross respondents (60% gross rate), 200 of whom abandoned before the end. True completion rate: 400/1,000 = 40%. The gap between 60% advertised and 40% useful is 20 points — often ignored in communications. For CGE tables, only completion counts. For internal benchmarking, track all three.
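The three definitions above can be sketched in a few lines. A minimal sketch on the worked class of 1,000 graduates; the 80 "not applicable" figure is an assumption (picked from the 5-10% range mentioned earlier), the other numbers come from the worked example itself.

```python
# Three rates on the worked class of 1,000 graduates.
# not_applicable = 80 is an ASSUMPTION (within the 5-10% range above);
# the other figures are from the worked example.

solicited = 1_000
gross_respondents = 600   # clicked and answered at least partially
completed = 400           # finished the questionnaire (CGE-countable)
not_applicable = 80       # assumed: unreachable, deceased, gap year abroad

gross_rate = gross_respondents / solicited
net_rate = gross_respondents / (solicited - not_applicable)
completion_rate = completed / solicited

print(f"gross:      {gross_rate:.0%}")       # 60%
print(f"net:        {net_rate:.1%}")         # 65.2%
print(f"completion: {completion_rate:.0%}")  # 40%
```

Note how the net rate (65.2%) reads even better than the gross rate (60%), while the only CGE-countable number is 40%: three honest computations, three very different headlines.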
2024-2025 benchmarks by school type
The 2025 CGE average sits around 53% completion rate. But that average lumps together very heterogeneous schools. The following orders of magnitude, drawn from public CGE reports and 2024-2025 field returns, help position yourself more precisely.
- Top-tier Parisian business schools ("top Parisian" cohort): completion rate 55-65%. Structural advantages: strong brand, engaged alumni, dedicated budget.
- Top regional business schools (major cities, triple accreditation): 50-60%. Rates close to Parisians but with tighter resources.
- Prestigious Parisian engineering schools: 60-70%. Highest level, pulled up by a strong class culture and marked institutional attachment.
- Regional CTI engineering schools: 45-55%. Average close to the national median. Plenty of room for improvement through tooling.
- Specialized schools (design, art, architecture, communications, film): 35-45%. More dispersed alumni, non-linear careers, low institutional attachment.
Important disclaimer: "around average" doesn't mean "high-performing". The CGE average is pulled up by about fifteen heavily-tooled schools at 65%+. A regional engineering school at 53% is around average, but structurally underperforming relative to its real potential. The right benchmark is your segment, not the overall average.
Second caution: some schools inflate their numbers by massively excluding alumni as "not applicable". A 72% net-respondent rate with 30% "not applicable" hides a real gross rate of 50%. The CGE survey guide details the strict exclusion rules.
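The arithmetic behind that caution is worth making explicit. A quick sketch reversing the inflated example above (the 72% and 30% figures are from the text):

```python
# Reverse-engineering the inflated example: a 72% net-respondent rate
# built on 30% "not applicable" exclusions implies a much lower gross rate.

solicited = 1_000
not_applicable_share = 0.30
net_rate = 0.72

respondents = net_rate * solicited * (1 - not_applicable_share)
gross_rate = respondents / solicited
print(f"real gross rate: {gross_rate:.1%}")  # 50.4%
```

In other words, every point of "not applicable" you add mechanically inflates the net rate without a single extra answer.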
5 levers that deliver +10 points on the rate
Five levers, documented on classes of 500 to 3,000 graduates, with nominally additive gains. Applied all five at once, they theoretically deliver +40 points; in practice, the observed gain is +15 to +20 points, because of ceiling effects and correlations between levers.
Lever 1: pre-filling responses from the alumni profile. If the alumnus lands on a questionnaire where their name, first name, class, degree, email, and LinkedIn are already filled in, completion time drops by a factor of 5. Documented effect: +8 points on completion rate. This requires a platform that knows the alumnus before the survey, which our survey module provides.
Lever 2: automated reminders D+7 / D+14 / D+21. Manual reminders "when you remember" miss the critical dates. The automated reminder set up at campaign kickoff, with fixed deadlines, never misses. Observed effect: +18 points on the gross rate. By far the most powerful lever.
Lever 3: email sent from the school's address, not an external vendor. "alumni-office@myschool.edu" has an open rate 2.3x higher than "noreply@survey-vendor.com". Observed effect: +6 points completion. Requires a platform that uses the school's domain (SPF / DKIM configured on the school side).
Lever 4: SMS on top of email. SMS wakes up alumni who no longer read their emails (typically 30-40% of classes 2-5 years out). Effect on that sub-segment: +15% conversion rate, or about +4 points completion over the whole class. To use only as a last reminder, not as a first contact.
Lever 5: transparency on data use. Add to the questionnaire intro a line like "Your answers feed the school's RNCP data sheet, the CGE ranking, and remain accessible to you alone in your alumni space". Effect on "wary" profiles: +4 points. Cost: 30 seconds of copywriting.
Summary: +8 +18 +6 +4 +4 = +40 theoretical points. Actual: +15 to +20 points. Going from 45% to 63% is realistic in one survey cycle with the right tooling.
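To illustrate why the gains don't add up linearly, here is a minimal sketch. The 0.45 damping factor is an assumption chosen to land in the observed +15 to +20 point range; real overlap between levers varies by school.

```python
# Lever gains from the text (in completion-rate points).
lever_gains = {
    "pre-fill": 8,
    "automated reminders": 18,
    "school-domain sender": 6,
    "SMS last reminder": 4,
    "data transparency": 4,
}

theoretical = sum(lever_gains.values())

# ASSUMED damping factor modeling ceiling effects and correlations;
# 0.45 is illustrative, not a measured constant.
damping = 0.45
actual_estimate = theoretical * damping

print(theoretical)             # 40
print(round(actual_estimate))  # 18
```

The point of the sketch: the levers overlap (an alumnus reached by SMS may be the same one the D+21 reminder would have caught), so treat the per-lever numbers as upper bounds, not independent contributions.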
Classic mistakes that kill your rate
Before optimizing, eliminate the mistakes that cost you 15 points without you knowing.
Timing: launching in August or December. In August, 70% of graduates are on vacation or in professional transition. In December, the holidays kill attention. The optimal windows are April-May (before summer vacation, right after first jobs stabilize) and September-October (back-to-school, activity pickup). Losing 15 points on timing alone is common.
Using Google Forms or generic SurveyMonkey. No pre-fill, no automated reminders, no CGE compliance (the 142 structured questions expected). You outsource the plumbing to a tool not built for it. Observed gap: -20 points vs a specialized platform.
Questionnaire too long. Beyond 12-15 questions, the completion rate drops non-linearly. Every question after the 12th causes about 1.5 points of abandonment. A 25-question survey instead of 15 costs 15 points of completion. Principle: every question should feed a required CGE number or a school decision. Otherwise, delete it.
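The length rule above can be turned into a back-of-the-envelope check. The helper name and the parameterization are illustrative; the 12-question threshold and ~1.5 points per extra question are the rule of thumb from the text.

```python
def completion_penalty(n_questions: int, threshold: int = 12,
                       cost_per_question: float = 1.5) -> float:
    """Estimated completion-rate points lost to survey length.

    Rule of thumb from the text: ~1.5 points lost per question
    beyond the 12th. Illustrative helper, not a fitted model.
    """
    return max(0, n_questions - threshold) * cost_per_question

print(completion_penalty(15))   # 4.5 points
print(completion_penalty(25))   # 19.5 points
# 25 questions instead of 15:
print(completion_penalty(25) - completion_penalty(15))  # 15.0 points
```

That last line is exactly the "25 questions instead of 15 costs 15 points" figure from the text.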
Broken mobile form. 60% of alumni open emails on smartphones. If your form has overflowing tables, unreadable dropdowns, or non-adaptive fields, you lose half the clicks. Always test before launch on iOS Safari, Android Chrome, and tablet.
No tracking dashboard. Without a real-time dashboard, you don't know who answered, who started, who opened but didn't click. You can't target reminders. You're flying blind. A minimal dashboard (per-alumni status: not opened / opened / started / completed) costs zero with native integration, days with manual Excel glue.
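The minimal dashboard described above boils down to a status funnel plus a reminder list. A sketch of that logic, with illustrative statuses and sample data rather than a real platform API:

```python
from collections import Counter

# The four per-alumni statuses named in the text.
STATUSES = ("not_opened", "opened", "started", "completed")

# Illustrative sample data; a real campaign would pull this
# from the survey platform's tracking events.
alumni_status = {
    "a.martin": "completed",
    "b.durand": "started",
    "c.petit": "opened",
    "d.moreau": "not_opened",
    "e.leroy": "completed",
}

# Funnel view: how many alumni sit at each stage.
funnel = Counter(alumni_status.values())
for status in STATUSES:
    print(f"{status:>10}: {funnel.get(status, 0)}")

# Reminder targeting: everyone who hasn't completed yet.
to_remind = [a for a, s in alumni_status.items() if s != "completed"]
print(to_remind)  # ['b.durand', 'c.petit', 'd.moreau']
```

With this view, the D+7 / D+14 / D+21 reminders can target only `to_remind` instead of re-mailing the whole class, which is exactly what "you can't target reminders" without a dashboard means.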
Bring it in-house or outsource?
Three options depending on size and budget.
Pure in-house (0 to 1 FTE, simple tool like Google Forms or LimeSurvey). Viable if fewer than 500 graduates/year, budget below €15k, no complex CGE requirement. Expected rate: 30-45%. Fits small specialized schools or a school's first years of CGE membership.
External consultancy outsourcing (€30k to €80k per campaign). Viable if more than 1,500 graduates/year, need for in-depth sector analysis, reporting need for the board or general management. The consultancy runs the entire campaign and delivers a report. Expected rate: 55-65%. Major downside: you depend on the provider, alumni base not enriched by the survey.
Specialized SaaS tool (e.g. Terrilink, €3k to €8k/year). Optimal for schools with 500 to 2,000 graduates/year, i.e. most CGE and CTI schools. The school stays in control, the data stays in its IS, the alumni base is enriched at every survey. Expected rate: 55-65% with the 5 levers applied.
- Pure in-house: cost €3k-€8k/year (time + Forms license), rate 30-45%, no alumni enrichment
- External consultancy: cost €30k-€80k/campaign, rate 55-65%, no durable enrichment
- Specialized SaaS: cost €3k-€8k/year, rate 55-65%, durable alumni base enrichment
For the full methodology (CGE questions, schedule, RNCP compliance), see our CGE placement survey guide.
What your numbers really say about your network
There's a direct correlation between survey response rate and the overall health of the alumni network. That correlation is the most useful thing to understand — and the most ignored.
A living network — where alumni regularly get emails that interest them, show up at events, pay dues — answers at 65%+ to the survey without huge effort. The survey is just one more email in an already-active conversation.
A dormant network — where the last contact dated back to the AGM 18 months ago — plateaus at 35% despite all the technical levers. You can pre-fill, SMS, perfume the questionnaire with jasmine: you won't clear the bar.
If your school is stuck below 40% completion campaign after campaign, the problem isn't the survey. The problem is engagement. You have to wake up the network before, not during, the survey. A few wake-up levers: annual class event, quarterly alumni newsletter with useful content (not "school news"), active mentorship program. Our dedicated article details the method to revive a dormant alumni network.
The CGE survey isn't an isolated campaign. It's the thermometer of your network. If the thermometer is cold, work on the temperature — not the thermometer.
Pre-launch checklist for your next survey
Ten points to tick before hitting "send". If you can't tick at least 8 out of 10, postpone launch by 2 weeks and fix.
- Send date in April-May or September-October (not August, not December)
- Technical stack: specialized platform, not Google Forms
- Active pre-fill on identity, class, degree, email, LinkedIn fields
- Automated reminders scheduled at D+7, D+14, D+21 (minimum)
- SMS channel enabled as last reminder on the dormant segment
- Intro paragraph on data transparency (CGE / RNCP use)
- Questionnaire limited to 12-15 useful questions
- Mobile render tested and validated on iOS and Android
- Real-time dashboard accessible to alumni leadership
- RNCP compliance verified on mandatory CGE questions
A school that ticks 10/10 goes from 45% to 62% in one cycle, with zero additional consultancy budget. It's the difference between an RNCP sheet under pressure and a solid one.