How To Write A CES Survey That People Will Actually Respond To

A well-crafted CES survey can boost reply rates and help you calculate your CES. Here are some examples & a Google form you can copy and use
Published on: Oct 25, 2022
Last updated: May 15, 2023

TL;DR

  • CES — which measures how much effort a customer has to put into solving a problem or having an interaction with your company — is a crucial CX metric to track
  • We’ve already gone into more detail about what a CES score is and why it is important
  • And we’ve taken you through how to calculate CES step-by-step
  • In this article, we’ll focus on how you can write a CES survey that people will actually respond to. 


A CES, or customer effort score, is one of those fundamental CX metrics that every company should be tracking.

It exists in the same league as some other familiar terms like CSAT and NPS, and it can tell you a lot about how well — or how poorly — your customer support team is performing and how likely your customers are to churn. 

What is a customer effort score?

But what exactly does CES measure? 

We’ve gone into greater detail elsewhere, but CES is a metric that defines how much effort your customer needs to use to complete a transaction, resolve a support issue or interact with your company/product in general. 

As a rule, you want to make sure you are creating seamless customer experiences that require very little effort. If you’re asking your customers to exert themselves every time they come to you to solve a problem, you’re pretty much guaranteeing that they’ll churn and use a competitor instead. 

That’s why it’s crucial to turn frictionless CX into one of your main competitive advantages. 

CES 1.0 vs CES 2.0

CES 2.0 was introduced to address some of the drawbacks of the first version of this score. The main differences are that:

  • CES 2.0 surveys grade answers on a 7-point scale to allow for more nuance.
  • CES 2.0 surveys are designed around testing a hypothesis rather than eliciting answers to a question. So, for example, while a CES 1.0 survey would ask some variation of 'How much effort did you expend today?', a CES 2.0 survey would look something like this instead: 'To what extent do you agree with the following statement: (Support agent name/Company name, etc.) made it easy for me to solve my issue today/use this feature today, etc.'

How to calculate CES

The formula for CES is quite simple. Once you’ve collected your survey responses, total the number of high scores (typically the 4s and 5s on a 5-point scale), divide that count by the total number of responses, and multiply the result by 100 to arrive at a percentage. That percentage is your CES score. Check out this article on how to calculate CES for more information and an Excel sheet you can use to make things easier.
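The calculation above is simple enough to sketch in a few lines of Python. The 'high score' band is the conventional one described here (4s and 5s on a 5-point scale); which scores count as high on a 7-point CES 2.0 scale is an assumption you should adjust to your own survey:

```python
def calculate_ces(responses, high_scores=(4, 5)):
    """Return CES as a percentage: the share of responses in the 'high' band."""
    if not responses:
        raise ValueError("Cannot calculate CES with no responses")
    high = sum(1 for r in responses if r in high_scores)
    return high / len(responses) * 100

# CES 1.0 (5-point scale): 4s and 5s count as high scores
print(calculate_ces([5, 4, 3, 5, 2, 4, 1, 5]))             # 62.5
# CES 2.0 (7-point scale): here we count 6s and 7s as high (an assumption)
print(calculate_ces([7, 6, 4, 7, 2], high_scores=(6, 7)))  # 60.0
```

Whatever band you choose, apply it consistently across survey waves so your CES score stays comparable over time.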

How to write a CES survey

In order to calculate your CES score, you’ll first need to design and send out your CES surveys. The same best practices that we’ve spoken about in connection with CSAT surveys apply here:

Make it easy

Given that we are measuring CES scores here, you want to make sure that your CES survey doesn’t require a lot of effort to complete! Make sure it consists of just one or two questions — preferably multiple choice questions, to make it really easy for your users to answer it. 

Keep it targeted

You’ll also want to make sure that you send out your CES survey in connection with very specific instances and at specific moments to make sure that the data you collect is meaningful. The best time to trigger them is after a user has completed an action or interacted with your support team. 

Think about delivery

Email may not be the best answer here, because not only are open and reply rates pretty dismal, emails also lack context. We’d suggest using in-app surveys instead, preferably immediately after a user has completed an action or interacted with your support team. 
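As a rough illustration of that timing advice, here is a minimal Python sketch of an event handler that prompts an in-app CES survey right after a support ticket is resolved, with a cooldown so the same user isn't surveyed too often. The event hook, the in-memory store, and the return values are all hypothetical, not a real API:

```python
from datetime import datetime, timedelta

# Hypothetical sketch: show an in-app CES survey immediately after a support
# ticket is resolved, but no more than once per user every 30 days.
SURVEY_COOLDOWN = timedelta(days=30)
last_surveyed = {}  # user_id -> datetime of the last survey prompt

def on_ticket_resolved(user_id, now):
    """Event handler (hypothetical hook): decide whether to show the survey."""
    last = last_surveyed.get(user_id)
    if last is None or now - last >= SURVEY_COOLDOWN:
        last_surveyed[user_id] = now
        return "show_survey"  # in a real app: render the in-app survey widget
    return "skip"
```

The cooldown matters because over-surveying is itself effort you're asking of the customer.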

Experiment with form

There are a number of grading scales and types of questions you can use to design your survey, so experiment with form if your response rates are lacking. Perhaps people are more likely to respond to a 5-point scale rather than a 10-point scale, for example. Or maybe they’d rather respond with an emoji! You’ll need to test these and figure out what works best. See our Google Forms examples below for what a CES survey could look like.

The different types of CES survey scales

There are a few different scales you can pick from when designing your CES survey. 

The Likert scale

These questions are graded on a scale from ‘strongly agree’ to ‘strongly disagree’, with values such as somewhat agree, neutral and somewhat disagree in between those two extremes. The neutral option is included for people who are undecided or don’t know how they feel about a particular experience or interaction. 

The number scale

Number scales feature numbers as a way to rate questions. While most CES 1.0 scales were graded from 1 to 5, CES 2.0 introduced the 7-point scale to allow for more nuance. Think of the star ratings that you come across when looking for, say, reviews of software on sites like Capterra or G2. Those star ratings are based on number scales. 

Binary scales

As the name suggests, binary scales feature two options to choose from. Those options are often a ‘Yes’ and a ‘No’ and apply to questions such as ‘Was your recent support experience with us easy?’ Because these scales, by their nature, allow for very little nuance, they typically can’t give you a representative read on where you stand in terms of metrics like CSAT and CES. 

Open-ended questions

In contrast to binary scales, which typically allow for the least amount of nuance, open-ended questions are where you ask someone something in a survey and give them a text field to type out their answer. Though you can get a lot of information in this way, it may not be easy to interpret or useful in calculating your CES score, so they should be used very sparingly and only when you want more context on a particular rating. It can be good to follow up with open-ended questions after someone has responded to a CES survey rather than build them into the survey itself. 

CES questions

Unlike CSAT questions, which are designed to measure customer satisfaction, you’ll have to design your CES questions to measure customer effort instead. 

With regards to the specific categories of questions you can ask, there are three primary ones for CES: 

  • Product
  • Experience
  • Follow-up questions

Product

These questions are designed to measure the effort your customers had to expend when using a particular feature or completing a particular process. Some examples include: 

  • How easy was it to complete our onboarding process? 
  • On a scale of 1-5 (with 5 being very easy and 1 being very difficult), how easy was it to use (enter feature name)?
  • How easy has it been to use (enter product name) so far?
  • Is (enter product or feature name) simple to use? 
  • To what extent do you agree with this statement: ‘It is easy to get value from (enter product or feature name)’? (CES 2.0)
  • How easy is it to solve (enter pain point) with (enter product or feature name)?

Experience

These are questions geared towards experiences that your customers have with your company that are not directly related to your product. For example, they can be about recent support experiences, account set up or payment processes. Some examples include: 

  • Overall, how easy was it to solve your problem using our help center?
  • How easy was it to understand this article?
  • Was it simple to set up your account?
  • To what extent do you agree with this statement: ‘It was easy to get in touch with (enter company name)’s support team’? (CES 2.0)
  • Did our support team make it easy for you to solve your issue today?
  • Was it easy to find the information you needed on our website?
  • How easy was it to interact with our team?
  • How effortless was your recent support interaction?

Follow-up questions

Now that you’ve sent a targeted CES survey that is easy to answer and boosted your reply rates, feel free to send a more detailed survey or text-based question as a follow-up for more information. You can’t use these when calculating your CES score, but they can give you much-needed insight when you’re trying to interpret it. 

Some examples include:

  • What could have improved your experience today?
  • What would need to improve to increase your answer by one point?
  • Tell us how you felt about your recent support experience today.

You can set open text fields for these kinds of questions so people can answer in greater detail. 

How should you distribute your CES survey?

There are lots of tools that you can use, from dedicated software to simple solutions like Google Forms (see our examples below).

Some pointers to keep in mind:

  • If users are prompted to complete a survey in a mobile app, it makes sense to time it to just after they’ve used the app, for example.
  • If you are sending out a link they can open on any device, make sure to test the experience and responsiveness first.
  • If you are looking to collect your answers in an Excel or Google Sheet, there are lots of tools that connect to those.
  • If you want to first write the survey in Word or Google Docs, you can do that too, but make sure to always look at the final UX of the survey you send out.

CES Survey Examples

We've prepared three examples of CES surveys below: 

  1. A short CES 1.0 survey
  2. A medium CES 2.0 survey
  3. A longer CES 2.0 survey

CES 1.0 survey - Short

As you can see in the example below, this survey consists of one main question graded on a five-point Likert scale. It includes another question asking for more detail.

Given that the survey is short, it won't take your customers very long to answer, which means that response rates will typically be higher than more detailed CES surveys.

The length of your survey will — or should — also affect your method of delivery. Short CES surveys like this can be sent in-app.

Given that CES 2.0 has now been introduced, however, CES surveys are being redesigned accordingly.

CES 2.0 survey - Medium

CES 2.0 surveys improve upon CES 1.0 surveys in two key ways: 

  • They test a hypothesis rather than ask a question
  • They use a 7-point scale rather than a 5-point scale for more nuance

Since this survey is also relatively short, you should expect response rates to be higher than if you sent out a longer survey that took several minutes to fill out. As with the CES 1.0 survey above, you can also send a survey like this one in-app to boost response rates.

CES 2.0 Survey - Long

This survey is longer and includes four questions, which means that response rates are likely to be lower. Distribute surveys like this over email or as an in-app message with a link to a Google Form, for example.

Wrapping things up

CES is one of the most crucial metrics to track because it gives you important insight into how easy your product is for people to use and how simple it is for them to interact with your company. 

Both those things are important to cultivate because, if you’re constantly asking your customers to exert a lot of effort in their interactions with you and your product, they’re more likely to churn, which will negatively affect your retention and product usage rates. 

Sending out short, targeted and easy-to-respond-to CES surveys can help you get a handle on the score, calculate it and then track it to make sure that you are creating the best experiences possible for your customers.
