How to Write Great Customer Satisfaction (CSAT) Surveys (With Examples)

Customer satisfaction is key to running a successful business. That’s obvious to nearly everyone. What is less obvious is exactly how you should go about determining whether or not your customers are, in fact, satisfied. If you’ve been struggling to measure this, customer satisfaction (CSAT) surveys may be just the ticket.

Why CSAT surveys are important

We’ve gone into more detail about CSAT before, but, in short, CSAT, or Customer Satisfaction Score, measures how satisfied your customers are with a business, feature, or interaction.

Your CSAT score matters because good customer support helps build customer loyalty. And CSAT surveys are tools you can use both to measure your customer satisfaction score and to solicit feedback about your products, features, or services.

How to write good CSAT survey questions

Working with good data is key when it comes to CSAT, so it’s important to ask the right questions. Here’s a list of some things to consider when formulating your CSAT survey: 

Keep it short

The best surveys are ones that don’t take a lot of time and effort to answer, so keep things short and sweet! People are much more likely to respond when you do, and you want as many people as possible to take the survey so you get an accurate result.

Keep it targeted

In much the same vein, make sure that your CSAT survey is targeted so you actually collect data that is actionable. Now is not the time for general questions — direct your users’ attention to one particular product, service or feature. 

Think about delivery

Most surveys are sent out via email, and the response rate on those is dismal. It helps to meet users where they actually are, rather than sending out emails en masse. The best way to do this? Send a quick survey in-app, preferably immediately after a user has logged in or used a feature for the first time. 

Experiment with form

If you’re sending out surveys that are set up exclusively for text-based responses, you’re not exploring all the avenues available to you. So experiment with the form of these questions and answers. Consider asking users for an emoji rating or to score something on a numerical scale instead.

Types of grading scales

As we just mentioned, experimenting with the form of your questions and the rating scales you use can help inspire people to respond. You just need to be clear and make it easy for them to answer. There are a few different kinds of rating scales that are commonly used to do this:

Likert scale questions

This type of rating scale provides users with a list of options ranging from one extreme to the other and often (but not always) includes a neutral response. So, for example, scales that look like this:

How satisfied were you with the product? 

  1. Very satisfied 
  2. Somewhat satisfied 
  3. Neither satisfied nor dissatisfied 
  4. Somewhat dissatisfied 
  5. Very dissatisfied 

fall under the category of Likert scales. 

Binary scales

Pretty self-explanatory, these scales give respondents just two options to choose between. An example of a scale like this would be:

Were you satisfied with the product? 

  1. Yes 
  2. No

Scales like these leave no room for interpretation or nuance, but they do make it easier to arrive at a very definite conclusion about something. So, for example, if more people say they hate a feature than love it, you know it’s probably time to pull that feature.

Multiple choice questions 

These give you a way to find out a little bit more information about your users. They look a little something like this: 

Which of the following features do you use the most? 

  • Option A
  • Option B
  • Option C

Given that you’re the one providing the answers, their options are definitely constrained, but questions like this can still provide deeper insight into your customers’ wants, needs, and preferences.

Open-ended questions

Want to give your customers a chance to tell you how they feel about your product in their own words? Ask them some open-ended questions! Here’s what that looks like:

How was your experience using our product? 

Questions like these give customers a way to elaborate and give you more information in the process, but they also take longer to answer and require more effort, so response rates are bound to be lower. They also aren’t easy to evaluate or score, so they are probably best used sparingly to reach out to specific customers about specific interactions.

Types of questions to ask

As for the specific categories of questions you can ask users, there are several:

  • Usage 
  • Product
  • Demographics
  • Psychographics 
  • Satisfaction

Usage questions

Want to know how often your customers are using your products? Ask them questions like: 

  • How often do you use (insert product name)? 
  • How often do you use (insert feature name)? 
  • Would (insert change) cause you to use this product more or less frequently? 
  • Which of the following features do you use the most frequently? 

You can then provide a Likert scale, a binary choice, or a text field for their response.

Product questions 

These questions can give you great insight into exactly how your customers are using your product, what they like and, perhaps most importantly, what they want you to fix. This sort of insight is invaluable when it comes to designing and amending your product roadmap. Some examples of questions are: 

  • What problem does (insert feature or product) help you to solve?
  • Are you enjoying (insert functionality)? 
  • How do you feel about (insert product)? 
  • How easy is (insert feature) to use? 
  • How intuitive is (insert product) to use? 
  • Which of the following features do you find the most useful?
  • How much value for money would you say our product provides? 
  • Do you have any feature requests? 
  • Has (insert feature or product) made (insert problem) easier to solve? 
  • Has (insert feature or product) made your (insert industry) workflow easier to manage? 

You can form these as Likert, binary, or open-ended questions, depending on what you’re asking.

Demographic questions 

If you’re interested in the broad strokes of your user base, it’s helpful to ask specific questions about who your users are. That’s where demographics come in. Demographic questions can help you understand which segments you are serving well and which you are underserving. Some examples include:

  • What gender do you identify as?
  • What is your educational level? 
  • What industry do you work in? 
  • What is your job title? 
  • What is your employment status?
  • What is your income?  
  • What is your relationship status?
  • Where do you live?
  • What is your zip code? 
  • What is your ethnic background? 

Psychographic questions

These delve deeper into a persona than demographic questions, answering the why rather than the what. They concern a person’s motivations, wants, desires, preferences, self-image and the like. Some examples include: 

  • Are you a member of a religious denomination?
  • Are you a member of a political organization?
  • How do you feel about (insert issue)? 
  • What do you prioritize when (insert activity)?
  • Do you prefer (insert option) or (insert other option)? 
  • How much time do you spend on (insert activity)? 
  • How do you see yourself when it comes to (insert trait)?
  • What beliefs would motivate you to (insert activity)? 
  • How long do you spend on (insert activity) every week?
  • Where do you get your news? 
  • Which is your favorite social media platform?

Level of satisfaction questions

The bread and butter of any good CSAT survey, these questions measure how your customers actually feel about your product and how satisfied they are with it. Some examples include: 

  • How was your experience with (insert feature or service) today? 
  • Are you satisfied with (insert feature or service)? 
  • How would you rate your (insert recent interaction) with us? 
  • Are you enjoying (insert feature or product)? 
  • How likely are you to recommend us to a friend or colleague? 
  • How likely are you to continue using our product?
  • How easy was it for you to use (insert feature or product)? 
  • Were you satisfied with your onboarding experience?
  • How would you rate your recent customer support experience? 

How should you distribute your survey?

There are lots of tools that you can use, from dedicated software to simple solutions like Google Forms (see our example survey below).

If users are prompted to complete a survey in a mobile app, it makes sense to time it to just after they’ve used the app, for example. If you are sending out a link they can open on any device, make sure to test the experience and responsiveness first. If you want to collect your answers in an Excel or Google Sheet, there are lots of tools that connect to those. And if you want to draft the survey in Word or Google Docs first, you can do that too, but always check the final UX of the survey you send out.

What to do after sending out CSAT surveys

So you’ve sent out the surveys and compiled the responses. Here’s what’s next: 

Calculate CSAT

Firstly, you’ll need to calculate your CSAT score to figure out where you stand. We have an article about how to calculate CSAT, so make sure you read that. 
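To make the calculation concrete, here’s a minimal sketch in Python (the function name is our own, not from any library). It uses the standard CSAT formula, the percentage of respondents who picked a “satisfied” option, applied to the 5-point Likert scale from earlier in this article, where options 1 (“Very satisfied”) and 2 (“Somewhat satisfied”) count as satisfied:

```python
# Minimal sketch of the standard CSAT formula:
#   CSAT = (satisfied responses / total responses) * 100
# Assumes the article's 5-point Likert scale, where option 1 is
# "Very satisfied" and option 2 is "Somewhat satisfied".

def csat_score(responses, satisfied_options=(1, 2)):
    """Return the CSAT score as a percentage of satisfied responses."""
    if not responses:
        return 0.0
    satisfied = sum(1 for r in responses if r in satisfied_options)
    return round(satisfied / len(responses) * 100, 1)

# 10 survey responses: 6 picked option 1 or 2, so CSAT is 60.0%
ratings = [1, 2, 3, 1, 5, 2, 1, 4, 2, 3]
print(csat_score(ratings))  # prints 60.0
```

If your scale runs the other way (5 = very satisfied), just pass `satisfied_options=(4, 5)` instead.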

Compare with industry benchmarks 

Once you’ve calculated your CSAT score, you’ll want to compare it to industry CSAT benchmarks to see how you stack up. 

Do more research to identify the issues 

Once you’ve evaluated how well (or poorly) you’re doing, you’ll need to do a deep dive into the responses and identify which customers to follow up with for more information.

A great tool to use here is a session replay tool, which allows you to watch video-like recordings of users interacting with your product.

Fullview Replays is specifically designed to help improve CSAT scores because it gives support and product teams a way to watch sessions of the specific users they choose, unlike most other solutions on the market, which anonymize the data.

Use it in conjunction with Fullview Console to see what problems and errors those users encountered during their sessions.

Follow up to increase customer satisfaction

If you’re feeling down because your CSAT scores are not quite up to standard and many people have responded negatively, don’t lose hope yet! Reaching out to customers can still go a long way toward ensuring they don’t abandon your product.

A great way to reach out to them is via cobrowsing, which is a much more immediate solution than sending an email and hoping for a reply.

But how does it work? 

It’s simple! Once you notice the user you want to speak to is online in your product, you can use a tool like Fullview Live to call them in-app at the touch of a button. 

Fullview Live allows you to speak to your customers straight from within your product, so you don’t have to send out meeting invites or Zoom links. You can also cobrowse and control their screen right along with them to get them smoothly past sticking points or troubleshoot easily by looking at console information in the Fullview Console sidebar, available right on the call. 

You can also use this as an opportunity to ask them for more detailed feedback and take notes on product and/or service improvements they suggest. 

Continue to measure and improve

CSAT isn’t a one-and-done sort of thing. It’s a metric that you will have to measure on a continual basis to see exactly how it evolves over time. You’ll need to keep a close eye on these trends to make sure that you are still meeting customer expectations and still creating memorable experiences for them that result in satisfied, loyal customers.

Fullview was made to destroy silos between customer support and product teams precisely so valuable user feedback from CSAT and NPS surveys doesn’t get lost in translation. Sign up to give it a go or book a demo so we can show you around.
Author

Shifa Rahaman

Content Marketing Manager
