Customer Survey 2013

Did we mention that our Annual Customer Survey is up and running? We run a survey every year, giving up a significant portion of our margin for a month in exchange for some business intelligence. Is it worth it?

Depends on who you ask, I guess. We obviously think so, but it’s still a significant amount of money to ‘give up’. If you intend to run a customer survey, there are a few things to keep in mind.

Is this Statistically Relevant?

Okay, we could probably run the survey and not give up the 5%. However, we then run into the problem of statistically relevant datasets. We generally have that problem anyway with specific categories of customers who answer the survey (for example, women customers – last year we had 16 respondents). In general, the more responses you can get, the better. There’s an actual formula (of course) to figure out the number of respondents you want or need.

If you have a customer base of 1,000, you’d want around 278 customers sampled (for a 5% margin of error at a 95% confidence level). The required number keeps going up as your customer count increases, so for all intents and purposes, in an in-house, non-professionally run survey you’ll want as many answers as you can get.
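If you’re curious, that formula is the standard sample-size calculation for a proportion with a finite population correction. Here’s a minimal sketch of it (the z = 1.96 and p = 0.5 defaults correspond to the usual 95% confidence level and the most conservative assumption about how answers split – they’re the textbook defaults, not numbers specific to our survey):

```python
import math

def sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Rough number of respondents needed to estimate a proportion.

    z=1.96 is the usual 95% confidence level; p=0.5 is the most
    conservative guess about how answers will split.
    """
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2  # infinite-population estimate (~384)
    n = n0 / (1 + (n0 - 1) / population)                 # finite population correction
    return math.ceil(n)

print(sample_size(1000))    # -> 278 for a customer base of 1,000
print(sample_size(100000))  # -> 383; it flattens out as the base grows
```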

If you can’t get statistical relevance on your data, you have to be careful about making decisions from the information. Sometimes, though, the little additional data garnered can coincide with your gut feelings, which can be enough to base some major business decisions on.

Why are you asking me this?

All too often, I’ve seen surveys (and yes, ours included) that ask questions with no real point to them. Questions on the survey should do one of two things:

a) Categorise your respondents (for example, ‘Have you purchased from us before?’ differentiates customers from browsers)

b) Provide data you can act on (in previous years, we asked how people found us so that we could target our advertising / marketing a bit better and evaluate our marketing spend).

In both cases, you are gathering data that you can actually use. Asking someone whether they are right- or left-handed, while amusing, is not very useful.

Don’t Lead

Work on keeping a neutral tone to your questions. They should be phrased so as not to lead the answers: ‘Starlit Citadel is the best game store because…’ is not a good question, especially if it precedes ‘Which of the following is the best game store?’

I’m not sure if active or passive voice matters, but I generally go with passive just because it’s less likely to have non-neutral terms in it.

Keep it Short(ish)

On one side of the equation, you want as much data as possible, especially actionable data. On the other, if the survey takes too long, you’ll lose respondents. Generally, I find that 10 – 15 minutes is about the maximum for online surveys. Again, you can go longer if you provide an incentive (our coupon code here), while no-incentive surveys mean you’ve got to keep it short.

Just remember to test both the maximum and the minimum length (i.e. if someone answers ‘no’ to all your questions, what data are you getting and how fast do they get through the questionnaire?) to get an idea of your survey’s ‘length’.

Don’t Forget to Compare

The answers we get this year are going to be a lot more useful to us than they were in year 1. Not only have we made the questions better (yeah, we did), but we will also have 2 years of previous questionnaire data to compare against. This can provide some interesting results that you can track as your company / marketing changes.

Oh yeah, don’t forget to test

Lastly, make sure to test your survey. We’ve managed to make mistakes even after testing the survey a few times, and I’m sure this year there will be mistakes or missing components too. The more testing you can do, the better.

So that’s the quick and dirty for surveys.

PPS: When creating the questions, don’t forget to ask yourself, ‘Can I get this data better somewhere else?’ I could ask people if my stock levels for products are good, but I get much better data by just keeping track of out-of-stocks, sales velocity and turn rates (i.e. actual sales data). When you can, it’s better to track what people do rather than what they say they do.
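As a rough illustration of that last point, here’s a minimal sketch of pulling a sales-velocity number out of raw order lines. The SKUs, quantities and time period are made up for the example, not our actual data:

```python
from collections import defaultdict

# Hypothetical order lines over a 4-week period: (sku, quantity sold).
order_lines = [
    ("catan", 3), ("catan", 2), ("dominion", 1), ("carcassonne", 4),
]
weeks_covered = 4

units_sold = defaultdict(int)
for sku, qty in order_lines:
    units_sold[sku] += qty

# Sales velocity in units per week; the same idea extends to turn rates
# and out-of-stock counts.
for sku, total in sorted(units_sold.items()):
    print(f"{sku}: {total / weeks_covered:.2f} units/week")
```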