Should I send a survey? No.

My default answer to this question is “no.”

You might be surprised, then, to find that my team at Yammer uses surveys all the time for quick research. They can be incredibly lightweight, fast, and useful, if used appropriately.

That’s a big IF. Most people use surveys as a way to avoid talking to humans, as an attempt to prematurely ‘scale’ research, or as a (very poor) substitute for data analytics.

If you’re thinking “can I send a survey to learn this…?”, here are some questions to ask yourself.

Do you know the questions you need to ask?

This may sound silly, but much of customer research and customer development involves learning more about current behaviors and procedures. It is usually not clear what you need to ask until after you’ve started listening to someone answer your first few questions.

For example, you could ask me “What is your purchasing limit at work?” and I could answer you, correctly, that I can purchase something up to X thousand dollars.

But it’s also true that my team has a larger budget for research tools/services/incentives. And that travel rolls up to an org-wide budget; because I frequently travel up to Redmond to give internal workshops, I’ve ‘spent’ more on travel than that purchasing limit.

Not to mention that some people have different limits for annual purchases vs. one-off purchases, or have different limits for what they can put on a corporate credit card vs. what they can get a PO approved for. There are literally dozens of potential questions rolled up in that one.

None of which you’d have learned by asking me a survey question. You’d have been better off scheduling a 10-20 minute interview.

On the other hand, suppose you’re doing some introductory customer development around kids’ toys. You might learn enough from a straightforward question like “In the past month, how much have you spent on toys (for your kids or others)?” to determine whether or not to continue (assuming you didn’t ask that question in December, which would likely have given you skewed data).

Do you know the probable answers to the questions you need to ask?

Again, this may sound silly — why would you need to ask anything if you already knew the answer?!

Let’s say you want to know what tools people use for project management. So you list options for all the project management tools you know of. Maybe you even apply some rigor and choose them based on number of downloads or some analyst report.

Except some percentage of respondents use a tool only very lightly because 90% of the time they track things on a whiteboard or via email or even pencil-and-paper. So they’ll choose “Pivotal Tracker” because they do one task in it, even though they wouldn’t really miss it if it were gone.

OR some respondents will not actually know what tool they use. (It’s amazing how many people don’t know the name of a piece of software and only recognize it by the icon they double-click on.) But they will think, “well, I see all these options, and one of them must be right”, and they will GUESS. And odds are, they will guess INCORRECTLY.

You may think that freeform text fields will solve these problems (after all, people can just leave them blank if they don’t know! People can specify that they use X tool in this situation and Y tool in that one!), but they really don’t. People skip freeform text boxes when they can; if you make them required fields, you’ll get a ton of gibberish data.

For all of these cases, you’d be more likely to learn what you needed through a back-and-forth conversation. You’d be able to learn which tools they use in which situations, why they switched away from that tool last year, who makes the decisions on tool usage, or that the respondent gets assigned tasks but never actually opens the program himself.

Use these two criteria (do you know the questions you need to ask, and do you know the probable universe of answers) and you’ll probably kill more than half of your surveys before you can pop open a SurveyMonkey tab.