The Visual8

If you can see it, you can say it.

Why Most Surveys Fail: Insights and Alternatives

While the quantity of surveys has grown, the quality of the output has only decayed.

Several factors accelerate this condition.

Internet-based means of communication, whether email or website visits, made the marginal cost of distribution close to zero. There is nothing holding back a provider, an employer, or a retailer from asking for feedback.

Software templates automate the publication of surveys. Low-to-high questions, pick-from-this-list questions, and fill-in-the-blank questions are created through wizard interfaces. You don’t need special skills to compose questions or even aggregate results.

At this point, the large volume of surveys we’ve each taken has exposed us to common ways to ask questions. This leads to the false conclusion that we are all qualified to create them.

Since the beginning of the industrial era, factory managers have looked to improve yield through statistical means. Six Sigma is a standardized approach for driving up output and driving down inefficiencies.

This mindset has soaked into our education system, too. We are conditioned to take tests to measure comprehension. We are used to a numerical calculation of our understanding of topics.

Even knowledge work cannot escape scoring. Employee reviews come with Net Promoter Scores (NPS). Conduct a training session and your audience will score their satisfaction on a scale of 1 to 10.

If you are online, you will undoubtedly be asked to fill something out. It doesn’t matter whether you just arrived at the site or not.

Research a product online and the web designer wants your opinion on the website. Complete a purchase and the vendor wants a product review. Watch a video and give it a thumbs up or down.

It is probable that you will skip many of these due to time pressures. Or indifference.

A very positive or very negative experience inspires us to fill out the survey. We are either thrilled or upset. Middle-of-the-road experiences do not drive action.

Those who have a strong relationship with the institution asking will click through the survey. Those newly introduced are less inclined.

The rare, altruistic survey respondent may undertake the effort. They choose to improve the condition for others who follow. However, that energy drains quickly with never-ending requests.

Without that honest input, we fall into the trap of self-deception.

Modest computer skills open the door to creating surveys, but nowhere have we been trained in the science of surveys. We are not likely even to appreciate the history of surveys.

The minute we depart from demographic questions into psychographic questions, we open the door to the challenges of self-reporting.

People cannot be trusted to answer truthfully because we live by a social norm not to offend. We are more likely to provide feedback on what we think you want to hear, not what you should hear.

We have not addressed statistical relevance either. Professional researchers using paper-driven surveys account for the need to collect representative samples.

Clickstream surveys assume that the need for representative samples has been erased. ALL groups are presumed to be represented. A reasonable conclusion, but where is the validation?

Surveyors believe they cover all possibilities as long as “other” appears at the bottom of the list. But choosing that option increases the burden on the respondent. This is friction. People give up with friction.

Lastly, we have to recognize that experiences are not so neatly quantified. A pick list does not reflect the weighting of that choice. Nor does it account for the nuance of actual experience.

1. Watch, don’t ask.

The ethos of Design Thinking suggests observing our stakeholders’ behavior rather than asking them what they did. This approach reveals the nuance of experience.

2. Engage in open-ended conversations.

Ask your customer to tell the story of the last time they hired a plumber. Do not ask them how they choose a plumber. In that narrative you will find the keys to their decision-making process, and it avoids self-reporting biases.

This format also provides the ability to drill down on points of interest in a way that standardized forms cannot.

3. Close the loop.

Survey fatigue exists because of the overwhelming volume. But it also exists because there is no payback.

To be clear, most are not looking for gift cards or thank-you notes. They want to know what changes you are going to make.

You have to show your gaps. You have to admit precisely where improvement is needed, whether or not that news is exploited by others.

It requires resolve to address the issues. Not just with words, but with specific details on the plan. This has implications for the budget, both in people’s time and in money to institute the fix.

Bottom line: if you want high-quality input, your feedback mechanism has to become relationship-driven rather than transaction-driven. Improvements have to be budgeted in advance. And it has to be driven by scientific approaches, not by technological conveniences.
