Many companies use research as a form of content marketing – whether survey-based studies that delve into industry trends or analyses of internal user data – to demonstrate expertise and make brand points.
However, all that renewed interest comes with a learning curve. Original research is one of those areas where a lot of mistakes can happen, especially if you’re inexperienced. Most of the problems I encounter relate to one of two things: poor survey design or faulty statistical analysis.
Today, I will focus on one aspect of survey design – the survey experience.
Survey experience is about the extent to which survey participants find the questions relevant, intelligent, and worth answering. Can they (and do they feel motivated to) complete your survey? Will they answer honestly and openly? Would they respond to a future survey based on this experience?
A good survey experience isn’t just a nice-to-have – it promotes completion and accuracy. A bad survey experience can hurt your survey results (more on that shortly).
After 20 years of working with clients on research as content, I pay close attention to these eight experiential factors.
1. Considerations before and after the survey
The survey participant experience begins before they start the survey and continues after they complete it. In your survey invitation, be sure to explain why you’re conducting the survey, what you’ll do with the data, and how long the survey should reasonably take. (Keep these explanations brief – you don’t want someone to drop out before they even start.)
Enable your survey platform’s GDPR opt-in setting if you plan to collect responses from anyone in Europe. Also, keep in mind the post-survey experience. Someone who completes the survey should see a custom thank-you page, not the default one provided by your survey platform.
If you collect email addresses from survey participants, be clear about your intentions. For example, we collect emails only from people who want a copy of the final report or who enter a drawing for completing the survey. We never use the emails for any other purpose (in fact, I cut the email column out of my spreadsheet and paste it into a separate tab to keep identities separate from responses).
If survey participants believe their answers will be used to market to them in any way, they will not want to participate in your survey.
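To make that separation concrete, here is a minimal sketch in pandas – the column names and data are invented for illustration – that cuts the email column away from the answers so identities and responses live in separate tables:

```python
# Minimal sketch, with made-up column names: separate respondent
# identities from their answers before analysis or sharing.
import pandas as pd

raw = pd.DataFrame({
    "email": ["a@example.com", "b@example.com"],
    "q1_satisfaction": [4, 5],
    "q2_recommend": ["yes", "no"],
})

# Cut the email column into its own table (e.g., a separate spreadsheet tab)...
identities = raw[["email"]].copy()

# ...and keep an anonymized response table for all analysis and sharing.
responses = raw.drop(columns=["email"])
```

Each table can then be saved to its own tab or file, so the copy you analyze and share never contains an identity.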
2. Survey duration
Don’t you love the “please take our 20-minute survey” invitation? It’s a hard no for me. Unless you’re paying someone to take your survey, never go longer than eight minutes – and even eight minutes is a big ask.
Consider this: Completion rates decrease for every additional minute needed to answer the questions. In my experience, they start falling off a cliff at about the eight-minute mark. Editing the survey for length is an absolute necessity – and it’s a great way to ensure you stay focused from an editorial standpoint.
3. Question length and complexity
Most survey platforms warn you when questions and/or answer options run too long. That’s because long questions or answers lead to fatigue, speeding, and misunderstandings or errors among survey participants – and they look like hell on mobile devices. Avoid long questions and answers unless length is absolutely critical to one or two questions.
Also, be careful with compound questions (e.g., “Does your work give you satisfaction and pride?”). Reduce each question to one idea or variable so your survey participants can answer easily and you can report the results clearly.
4. Question order
Your survey should be paced like a conversation between two strangers. Don’t lead with your most sensitive, probing questions. Wait until survey participants can see your research as worthwhile based on the quality of your survey questions. They may then be willing to share more sensitive details. Income is always a sensitive topic, but other questions can give people pause, too, such as plans to quit a job or anything that might reveal sensitive company information.
I recommend a couple of workarounds for this dilemma: Put those types of questions at the end of your survey and make them optional (or add a “prefer not to answer” option). You can even remind survey participants at sensitive moments that their answers will be completely anonymized and never used for any other purpose.
5. Demographic questions
Capturing demographic information is essential to ensuring the survey sample is representative of the audience you’re trying to research. Demographic responses also expand your options for interesting “cuts” of the data, showing you how different groups (e.g., generations) compare.
In recent years, survey designers have evolved the way they ask demographic questions to be more inclusive. For example, they make sure questions about sexual or gender identity use language that includes rather than alienates – an important aspect of the survey experience.
The challenge is balancing inclusiveness with brevity. For example, instead of listing dozens of options for gender identity, I limit the answer choices to male, female, non-binary, and prefer to self-identify (write in). The self-identify option ensures everyone has an appropriate choice, and fewer answer choices make it more likely you’ll end up with segments large enough to compare.
Your survey platform can be a good resource when designing demographic questions. (Platforms usually have a library of questions to draw from; SurveyMonkey’s library is particularly good.) I also like to see what major research institutions like Pew Research Center use for demographic questions. Whether I’m asking about gender, race, ethnicity, age, gender identity, or other characteristics, I consult established surveys to find consensus.
When does this advice go out the window? When gathering specifics is a primary objective of your research (e.g., research focused on gender identity) and commonly used demographic questions don’t provide the specificity you need.
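To illustrate those demographic “cuts,” here is a small sketch in pandas – the column names and data are invented – that compares one result across generations and reports each segment’s size:

```python
# Hypothetical example: compare an answer across demographic segments
# and check that each segment is large enough to report on.
import pandas as pd

responses = pd.DataFrame({
    "generation": ["Gen Z", "Millennial", "Gen Z", "Boomer", "Millennial"],
    "q1_satisfaction": [4, 5, 3, 4, 5],
})

# Mean satisfaction per generation, plus the size of each segment.
cuts = responses.groupby("generation")["q1_satisfaction"].agg(["mean", "count"])
print(cuts)
```

With real data, you would also set a minimum segment size before reporting a cut, so small groups don’t produce misleading comparisons.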
6. Self-serving questions
Don’t even think about asking self-serving or promotional questions. I always seem to work with companies that want to slip in some not-so-subtle questions to promote their products or services. The problem? Your survey respondents are smart – they will see through the question and may even punish you for it.
The funniest example was a client I worked with a few years ago who insisted on asking, “Which of these do you prefer?” about analytics dashboards, using an illustration of each. One image was clearly the more advanced option (and belonged to the client); the other was the primitive Stone Age option. Guess what? A third of respondents chose the Stone Age option. I suspect they knew it was a setup. The results couldn’t be used.
7. Survey testing
Testing your survey is the most important thing you can do before releasing it into the wild. Recruit at least five people (more is better) to take the survey and comment on ANY issue that gives them pause. These individuals should be part of your target study group so they can sanity-check question wording and answer choices.
My company pays testers to make sure they take it slow and document all their questions and concerns. Are any questions unclear? Are the answer choices reasonable? Can they answer every question, or do some questions not apply? Make sure some testers take the survey on mobile and others on desktop to validate both experiences.
Once your testers finish, run automated tests through your survey platform, which generates mock responses automatically. Then go through your summary report. These mock responses can help you identify problems with survey logic and answer piping.
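If your platform’s automated testing is limited, you can simulate responses yourself. The sketch below – the question names and skip rule are invented – generates random mock responses and verifies one skip-logic rule holds for all of them:

```python
# Hypothetical sketch: generate mock survey responses and validate a
# simple skip-logic rule (question names and the rule are made up).
import random

def mock_response(rng):
    r = {"uses_tool": rng.choice(["yes", "no"])}
    # Skip logic: the follow-up rating is asked only after a "yes".
    r["tool_rating"] = rng.randint(1, 5) if r["uses_tool"] == "yes" else None
    return r

rng = random.Random(42)  # seeded so the run is reproducible
mocks = [mock_response(rng) for _ in range(100)]

# Validate: no respondent who answered "no" should have a rating.
bad = [m for m in mocks if m["uses_tool"] == "no" and m["tool_rating"] is not None]
assert not bad
```

The same pattern extends to piping checks: assert that any answer carried forward into a later question matches the earlier response.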
8. A contact for questions
Be sure to include an email address for people with questions – in the introduction, on disqualification pages, and in the footer. If you’ve designed a great survey experience, you may not receive any emails (we rarely do). But providing the option serves as an early-warning system for any survey issues missed during testing. (Note: You generally can’t edit a question once the survey is live, but you can choose to restart the survey or remove the offending question.)
A better survey experience brings big benefits
Why is experience so important? An easy experience increases your completions and sample size – enhancing your research credibility and making it more likely you’ll have interesting stories to tell. In addition, a good survey experience signals to participants that the study is worthwhile, which is critical when you ask customers or other community members to participate.
Please note: All tools mentioned are suggested by the author. Feel free to add your favorite tools in the comments (from your company or ones you’ve used).
If you want to be among the first to see the results of CMI’s annual benchmark and trends research or related reports, register to receive the daily or weekly newsletter.
Cover photo by Joseph Kalinowski / Content Marketing Institute