Qualtrics Survey Hosting - Survey Best Practices
This document outlines some general best practices for creating effective surveys with Qualtrics.
Survey Best Practices
Implementing an effective survey can be complex. Issues such as developing valid questions, identifying an adequate sample size, and following other best practices should all be taken into account as you develop your survey. To provide some basic assistance, we have put together the best practices below.
However, you may wish to consult with other campus resources before, during and/or after your survey implementation. If so, we encourage you to read Qualtrics Survey Hosting - Survey Resources.
There are known accessibility and usability issues with certain Qualtrics question types. As you design your survey, please keep in mind that these question types can make it difficult or impossible for individuals with disabilities to participate in your research or respond to a web form. For more information about the barriers users may experience, see Qualtrics - Accessibility & Usability Information KB.
The Qualtrics Survey Hosting Service will only be helpful if used carefully. Simply sending surveys to everyone will soon train people to delete every survey message they receive. Be discerning in your decision to survey campus stakeholders; do not simply send to everyone. Remember, you are one of dozens, if not hundreds, of people using this tool to collect information.
- Response rates to web surveys typically range from 30-60%.
- Select only classifications of students, faculty, staff or customers that are appropriate for the topic of study.
- Select the smallest sample size that meets your needs. The figure may be smaller than you think. There are different formulas for calculating sample size. Creative Research Systems provides a sample size calculator.
- Track which individuals have responded, and only send reminders to those who have not yet completed the survey.
- Be respectful of those who do not wish to participate. If someone requests to be removed from follow-up mailings, please remove them.
- As with any research conducted on campus, federal human subject regulations may apply. You should contact the UW Institutional Review Board and review any other relevant policies before implementing your survey.
- For assistance with sampling, weighting, instrument and question design, construction of data sets and other survey management needs, you can contact University of Wisconsin Survey Center or consult our other resources: Qualtrics Survey Hosting - Survey Resources.
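If you would rather estimate a sample size yourself than use an online calculator, the sketch below shows one common approach: Cochran's formula with a finite population correction, assuming simple random sampling. The function name and defaults here are illustrative, not part of any Qualtrics tool.

```python
import math

def sample_size(population, margin_of_error=0.05, confidence=0.95, p=0.5):
    """Estimate a needed sample size using Cochran's formula with a
    finite population correction (assumes simple random sampling).

    population      -- total number of people you could survey
    margin_of_error -- desired precision, e.g. 0.05 for plus/minus 5%
    confidence      -- confidence level; 0.90, 0.95, and 0.99 supported here
    p               -- expected proportion; 0.5 is the conservative default
    """
    # z-scores for the common confidence levels
    z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}[confidence]
    # Sample size for an effectively infinite population
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    # Correct for a finite population
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# For a population of 10,000 at a 95% confidence level and a 5% margin
# of error, this yields 370 respondents.
```

Because web response rates are often well below 100%, divide the target sample size by your expected response rate to size the invitation list (e.g., 370 needed responses at a 40% response rate means inviting roughly 925 people).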
- A survey question should contain only a single concept so that you can clearly define what you're measuring. Don't use a compound statement such as "The training environment was comfortable and the right temperature"; that statement combines two different ideas.
- Use a 5-point Likert scale unless there is a compelling reason to do otherwise. The five values should be both clear and distinct from each other. The "distance" between values should be as equal as possible, since the responses are treated as interval data. For example, you would not use a scale of "never," "seldom," "occasionally," "often," "always," because "seldom" and "occasionally" are too similar.
- Yes/no questions should, to the extent possible, have only two options: yes and no. For instance, for the question "Do you work outside the home?" the options should not be "Yes," "Yes, but only part-time," and "No." This is a guideline; there are times when "I don't know" or "Not applicable" is appropriate.
- Think about how a question will look in the final report, both visually and with respect to text. Answer text for a single question should be phrased consistently so that the answers are easy to read when listed in a table. For example, if the question is "To what extent do you use cell phones?" the possible answers should read "Cell phones are not used," "Cell phones are used at least weekly," "Cell phones are used every day," etc., rather than "We do not use cell phones," "Cell phones are used at least weekly," "Every day," etc.
- Use options such as "don't know" and "not applicable" sparingly, otherwise it gives respondents an easy way to skip past a question.
- Use page condition (branching) questions as sparingly as possible. They can be more difficult to implement and tend to make analysis more error-prone. On the other hand, they allow respondents to skip sections that are not relevant to them.
- For "Check all that apply" and "Select up to three responses" questions, keep the list from getting too long. No more than 10 answers is a good guideline. Use "check all that apply" if you want to know everything that applies, and use "select up to three" if you want to get a sense of priority. It's sometimes useful to ask a respondent's "top priority," so that that choice can be singled out for analysis. If there is a "does not apply" answer, this should be listed first so that people don't waste time reading the rest of the list.
- Use "other" as often as necessary as a possible response, but use discretion when giving the option to "describe other (optional)." Only use that option for important information as it takes more time for the respondent to complete and adds additional information to analyze in the final data set.