
Survey Development Best Practices

Included below are general and campus-specific tips to help you develop surveys. 

Critical questions to consider before beginning

  • What do you want to know?
    • This is not a question about ALL the things you could possibly ask (a brainstorming activity), but a fundamental exploration of what is important to know (not just interesting) to advance your work. Understanding the core purpose of your survey should guide every step of the process, from determining your target audience to crafting the questions, choosing the distribution method, and deciding how results will be reported. Don’t skip this step!
  • Who are you going to ask?
    • Is your population narrowly defined (e.g., students who participated in your program), or do you want to generalize your findings to a larger population (e.g., all graduate students)? Knowing who your audience is will help determine the kinds of questions asked and the distribution method.
  • How do you plan to use the data collected? (also see later section on reporting results)
    • Begin with the end in mind. What possible implications for policy and practice will result from your data? Who will want to see and use the information gathered?
  • Is there existing data that might already provide helpful information?
    • Data is already collected from every corner of our campus. In addition, national studies already explore topics that may be of interest to you (e.g., leadership, alcohol use, engagement). Is there anything you are planning to ask in your survey that you already know or could find from another source?
  • Is a survey the best option for you?
    • Surveys are just ONE way to collect data. They may seem like a quick and easy solution, but they can limit the types of information you gather and the depth you can reach. Would a focus group, direct observation using a rubric, or another option better serve your core purpose?
  • When do I need to consult with SAALOG (Student Affairs Assessment and Learning Outcomes Group)?
    • While we do not require approval for most surveys to be created or distributed, we do recommend you consult with a SAALOG member (see link on the left) for projects that are division-wide or major campus initiatives, when you will be asking for confidential data from the Registrar (e.g., emails or other personal identifiers), or if you plan to send a survey to more than 2,000 people. SAALOG members can help you with survey development and distribution.

Survey design

  • A project manager should shepherd the process forward
    • When developing a new survey, it is best to designate one point of contact.  This person will be responsible for moving the process forward, soliciting and evaluating stakeholder feedback, and determining the items included in the final survey.  Of course, committee members or other stakeholders provide critical feedback, but assigning someone primary responsibility for overseeing the project will help ensure best practices are followed.
  • Develop good questions
  • Obtain feedback from multiple sources (pilot, topical expert, survey expert)
    • Once you have finalized a draft of your survey, it is best to solicit feedback.  There are a variety of options for doing so, and it is best to solicit feedback through more than one of the following methods:
      • Pilot your survey – are there student employees who work in your office?  Ask them to complete the survey, noting any items that were confusing and places where the response they wanted to give to a question wasn’t available.  You might also want to ask students about the “flow” – does the progression of the survey make sense?
      • Ask a topical expert – perhaps there is someone else on campus with more advanced knowledge of some of the questions asked on the survey than you.  That’s ok!  Ask them if they wouldn’t mind reviewing the survey for you and making sure your survey items are appropriate.
      • Ask a survey expert – you can also ask for feedback from someone who is an expert in survey design.  The members of SAALOG can help provide feedback on the types of questions and general design of your survey.
  • Sampling
    • Consider the questions you are interested in.  Are you hoping to get a sense of what services all university students feel are missing on campus with regard to your unit's function?  Are you hoping to get feedback on the learning outcomes you established for the 40 students who participated in your program?  Having a clear idea of what body of students you would like to generalize to will help inform your sampling procedure (see the sketch after this list for a simple random-sampling example).  Learn more about "Sampling and Sampling Sizes."
  • Length
    • Considering the length of your survey is important.  The shorter your survey, the more likely students will complete it.  For program assessments, it is best if the survey takes less than 5 minutes to complete.  Consider the types of questions you are asking as well; open-ended items tend to take more time to respond to than multiple response questions.
  • Response Rates
    • Having a high response rate is important for being able to “trust” the data you collect.  Low response rates introduce bias into your results and make it difficult to interpret them accurately.  With the move to conduct more surveys online, students are being asked to complete more surveys than in the past, and response rates are dropping.  In general, you should strive for at least a 25% response rate (what counts as "acceptable" will vary with the target population and survey purpose; the sketch after this list shows the calculation). Learn more tips on how to achieve an acceptable response rate.
  • Incentives
    • One method to increase response rates is providing incentives.  As a state institution, we face some limitations in terms of what we may provide students.  See this document for incentive suggestions specific to the University of Maryland and other considerations.
  • Timing
    • Think carefully about when you want to distribute your survey to students.  If it is a program evaluation, sending students the survey closer to the end date of the program will increase the likelihood that they will respond.  If your survey is not tied to a specific program, consider the academic calendar, stressful times during the semester, and religious holidays when determining your administration timeline.
  • Distribution methods (paper, online, mobile)
    • There are a variety of ways to get your survey in front of students.  You could email students a link to an online survey – this will likely increase the number of students you can reach and makes it easy to send reminders.  You could distribute a paper survey; this may be more costly and limits the complexity of the survey design, but response rates are generally much higher with paper than with online surveys.  You could also design a mobile survey that students can complete on a smartphone or other mobile device.  (Someone in SA) has (iPods? iPads?) that can be checked out for an event. Mobile surveys should be extremely short and include no open-ended items.
  • Qualtrics
    • Qualtrics is the survey platform provided to everyone at the University of Maryland.  It is an advanced survey design tool; however, the support available focuses on building surveys in Qualtrics, not on providing feedback on survey content.  The Division of IT offers workshops on how to develop surveys in Qualtrics.
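
To make the sampling and response-rate guidance above concrete, here is a minimal Python sketch using only the standard library. Everything in it is an assumption for illustration: the roster, the 2,400-person population, the conventional 95% confidence / ±5% margin-of-error defaults, and the completion count are not campus standards.

```python
import math
import random

def sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Sample size needed to estimate a proportion at the given
    margin of error, with a finite population correction."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# Hypothetical roster; in practice this would come from your
# program records or a Registrar data request.
roster = [f"student{i}@example.edu" for i in range(2400)]

n = sample_size(len(roster))        # about 332 for this roster
invited = random.sample(roster, n)  # simple random sample

# After the survey closes, compare against the 25% rule of thumb.
completes = 95                      # replace with your actual count
rate = completes / len(invited)
print(f"Invited {len(invited)}; response rate {rate:.0%}")
```

If your population is small and well defined (e.g., the 40 participants in your program), skip the sampling step and survey everyone; the calculation above matters mainly when you want to generalize to a larger group.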

Human Subjects

  • Consent: When engaging in an assessment, you should first ensure that the students have consented to be a part of the process.  Telling them how their responses will be used and providing the contact information of the staff member in charge of the assessment are key things to consider.
  • Confidentiality: When performing any assessment, it is a best practice to assure respondents that their answers will be kept confidential.  That is, you may know who expressed what opinion for any given project, but you may not release this information to others.  This helps ensure that students will give you their honest feedback.
  • Institutional Review Board (IRB): Sometimes, an assessment that you conduct may contribute knowledge that would be of use to the whole field.  You may wish to publish these findings or present them at an external conference.  By gaining IRB approval before you begin your project, you ensure that these avenues are open to you should you decide to pursue them.  It is the role of the IRB to protect human subjects (i.e., the students), and to that end they will review your research design thoroughly to ensure it meets their ethical guidelines.
  • FERPA (Family Educational Rights and Privacy Act): Under FERPA, you may not release information that includes identifiers.  That is, if you release the results of your assessment publicly, you must first ensure that the students who responded to your request for feedback (be it a survey or another method) cannot be identified through your report.
  • Data Security: Adequate provisions must be made to maintain the confidentiality of identifiable information and data collected. Data should be stored in locked file cabinets/offices and be password protected. Personally identifying data should be deleted from data sets when appropriate or after a set period of time, and access to data should be limited (see the de-identification sketch below).
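
As one concrete way to act on the confidentiality, FERPA, and data-security points above, here is a minimal de-identification sketch in Python using pandas. The file and column names are hypothetical; adjust them to match your own export, and store the key file (if you keep one at all) in a restricted location.

```python
import pandas as pd

# Hypothetical raw export; file and column names are illustrative.
responses = pd.read_csv("survey_export.csv")

IDENTIFIERS = ["name", "email", "student_id"]

# Keep a separate, access-restricted key only if you genuinely
# need to re-contact respondents; delete it after a set period.
responses[IDENTIFIERS].to_csv("id_key_restricted.csv", index=False)

# De-identified copy for analysis and any shared reports.
responses.drop(columns=IDENTIFIERS).to_csv(
    "survey_deidentified.csv", index=False
)
```

Dropping direct identifiers is a minimum step; respondents in small subgroups can still be identifiable, so check cross-tabulations with small counts before publishing results.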

Obtaining sample data

  • Types of data you can obtain
  • Process (Lisa, Registrar, existing data)
    • Issues of approval and access are important to consider early on in the assessment process.
  • Listserv v. confidential data

Reporting and using the findings

  • Best practices
    • When reporting your data, it is important to not just provide raw numbers, but to provide context and interpret the findings for the reader. It is also recommended that you review your results with others to find additional meaning in the data.
    • Results that are not shared are less likely to be used, which in turn defeats the purpose of the survey process.
  • Maintaining records
    • If you left your position and your replacement wanted to duplicate your survey, could they do it? While a copy of the survey and a report of the results should be readily available, you should also leave detailed notes regarding sampling, distribution, analysis (including any coding, syntax, etc.), and reporting (who it was sent to, notes from presentations given, questions you received that would inform future surveys, etc.). If your survey was developed in Campus Labs, you can also enter project notes there.

Next steps / the assessment cycle

  • Assessment as an iterative process
    • Assessment is iterative in nature. Findings should support programmatic changes, which in turn inform new or revised assessment measures.
  • Incorporating assessment into your planning process
    • To be most powerful, assessment should be tied to your annual planning processes. When you are determining goals for the year (or longer-term strategic planning), asking “how will we know if we are successful?” will help drive the assessment process forward. Take time each year to ask, “what do we need to know to improve our services (or better understand our students)?” and map out timelines for your assessment measures.
    • Here is a helpful assessment plan template to assist!
