Online Survey Checklist - Tech Learning


Why should administrators use surveys?

In the age of data-driven decision making, collecting data and using it efficiently is key to all aspects of the education process. Online surveys are a great way to gather information on everything from lunchroom procedures to program design. This information can impact solutions for student achievement, community relations, and district management. Consider the following:

  • Evaluation: The nature of NCLB is driving schools to reevaluate comprehensive plans. Collecting information regarding opinions and beliefs allows administrators to harness the voice of many while evaluating areas for development.
  • School-community connections: Understanding where parents, teachers, students, and community members stand on budgetary needs, school safety, and even transportation procedures is essential to the success of school-to-home connections. Demonstrating the value of community by asking questions, listening, and taking action builds trust.
  • Respectful decision making: Administrators face many difficult decisions. Having hard data at your fingertips eliminates guesswork and the need to base actions on opinion, which can create emotional situations and impede real progress.

How to design questions

Careful question design is one of many considerations. The following is a list of items to contemplate when designing surveys:

  • Establish goals: Begin with defined objectives. For instance, if the goal is to gather information on support for a building project, focus on questions that align to perception of need such as class size, condition of existing facilities, and viewpoints regarding the impact of projected demographic growth.
  • Define audience: A clear picture of the population addressed is necessary. If opinions of community members that do NOT have students currently enrolled are desired, focus accordingly.
  • Anonymity: Decide whether it is pertinent to allow for anonymity of respondents.
  • Design questions worth asking: Be sure questions are neutrally worded to avoid bias, concise, and easy to understand. Keep each question focused on a single topic. Consider using third-person wording. Someone may be more honest about their own feelings when asked, "How do your colleagues feel about..." rather than "How do you feel about..."
  • Do not ask questions you are not prepared to have answered: It is inappropriate to disregard data simply because the results do not support the original hypothesis. When all factors that could have skewed survey results have been ruled out (sample breadth, poorly worded questions), the data that remains is telling.
  • Consider scale and question types: Eliminating the "middle" in scaled responses ("3" in 1-5 scales) forces someone to "take a side." Surveys are designed for feedback, so consider whether a "no opinion" choice is valid to the process. Is there need for open-ended responses? Also, there is a difference between: "Do you want a new high school, yes or no?" and, "Rate community perception of need regarding facility expansion on a 1-4 scale."
  • Group themes and types of responses: Theme-clustered questions create clarity in the response process. Like-response types such as a Likert scale or multiple-choice should be grouped for consistency.
  • Keep surveys short: Provide directions with a completion time that has been tested, not estimated. A parent might give five to 10 minutes to provide feedback; a faculty member with allotted time may focus for 20 to 25 minutes.
  • Trial run: Use test groups for feedback. Examine sample data to be sure results align with objectives. Be prepared to redesign.
  • Getting the word out: Newsletters, listservs, Web sites, flyers, calendars, newspapers, radio, and Board of Education meetings can all be used for publicity.
  • Time frame: Make sure the time frame is long enough to reach your audience, but not long enough to allow procrastination. Faculty-level surveys may take two business days, while community surveys might be available between two Board of Education meetings.
  • Application of results: Quickly analyze results. When constituents are asked for input, expectations are that the input will be applied. Knowing how to use data is as important as collecting it. Training staff regarding the use of data is time and money well spent.
  • Presentation: Take advantage of charts, graphs, and reporting features to create professional materials.
  • Directions: Make no assumptions that the audience has the skills to effectively participate. Provide clear directions outlining goals, time required, and other pertinent information.
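Scaled responses like those discussed above are straightforward to tabulate once collected. As a minimal illustrative sketch (the sample responses are hypothetical, not drawn from any real survey), a few lines of Python can summarize a 1-4 scale question:

```python
from collections import Counter

def tally_likert(responses, scale=(1, 2, 3, 4)):
    """Count responses for each point on a Likert-style scale.

    Returns {rating: (count, share_of_total)}.
    """
    counts = Counter(responses)
    total = len(responses)
    return {r: (counts.get(r, 0), counts.get(r, 0) / total) for r in scale}

# Hypothetical raw responses to a single 1-4 question
responses = [4, 3, 3, 2, 4, 4, 1, 3, 2, 4]
summary = tally_likert(responses)
```

Note that a four-point scale like this one has no "middle," so every tallied response represents a respondent who took a side.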

Choosing the right tool

There is a multitude of choices when it comes to selecting technology-enhanced survey tools. The same attention should be given to selecting the right survey product as to evaluating software or textbook purchases. The following are considerations to be aware of when seeking the best survey or remote-response tool for your school or district.

  • Question formats: Multiple-choice, open-ended, dual-scale, and various other question types are available. Not all tools offer the same question variety or flexibility.
  • Branching options: "If you answered no, skip to question 10." Some tools will automatically skip to a question based on the response.
  • Question banks: Sample or template question banks are sometimes available.
  • Required responses: Can questions be marked as required before a respondent may advance?
  • Distribution: Some tools offer features that distribute surveys and track responses.
  • Visual interface: What are the graphic design features of the program?
  • Technical: What are the related hardware or software requirements?
  • Results and data storage: How long is survey data stored, in what way are results presented, who owns it, and what are the graphing and import/export features? Downloads range from universal formats such as .CSV to proprietary databases. How easily can information be imported to presentation formats?
  • E-mail: Are there e-mail options to send results to system managers automatically? If respondents provide e-mails, are features available to quickly use this information? Is there an auto-reply feature?
  • Passwords: Determine if password protection of survey access is needed and available.
  • Accounts: Can a user finish an incomplete survey at a later time?
  • User friendliness: How much training is needed? Features such as wizards and tutorials may be included. What skills will the audience need to respond?
  • Price: Many online survey tools have free or trial accounts. Services structure pricing by the survey, number of questions, number of responses, or by monthly/yearly fees. How many user accounts are being purchased? With handhelds, look for software licensing, number of units, features, warranty and support, and hardware cost. Software should allow for open-ended questions without needing "correct responses." Examine recurring costs.
  • Support: Phone, online, e-mail, and chat are all forms of technical assistance. Determine what is offered. During what times (and in what time zones) is support available?
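Because most tools can export results in universal formats such as .CSV, the data can also be pulled into your own tools for further analysis. A minimal sketch, assuming a hypothetical export file and column name (no particular product's export layout is implied):

```python
import csv
from collections import Counter

def load_question_column(path, column):
    """Read one question's non-blank responses from a CSV export."""
    with open(path, newline="") as f:
        return [row[column] for row in csv.DictReader(f) if row[column]]

# Hypothetical usage: an export named results.csv with a "Q1" column
# answers = load_question_column("results.csv", "Q1")
# print(Counter(answers).most_common())
```

From a list like this, response counts can be fed into any charting or presentation tool the district already owns.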


Planning for implementation

Careful planning is crucial. The following are points to consider:

  • Access to technology: Does the audience have access to the technology required? Make computers in schools available outside business hours. Making arrangements with local libraries and community centers can also be a solution.
  • Anonymity: Even if it is stated in advance that the survey is anonymous, some people believe they can be identified by the "technology." Education on how it all works will ease this concern.
  • Multiple responses: Various tools can limit the number of responses accepted from a specific person or workstation, eliminating worry about duplicate submissions.
  • Junk mail: Distributing the link to a survey via e-mail could result in messages being deleted as junk or spam if the receiver does not recognize the sender. Consider sending the message from the outbox of a superintendent or principal.
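If the chosen tool does not control duplicate submissions for you, the same result can be approximated after collection by keeping only the first submission per respondent. A sketch, assuming each submission carries some identifier (the field names here are hypothetical):

```python
def first_submission_only(submissions, key="respondent_id"):
    """Keep the earliest submission for each respondent, preserving order."""
    seen = set()
    unique = []
    for sub in submissions:
        ident = sub[key]
        if ident not in seen:
            seen.add(ident)
            unique.append(sub)
    return unique

# Hypothetical submissions, including a duplicate from one respondent
subs = [
    {"respondent_id": "a1", "answer": "yes"},
    {"respondent_id": "a1", "answer": "no"},
    {"respondent_id": "b2", "answer": "yes"},
]
```

Note that this approach trades away anonymity, since submissions must carry an identifier; weigh that against the anonymity considerations above.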

Wrapping it up

Access to survey design and electronic implementation tools greatly enhances the data-driven decision-making process. Administrators who take advantage of these systems find them invaluable. We have witnessed the success of many district-based projects, and our own programs have been forever impacted.

Catherine Parsons is currently a regional professional development administrator in the Poughkeepsie, NY region.

Jodi DeLucia, with her background in Science and Adult Education, currently serves as a professional development specialist in the Poughkeepsie, NY region.


Remote responders help trainers gather data to customize staff development sessions.

As regional professional developers in a New York county with more than 100 school buildings, we've often found ourselves challenged in training sessions because of a dearth of information about the needs of our audience. In some cases we've had little more to go on than grade span, number of participants, and a broad target topic. Having information about the professional development history of our educator audience would allow us to customize the session to better meet their needs.

So with the goal of being able to assess on-the-spot "who you are, why you are here, what you know, and what you need to know," we opted for a remote responder system. What worked best for our purposes was eInstruction's Classroom Performance System, with 32 handheld responders for on-the-fly questioning (see "Tools We Use" below). We found that kicking off workshops with a quick needs assessment via handheld response immediately engages audiences and helps us modify presentations. If our quick survey shows this is the fourth time in three years they've had an "introduction to differentiated instruction," for instance, we can immediately jump to another relevant topic. The result is maximized hands-on learning time and more instruction-ready, appreciative learners.

Remote responders such as this one from eInstruction make it easy for professional developers to collect key data from participants before and during training sessions.

Tools We Use

There are many tools available on the market. The following is a list of those we use on a regular basis and with which we have had the most success:

eInstruction: Thanks to local educator Janet Walter, we were introduced to the Classroom Performance System (CPS). A set of 32 handheld response pads with software runs approximately $1,995. We have found that these handhelds are simple for participants to use. The software is diverse and easy to navigate, and it integrates with common presentation formats. It is great for on-the-fly questioning, making it perfect for use at faculty meetings or board presentations. Both Mac and PC formats are available, which is a plus!

Qwizdom: Easily integrating with Microsoft PowerPoint, Qwizdom Interact software allows for the creation of multimedia lessons, questions, and surveys. We have found the immediate feedback as well as two-way remote interaction to be constructive in meetings, presentations, and workshops. Qwizdom comes in handy when using prepared materials or placing questions in an interactive, game-like design. There is a wide variety of question formats available. The cost of 16 participant remotes, one facilitator remote, and the corresponding software runs around $1,175.

At a New York State Staff Development Council event, a Web-based service was highlighted to us as an effective way to create and manage online forms and surveys. We have found that it's flexible, allowing for numerous design and question formats. The branching logic is easy to understand. Pre-designed backgrounds and templates are available, or you can import your own graphics. Respondents can complete forms at a later date, and password protection is available. There are also various e-mail options integrated into the system. Costs range from free to $1,000 per year. Results can be viewed and accessed in dozens of formats. All around, this is the tool with which we have had the greatest success.


