‘Fake Surveys Suck!’
“Fake surveys suck!” This is a piece of comment mail I’ll never forget. Scrawled boldly across the survey in thick, black marker, the donor’s opinion of the organization’s effort to engage him couldn’t have been more succinct.
Or more true. So many variables determine a survey’s success … or its suckiness.
How many questions are ideal? Is asking a donor to rank items in priority order a good idea? What about “True/False” questions? And if the issue requires some explanation, should each question have an educational preamble to help donors decide their answers? Or is that more confusing, and could it make them feel stupid?
A review of a dozen survey packages I received in the past few months reveals mixed practices.
The number of questions ranged from six (the National Republican Senatorial Committee) to 32 (the Democratic Congressional Campaign Committee). Many surveyors grouped questions under numbered subheads, giving the illusion of a shorter or simpler survey.
Three surveys with lists of issues or multiple-choice options instructed, “Please fill in all that apply” or “You may check more than one.” Excellent, because that way there are no “wrong” answers, and donors won’t set the package aside until they have time to think it over — the very last thing we want them to do!
Many surveys asked for “optional” or “voluntary” personal information such as age, profession and voter status, but Judicial Watch probed even deeper with an inquiry into my estate-planning intentions and my interest in receiving information on planned giving. That’s a little odd mixed in with a series of questions about corruption in Congress, illegal immigration and the likelihood of rampant White House wrongdoing if Hillary Clinton is elected president in 2008 — but if it doesn’t hurt response and surfaces planned gifts for the organization, it’s a bonus.