Surveys and interviews form the central methodology for discovering and analyzing attitudes and opinions in social science research. With the advent of the Web, online surveys have become an efficient way for researchers to collect and analyze large amounts of data. The popularity of online survey tools such as SurveyMonkey, Zoomerang, and SurveyGizmo is a testament to the productivity enabled by surveys. However, surveys represent a rigid, top-down methodology that forces the survey designer to account for all possible answers up front, which is an impossible feat. In contrast, interviews allow unanticipated information to bubble up from the respondents. For instance, the primary data for Integrity Watch Afghanistan (IWA)'s Afghan Perceptions and Experiences with Corruption: A National Survey 2010 involved interviewing 6,500 randomly selected respondents in 32 provinces on over 100 questions covering the sectors where people experienced corruption; the levels of bribes people paid to obtain services; what type of access people had to essential services; whom people trusted to combat corruption; and experiences with corruption in the judiciary, police, and land management. However, the interview methodology is expensive and time-consuming, as it requires implementation by research companies with expertise in effective research design and precise management of data collection over several months.
Is there an alternative to surveys and interviews in social science research? Prof. Salganik’s team at Princeton came up with a hybrid approach, “wiki surveys”, that combines the structure of a survey with the open-endedness of an interview. To date, various organizations have created more than 1,000 wiki surveys on the project Web site, All Our Ideas, generating 45,000 ideas and 2 million votes. Wiki surveys range from the New York City Mayor’s Office’s engagement with citizens in shaping the city’s long-term sustainability plan to the Catholic Relief Services surveying their 4,000 employees to find out what makes an ideal relief worker. The figure below shows how the third question in the Tactical Conflict Assessment Planning Framework (TCAPF) would be implemented as a wiki survey:
Inspired by extending the kittenwar concept to ideas, the user interface guides the respondent to choose between two randomly selected alternatives, while encouraging respondents to add their own ideas into the mix of alternative responses. The additional ideas enter the survey’s marketplace and are voted up or down by the other survey-takers. Prof. Salganik says that “One of the patterns we see consistently is that ideas that are uploaded by users sometimes score better than the best ideas that started it off. Because no matter how hard you try, there are just ideas out there that you don’t know.”
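The mechanics above can be sketched in a few lines of code. This is a minimal illustration, not All Our Ideas' actual implementation: the real system uses a more sophisticated Bayesian model to estimate each idea's probability of beating a randomly chosen rival, whereas the sketch below uses a naive observed win rate. All class and method names are hypothetical.

```python
import random
from collections import defaultdict

class WikiSurvey:
    """Illustrative sketch of a wiki-survey idea marketplace.

    Respondents vote on random pairs of ideas and may upload new ideas,
    which immediately join the pool of alternatives.
    """

    def __init__(self, seed_ideas):
        self.ideas = list(seed_ideas)          # the designer's initial alternatives
        self.wins = defaultdict(int)           # pairwise contests won per idea
        self.appearances = defaultdict(int)    # pairwise contests entered per idea

    def next_pair(self):
        # Present two distinct, randomly chosen ideas to a respondent.
        return random.sample(self.ideas, 2)

    def vote(self, winner, loser):
        # Record one pairwise outcome.
        self.wins[winner] += 1
        self.appearances[winner] += 1
        self.appearances[loser] += 1

    def add_idea(self, idea):
        # A respondent's uploaded idea enters the marketplace.
        if idea not in self.ideas:
            self.ideas.append(idea)

    def score(self, idea):
        # Naive score: fraction of pairwise contests this idea has won.
        shown = self.appearances[idea]
        return self.wins[idea] / shown if shown else 0.0

    def ranking(self):
        # Ideas ordered best-first by observed win rate.
        return sorted(self.ideas, key=self.score, reverse=True)
```

An uploaded idea that consistently wins its pairings climbs past the seed alternatives, which is exactly the pattern Salganik describes:

```python
survey = WikiSurvey(["Repair the roads", "Build more schools"])
survey.add_idea("Reduce bribery at checkpoints")   # respondent-contributed
survey.vote("Reduce bribery at checkpoints", "Repair the roads")
survey.vote("Reduce bribery at checkpoints", "Build more schools")
survey.ranking()[0]   # the uploaded idea now ranks first
```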
All Our Ideas has some basic visualization features to make sense of the wiki survey responses. Here is the visualization of the responses to “What do you think the Digital Public Library of America (DPLA) should be like?”:
It is worth noting that the top-scoring 15 ideas, starting with DPLA interoperability with the Government Printing Office (GPO), the Defense Technical Information Center (DTIC), and the National Archives and Records Administration (NARA), are all uploaded ideas that were not in the original set of alternatives. A powerful argument for crowdsourcing!
Admittedly, we still need boots on the ground to collect TCAPF data in Afghanistan, given the demographics of the people we want to reach. On the other hand, wiki surveys hold great potential for reaching the younger generation fueling the Arab Spring and similar movements.