While only USAID has the ability to develop and carry out rigorous evaluations of its projects’ impacts, many organizations are carrying out studies of various aspects of democracy assistance, and USAID’s staff can benefit from the wide range of insights, hypotheses, and lessons learned being generated by the broader community involved with democracy promotion.
CREATING THE CONDITIONS

While it will take some time for USAID to learn from undertaking the pilot impact evaluations, it will gain immediately from augmenting its overall learning activities and increasing opportunities for DG staff to actively engage with current research and studies on democratization. Several committee members wish to emphasize the considerable value to policymakers and DG officers of the many books, articles, and reports prepared in recent years by academic researchers, think tanks, and practitioners. Whatever the methodological flaws of these case studies and process evaluations from a rigorous social science perspective, this expanding literature has provided important lessons and insights for crafting effective DG programs.
Turning Individual Experience into Organizational Experience: Voices from the Field

Realizing that its DG officers often had valuable insights and experiences gained from years of implementing projects in various conditions around the world, USAID’s Democracy Office began a pilot project under its Strategic and Operational Research Agenda (SORA) in 2005 to collect this information systematically. Called collectively “Voices from the Field,” this pilot project used extensive anonymous interviews with DG officers who had served in two or more missions around the world to understand whether there were attributes that commonly led to project success and/or failure. In this pilot phase of the project, SORA developed a standard set of interview questions for each of its initial participants. Given SORA’s mission, these questions were largely designed to elicit descriptions of the best and worst projects in which the DG officer had participated (see the interview protocol in Appendix F). SORA then conducted interviews with eight participants, each lasting about two hours. These interviews revealed a wide range of responses, although common trends in project success and failure also seemed to emerge.
As part of its efforts to explore methodologies that could be used to learn from past experiences, USAID asked the committee to offer suggestions as to how the Voices from the Field project might be expanded and integrated into the overall SORA research design. Based on discussions with current and former DG officers, the committee decided to explore various options for expanding this project during at least one of its field visits (see Appendix E). Practical issues the committee wanted to understand about a potential Voices from the Field project included how frequently such interviews or debriefs should occur, who should conduct such interviews or collect such insights and experiences, and in which format(s) the information should be collected and disseminated. In addition, one issue that had not been explored in the initial pilot phase of the project conducted by USAID was whether those people who work for USAID DG missions around the world as foreign service nationals (that is, non-American citizens) would be able to provide additional sources of insight.
While in Peru, the field team attempted to address these questions through a series of meetings with current DG officers and foreign service nationals, including a dedicated meeting with two foreign service nationals with considerable DG experience. As their tenure at the missions tends to be much longer than that of career DG officers, who move from one mission to another every one to four years, foreign service nationals tend to have a great deal of institutional knowledge and experience, often in particular subfields of DG programming such as decentralization or political party strengthening.6 It is their historical knowledge that often provides the continuity across projects over the long term.
With regard to the frequency with which interviews or debriefings should occur, it seemed that a systematic inquiry of this sort would optimally be conducted every 12 to 18 months. This time frame would be consistent with other annual reporting requirements and would largely be reflective of the natural life span of projects that DG officers and foreign service nationals oversee. Careful timing of interviews and debriefs is an important consideration given the workload of those in DG missions.
During the initial pilot phase of the Voices from the Field project, the interviews were conducted by USAID, and the transcripts were made available to USAID, although the interviewees’ names were not attached to them. The committee was also interested in learning whether participants would feel more comfortable responding to an interviewer who did not work for USAID, even if their responses were anonymous. There was a question as to whether participants would respond honestly when asked to identify the primary attributes of both successful and unsuccessful projects if USAID were asking the questions. During the field visit inquiries, the team found that this was not a great concern to potential participants. In fact, they said they felt very comfortable providing honest responses, even when discussing less successful aspects of programs. Further, they remarked that such honest discussions were a routine part of their work at that mission. The one aspect of their work, however, that those interviewed wanted to highlight to a greater extent was success in more routine matters.
They expressed the desire to have a voice in sharing smaller, everyday successes, which are often overshadowed by larger projects, programs, and efforts.
Finally, if these interviews were undertaken on a larger scale in the future, the committee would be interested in learning which formats may be most beneficial for both collecting and disseminating the information gathered from these interviews and debriefings. In Peru, foreign service nationals in particular expressed a willingness to participate in face-to-face interviews, to complete written surveys, or to complete surveys or interviews conducted through other means such as a Web-based interface.

6 The field teams in Albania and Uganda met equally experienced foreign service nationals.
Their primary request, however, was that the results of the interviews or debriefings be widely shared with them and with other DG professionals around the world. They expressed concern that opportunities for learning may be lost if interviews are conducted but none of the insights or lessons learned reaches those working in the missions. There was great interest in learning from their own experiences as well as those of colleagues around the world; they therefore hoped that information from such programs would flow both in from and out to the field missions.
Depending on the interview design, information collected through a Voices from the Field project focused on systematic debriefings of DG officers and foreign service nationals could offer very detailed information on project implementation or more general insights about potential sources of project success or failure. These would not be substitutes for the empirical evidence that impact evaluations could offer. They could, however, complement the annual face-to-face interactions of DG officers and partners recommended above by compiling a systematic record of experience; the results of these interviews might become part of the renewed conferences, further encouraging the sharing of experiences and collective learning.
As an opportunity for continued learning from its wealth of experiences, the concept of “Voices from the Field” is consistent with SORA’s overall goal of better understanding what has worked, why, and under what conditions. Other organizations, such as the military, employ such systematic debriefing techniques, often with great benefit. On a more ambitious level, other, more academic uses of oral history could complement or be a resource for the retrospective studies discussed in Chapter 4.
Even more ambitious efforts to use “truth telling” conferences to add information and explore the varying perceptions of key historical events that have influenced how USAID views its ability to affect democratization could potentially yield valuable insights.7 Given the potential benefit of learning from the insights and expertise of DG officers and FSNs, the pilot project seems to offer USAID an opportunity to gain unique project-specific information it cannot acquire through other means. If incorporated into a larger framework designed to increase learning across the organization, “Voices from the Field” would complement other systematic approaches to gathering and employing more rigorously obtained information. The committee therefore recommends that USAID consider a modest investment in continuing an improved “Voices from the Field” project, the results of which would be made available to USAID DG officers and FSNs. During the period of the evaluation initiative that we recommend in the next chapter, special attention might be given to interviews with those carrying out the new procedures for impact evaluations. If SORA decides to undertake additional retrospective efforts, either by commissioning its own case studies or by systematically mining current academic research, then more ambitious oral history or “truth telling” conferences might be part of the mix.

7 An example from the foreign policy field is the work of James Blight and his colleagues on the Cuban Missile Crisis (Blight and Welch 1989), which eventually included senior U.S., Soviet, and Cuban officials who had taken part in the decision-making process.
While there is an opportunity to learn from this project, learning will occur only if the information is systematically collected and disseminated to those who may gain from it, such as DG officers, FSNs, and other USAID employees involved in project direction and management. Further, as was clear from the discussions held in the field with DG professionals, their willingness to continue to participate in such efforts was largely linked to their ability to learn from the results. The insights and experiences collected must not only be studied, analyzed, and incorporated into a larger framework of learning; they must also be shared in an easily accessible format with those who stand to gain directly from this information. This could be accomplished through the development of a Web-based interface where respondents could complete surveys and interviews via their work computers and also access the results of other respondents. Other dissemination options should also be considered, such as presenting annual results at conferences and gatherings of DG officers and professionals. Whatever mechanism for collection and dissemination is selected, if USAID chooses to continue this project, it should follow standard best practices and make the results widely available.
CONCLUSIONS

The potential changes to current USAID policy and practices discussed in this chapter range from specific suggestions for the contracting process to a broad shift in the organization toward a much more systematic effort to share and learn from its own work and that of others. In the next chapter we introduce a set of specific recommendations based around a DG evaluation initiative intended to increase the capacity of USAID to support and undertake a variety of well-designed impact evaluations, and to improve its organizational learning. We believe this initiative will demonstrate the value of increasing USAID’s ability to assess exactly what its DG programs accomplish, and provide guidance to help USAID better determine which projects to use, in which conditions, to best assist democratic progress.
REFERENCES

Blight, J.G., and Welch, D.A. 1989. On the Brink: Americans and Soviets Reexamine the Cuban Missile Crisis. New York: Hill and Wang.

Clapp-Wincek, C., and Blue, R. 2001. Evaluation of Recent USAID Evaluation Experience. Washington, DC: Center for Development Information and Evaluation, USAID.