After participating in two rigorous impact evaluations of Community-Driven Development/Reconstruction (CDD/R) in Liberia and DRC, IRC and DFID embarked on this review as a next step in learning. They also wanted this review to inform design and evaluation strategies for new CDD/R programming in Somalia.

Findings:

  • CDD/R programs – which empower local communities to participate directly in development activities and to control the resources to do so – aim to improve socio-economic well-being, governance, and social cohesion at a local level. While CDD/R is intended to be context-driven, it is generally implemented as a standard model.
  • According to rigorous impact evaluations of programs in Afghanistan, Democratic Republic of Congo (DRC), Aceh (Indonesia), Liberia and Sierra Leone, and interviews with practitioners, policymakers and academics, the record of CDD/R in conflict-affected contexts is mixed and, overall, disappointing relative to the ambitious goals set out for it.
  • As currently designed, implemented, and evaluated, CDD/R is better at generating the more tangible economic outcomes than at generating social changes related to governance and social cohesion, although even the economic effects appear in only a few studies. Moreover, CDD/R programming is better at producing outcomes directly associated with the project than at producing broader changes in routine life.
  • CDD/R has been plagued by a panacea-type approach to goals and a generalized theory of change that is, as interviewees characterized it, “lofty”, “unrealistic”, “inherently flawed” and even “ridiculous”.
  • A variety of issues related to program design merit rethinking: the relatively short timeline of CDD/R projects, the small size of block grants, the limited reach of the projects, the menu restrictions on CDD/R programming, the limitations of social infrastructure, the quality and intensity of social facilitation, the manner in which communities are conceptualized, which often renders them not meaningful to participants, and how community institutions build on existing institutions and relate to the state.
  • Although the evaluations reviewed here are of high quality, they raise a number of methodological questions about the best measures and instruments for evaluating CDD/R, the timing of measurement, and the appropriate levels of analysis, as well as whether and how evaluations themselves affect projects and outcomes.
  • Open and honest conversation about CDD/R – which has occurred too infrequently – must guide the way forward.
  • Future CDD/R efforts also need to be guided by humility and more realistic goals.
  • More questions can and should be asked in evaluations. Areas for future research on CDD/R include comparing CDD/R to other programming rather than to a counterfactual of no program, parsing the social and economic aspects of program inputs and their consequent outcomes, introducing variation within treatment communities to learn more about program design and contextual features, and asking how and why questions about the CDD/R process and the outcomes it generates. Stronger monitoring is essential.
  • The road ahead must build on the important work undertaken so far and the many questions raised here, not simply replicate what has been done in the past.