

The Harvard Family Research Project separated from the Harvard Graduate School of Education to become the Global Family Research Project as of January 1, 2017. It is no longer affiliated with Harvard University.



Innovation Network1 describes its methodological innovation—the intense-period debrief—used to engage advocates in evaluative inquiry shortly after a policy window or intense period of action.

In the spring and summer of 2006, following a groundswell of activities that included marches in cities from coast to coast, every major U.S. news outlet was focused on the immigration debate. Innovation Network, as the evaluator of the Coalition for Comprehensive Immigration Reform (CCIR)—a collaborative of immigrant advocacy, grassroots, and religious groups, labor organizations, and policy leaders on Capitol Hill and throughout the U.S.—found itself facing unexpected challenges. Indeed, any high profile issue and intensive movement, such as the immigration debate, poses challenges to evaluators attempting to capture activities, especially real-time efforts, for ongoing learning.

With support from The Atlantic Philanthropies, a private foundation headquartered in Bermuda, Innovation Network sought to help document CCIR's work as it unfolded and capture best practices to inform other coalitions and the advocacy field. Because of the natural peaks and valleys of the immigration reform campaign, Innovation Network needed to remain flexible and to experiment with different approaches that would yield valuable information for CCIR, Atlantic, and the sector. The evaluation sought to provide an opportunity for continuous learning, so that CCIR leadership could act on evaluation findings and make real-time adjustments to their activities and strategies.

Short of policy changes, the evaluation was intended to yield a set of indicators or benchmarks that would signal the coalition's progress. All parties hoped that a better understanding of interim progress indicators would help funders, evaluators, and advocates identify what it takes to build a successful national coalition movement for human and civil rights.

Lessons About Evaluating Policy Advocacy: Context Matters

• When the campaign followed an offensive strategy, as happened during the support for the Hagel-Martinez compromise in the Senate in the spring of 2006, the coalition was more divided, resulting in a higher level of internal discomfort among its members. In this case, the evaluator is likely to get better and more candid information using key informant interviews as opposed to a group meeting.

• When the campaign followed a defensive strategy, as was the case in late summer and fall of 2006 due to stalemate between the House and Senate versions of immigration reform policy, the coalition was more unified. In this case, the focus group approach for debriefing an intense period is likely to be a comfortable format that can yield good information.

• It is useful to have an evaluation partner on standby (because the evaluator cannot predict the timing and pace of events) who is able and available to address learning opportunities from an objective perspective. New questions can emerge as events unfold. Flexibility is required to administer tools based on context, such as:

  • Public mood and political context of the opportunity window
  • Peaks and valleys of the policy advocacy cycle
  • The “inner circle” surrounding policymakers and the story of what happens behind the scenes
  • The players involved in an intense period and what activities took place

Initial Approach
Formed in 2004, CCIR sought to drive a legislative campaign to enact historic federal policy change in the form of “comprehensive immigration reform.” The campaign's premise is that the U.S. immigration system is broken and must be fixed to address the flow of people coming into the country and the 12 million undocumented immigrants who are already here. In addition, coalition organizations subscribe to five key principles: reform must include (a) a path to citizenship, (b) family reunification, (c) worker protection, (d) effective enforcement of the rule of law, and (e) civic participation to facilitate the integration of newcomers in local communities.

Initially, the CCIR evaluation design incorporated a variety of methods to collect data that would help answer key evaluation questions. Many aspects of the evaluation effort were similar to aspects of direct services work. Consequently, the general methodology consisted of gathering qualitative and quantitative data through traditional methods, including interviewing key informants, conducting surveys, reviewing documents, and documenting meetings on core strategies. Due to hectic timelines and stressful work plans associated with a campaign of this scale, the evaluation approach needed to emphasize data collection methods that imposed the least burden and demands on CCIR leadership and coalition members. Evaluators therefore chose to make use of the frequent opportunities for collecting qualitative and quantitative data through tracking and analysis of media coverage, legislation, field activities, and polling.

However, the fast pace of events, and the Coalition's rapid response to them, soon necessitated a greater amount of real-time data collection. The evaluation team began conducting more frequent observation and monitoring of the coalition dynamics that played out in meetings and conference calls. Other challenges inherent to collecting real-time data included massive amounts of data generated through numerous email lists, documents, and field reports.

The CCIR evaluation quickly proved to be time consuming and resource intensive for the evaluators; there never seemed to be any downtime. As the data collection activities kicked into full swing, Innovation Network gained greater insight into the unique and distinctive qualities of evaluating advocacy and policy change work (see sidebar). Two of these factors had a considerable impact on the data collection phase of the evaluation:

  • A legislative policy campaign, like advocacy work generally, involves faster cycles of evolving strategies out of the necessity to react to opportunity windows and respond to external factors.
  • The complex interactions among myriad players and stakeholder audiences—who are located along a continuum of connections to and engagement with policymakers—present greater challenges in capturing multiple stories and angles that oftentimes occur simultaneously.

For these reasons, the evaluators found they could not rely solely on traditional data collection methods and instead had to shift to a new approach.

Developing the Intense-Period Debrief
In the spring of 2006, the CCIR campaign was in the midst of what Innovation Network staff referred to as an opportunity “moment” or “window,” a phenomenon that has been described by other researchers of policy change. Due to external events, political and economic conditions, and the dynamics among multiple “players” around an issue, organizations that conduct policy advocacy cannot adequately predict nor control the influence that external forces have on their ability to achieve desired outcomes. In the case of CCIR, the campaign experienced a simultaneous emergence of several opportunities for immigration reform—a 3- to 6-month legislative opportunity following a bipartisan compromise in the Senate, an energized field that sparked historic mass demonstrations by hundreds of thousands of immigrants and their supporters in cities across the country, and heightened competition for claiming leadership of the movement by newly emerging national and immigrant rights groups.

During this intense period, Innovation Network continued to monitor numerous meetings and conference calls and read hundreds of emails and documents. But it was unthinkable to conduct interviews with coalition leaders, which resulted in gaps in the data. Moreover, immediately following this intense period, there was a tangible burnout among everyone in the campaign. The existing methods were not effective in fully capturing the multiple perspectives and many different stories of what happened, especially accounts of interactions with policymakers and their staff.

In recognition of the context within which the evaluation was occurring, the evaluation team designed a “Debrief Interview Protocol” specifically for intense periods of advocacy. The intent of this protocol was to engage key players in a focus group shortly after a policy window or intense period occurred, to capture the following information:

  • The public mood and political context of the opportunity window
  • What happened and how the campaign members responded to events
  • What strategies they followed
  • Their perspective on the outcome(s) of the period
  • How they would change their strategies going forward based on what they learned during that period

How is evaluating advocacy and policy change work unique?

• Common program evaluation challenges—ranging from attribution to limited organizational capacity and the role of external factors—can be more acute when evaluating advocacy.

• Advocacy generally involves faster cycles of evolving strategies based on advocates' need to react to opportunity windows.

• Complex interactions among myriad players and audiences—who are located along a continuum of engagement with policymakers—present greater challenges in capturing multiple stories and angles that often occur simultaneously.

• Advocacy typically affects and involves more people and communities (breadth), and leads to more fundamental changes in the legal, economic and social structures of society (depth) than direct service work, which often addresses symptoms of social ills rather than root causes.

Because it focused on a specific moment in the campaign and was conducted in a timely manner, this method gathered in-depth, real-time information while keeping the interaction targeted, practical, and relevant. The idea of the debrief grew out of the need for a forum that encouraged participation from key groups and individuals engaged in different layers or “spheres of influence” surrounding decision makers. It was—and continues to be, as the campaign and evaluation continue—particularly useful in providing a way for individuals in the “inner circle” of those spheres, or concentric circles, to tell the story of what happened behind the scenes.

Will this approach work for all advocacy initiatives? Certain contextual and methodological factors should be considered when deciding if, when, and how to administer this tool. The Innovation Network evaluation team works with a small advisory group from the campaign to decide how to identify and anticipate intense periods so that individuals can be invited to participate in a debrief that is timely and captures specific information.

The novel aspect of the debrief lies in its systematic application across the peaks and valleys of the policy advocacy cycle. It also allows for continued tailoring of participant selection and, to some degree, the questions asked, based on the nature of the intense period, the parties involved, and the activities that occur. As other campaigns experience similar highs and lows, it will be useful to see whether this debrief protocol has wider application and implications. If so, advocates, evaluators, and funders may come to regard this new approach as a standard protocol for future evaluations of advocacy initiatives.

1 Innovation Network is a national nonprofit based in Washington, DC, that provides evaluation services, consulting, training and online resources for the sector. For more information on advocacy and related evaluation, see www.innonet.org.

Jennifer Bagnell Stuart
Senior Associate
Innovation Network, Inc.
1625 K Street, NW, 11th Floor
Washington, DC 20006
Tel: 202-728-0727
Email: jabstuart@innonet.org


© 2016 Presidents and Fellows of Harvard College
Published by Harvard Family Research Project