

Heather Weiss of Harvard Family Research Project and William Morrill of Mathtech discuss how to make knowledge development investments more useful in solving the country's basic problems.

As America struggles with globalization and devolution, the allocation of scarce resources, and growing public demand for accountable and cost-effective services, the demand for and the use of information to guide decision-making should be at an all-time high. They are not. Despite substantial investment in knowledge development over the years, experience shows that information about public programs has often not been used for policy formation, program design, or the high-performance management of existing services (Darman, 1996; Levitan, 1992).

Now is an opportune time to begin making our knowledge development investments, including evaluation, more useful in the country’s efforts to solve its basic problems. The conditions, experience, and ingredients necessary to generate and use knowledge are largely in place. The pervasive, multilevel push for greater accountability has created a new demand for a wide array of high-quality information to support policymakers and program managers, as well as growing practical experience with indicators. There is also growing knowledge about how to build the capacity of public managers, community groups, and nonprofit organizations to use performance and other data from management information systems in an ongoing way to improve performance and outcomes (Usher, 1993; Gray & Associates, 1998). Additionally, domestic policy initiatives, such as standards-driven educational reform, managed care, and the privatization of human services, are forcing a critical examination of how best to invest limited research and evaluation dollars. The evaluation field itself has also changed, moving from a focus on single, large-scale experiments to more flexible, participatory, and community-based studies. Finally, the recent devolution of responsibility for welfare programs to the state and local levels (and the flexibility and governance changes implied by this shift) provides an unparalleled opportunity to develop the multilevel capacity to generate and use information to significantly improve outcomes for our nation’s people.

To move to a truly relevant system of information, we must strive to institutionalize knowledge development and use so that programs can continuously improve and be more accountable. Private-sector experience suggests that we must create “learning organizations.” A learning organization is able to create, acquire, and transfer knowledge, as well as to modify its own behavior to reflect new knowledge and insights (Garvin, 1993). To be successful “learners,” organizations must be willing, and have the capacity, to use data on a regular basis, assess its implications, and make changes accordingly. Experience with past information-building endeavors points to eight lessons that can guide the creation of public sector learning organizations:

  1. Investing in knowledge building should be strategic in concept, targeted to maximize impact.

  2. Learning must consider the incremental and iterative nature of the policy and decision-making process and thus should be structured to flow as continuously as possible at all levels.

  3. Collaboration among the research/evaluation, policy, and practitioner communities is necessary.

  4. An improved learning system must make reasonable judgments about the content and timing of expectations for learning activities and the consequences that flow from results. It needs to emphasize low-risk learning, especially at the outset.

  5. Selection of learning methodologies must be judicious, in terms of both learning objectives and timing.

  6. Investment in a variety of data collection strategies, including an integrative MIS in support of operations, needs to be considered.

  7. Efforts will require strong federal leadership and support and should build on information and indicator work already underway.

  8. More integrated funding of evaluations and programs will be important in developing continuous learning systems that improve outcomes.

The first step in building a learning system is developing the common ground that unites the three communities relevant to the development of public sector learning organizations: the knowledge-building community, the policy and managerial community, and the professional services (“doing”) community. This common ground includes:

  • Development of a Collaborative Learning Agenda. Multiple forces push agencies to develop learning agendas, including the Government Performance and Results Act (GPRA) and the congressional reauthorization process. A learning agenda should be built around a learning process that emphasizes sequential learning, iterative and participatory processes, use of the full range of learning methodologies, and attention to the means by which learning will be communicated for both policy and program improvement purposes. A longer-range objective of building the research agenda and the evaluation strategy should be the creation of learning organizations.

  • Performance Goals and Measures. Given the explosion of interest in performance indicators and standards spurred by the accountability movement, the policy and learning communities have both an opportunity and an interest in coming together to discuss what goals are being sought and what measures will be used to judge how well they are being achieved.

  • Allocation of Resources. Financial support will be necessary to implement the research agenda and the new learning approaches. All parties will want to move sequentially, adding resources and then assessing the results.

  • Accountability Consequences. All stakeholders must hold reasonable expectations for, and make reasonable promises about, their collective efforts. There should also be clarity and agreement from the outset about the consequences of the learning process.

  • Education and Communication. The three communities must work together on an education and ongoing communications component. Actors who have been talking at, and sometimes past, each other need to communicate clearly with one another, and the groups included in the dialogue should be broadened. More attention needs to be paid to creating a dissemination strategy to maximize the many potential payoffs from research and evaluation.

To begin the discussion about how to improve knowledge development and learning in the public sector, we propose a model for a continuous learning system. This model seeks to create continuous opportunities for developing and using relevant information; for encouraging corrective actions, risk taking, and participation; and for recognizing and rewarding performance improvement. It is intended to stimulate thinking about a learning system and will be revised based on comment and on examination of efforts to implement similar systems at the federal, state, and local levels. The model envisions a continuous, five-stage learning process:

  1. Engage key stakeholders in strategic planning and set a learning agenda and performance goals and standards. Activities in this step include obtaining resources and commitment to learning, specifying performance goals, identifying research and evaluation questions and gaps, and designing an overall learning agenda.

  2. Learn from experience and relevant research and incorporate lessons into program/policy design. Activities here include assembling resources, specifying outcome and process measures and data to support them, networking to share successful innovations and identify common problems, and identifying technical assistance needs and providers.

  3. Engage in innovation, monitoring, and evaluation. Activities include continuously testing new ideas and approaches, designing evaluation, and monitoring and assessing process and progress with performance measures, evaluation, and data.

  4. Learn from evaluation and comparisons with others and make course corrections. Activities include using monitoring and evaluation information for corrections and improvement, using benchmarking to examine progress of the program and/or field of practice/policy, and assessing and applying knowledge from relevant basic and applied research.

  5. Transfer lessons for program respecification and identify knowledge gaps for further research and experimentation. Activities here include identifying gaps for further research and transferring knowledge for continuous improvement across the network and service/policy field.

[Figure: Continuous Learning System Diagram]

There are examples of organizations that have worked to apply such a learning model. They suggest its feasibility and offer cases with which to test whether such approaches appeal to both the policy and learning communities. Over the next few years, as devolution and accountability are further implemented, there will no doubt be more experiences and opportunities to implement a learning model such as this one. At the same time, a series of related and supporting actions could be taken to better connect knowledge, policy, and practice. These include:

  • Systematic learning agendas could be added to the strategic planning, annual performance plans, and performance measures now required at the federal level by GPRA and by similar requirements at the state level.

  • Similar learning agendas could be continued in, or newly added to, reauthorization legislation.

  • Public sponsors could offer increased regulatory flexibility in return for clear outcome accountability to provide incentives for innovation.

  • Foundations and corporate philanthropy could make a significant contribution to the development of institutional capacity for learning at the grassroots level.

  • While retaining their own identities, public agencies and private foundations could be far more effective in the aggregate by being considerably more open with each other and specific about their individual learning agendas.

  • Public and private agencies could invest more systematically in the development of knowledge about what works.

The framework presented here has many of the key features that we believe are necessary for a new learning and knowledge-building strategy that will better inform and guide public action: collaborative buy-in; a transparent and public system of performance measurement and accountability; opportunity for innovation, rigorous experimentation, and program and policy redesign; and processes and mechanisms to share lessons and learning. This approach has the potential to produce information that is more likely to be used for public policy decisions and resource allocation, and more likely to improve program and policy outcomes, than the current system of episodic evaluation and experimentation. In the era of accountability, we believe that this approach can engage all the necessary players, from the national to the community level, in the learning process and thereby create a learning community.

Sources

Darman, R. (1996, December 1). Riverboat gambling. New York Times Magazine, pp. 116–117.

Garvin, D. A. (1993, July–August). Building a learning organization. Harvard Business Review, pp. 78–91.

Gray, S. T., & Associates. (1998). Evaluation with power. Washington, DC: Independent Sector.

Levitan, S. A. (1992). Evaluation of federal social programs: An uncertain impact (Occasional Paper 1992-2). Washington, DC: Center for Social Policy Studies, George Washington University.

Usher, C. L. (1993). Building capacity for self-evaluation in family and children’s services reform efforts. American Evaluation Association.

Heather B. Weiss
Director, HFRP

William A. Morrill
Senior Fellow, Mathtech, Inc.

