The Harvard Family Research Project separated from the Harvard Graduate School of Education to become the Global Family Research Project as of January 1, 2017. It is no longer affiliated with Harvard University.
This resource provides definitions of evaluation terminology frequently used in the out-of-school time field. It also provides answers to frequently asked evaluation questions.
This paper offers ideas for the roles evaluation can play in ensuring that discussion about sustainability begins early and continues throughout an initiative. The ideas draw on Harvard Family Research Project's two decades of experience with large-scale initiatives.
This brief offers an in-depth review of logic models and how to construct them. A logic model can be a powerful tool for illustrating a program's theory of change to program staff, partners, funders, and evaluators. Moreover, a completed logic model provides a point of reference against which progress towards achievement of desired outcomes can be measured on an ongoing basis, both through performance measurement and evaluation.
A collaboration with the Finance Project, this brief provides practitioners of local out-of-school time programs with techniques, tools, and strategies for improving their program and tracking their effectiveness over time.
Sara Watson of the Pew Charitable Trusts explains that a results accountability system must extend beyond the purely technical to also address the management of people.
Amy Coates Madsen describes how, by setting best practices for nonprofits, the Standards for Excellence program both helps nonprofits to improve and increases public confidence in them.
Sharon Edwards and Ira Cutler of Cornerstone Consulting Group explain how organizations can use reflective assessments to assess their progress and consider the choices ahead.
John Bare of the Knight Foundation shares his foundation's definition of the term “risk” when it comes to investing in initiatives, borrowing from the language of money managers.
Kim Sabo of Sabo Consulting and Dana Fusco from York College, CUNY illustrate how they conducted a participatory evaluation of an after school literacy initiative to support its continuous improvement.
Paul Light is a Senior Fellow at the Brookings Institution in Washington, D.C., an instructor at the John F. Kennedy School of Government at Harvard University, and author of 14 books, including most recently Pathways to Nonprofit Excellence. Previously he was Director of the Public Policy Program at the Pew Charitable Trusts.
Philip Harris and Lori Grubstein of the Crime and Justice Research Center describe the “bottom-up” development of ProDES, an outcome-based information system that tracks youth in the juvenile justice system.
The following are excerpts from an evaluation panel at the conference, “Nurturing Strong Full Service Schools: Building Bridges with Communities,” that took place on May 20, 2002. It was the fifth in a series of national conferences about full service schools organized by Margot Welch and the Collaborative for Integrated School Services at the Harvard Graduate School of Education. Panelists shared their evaluation findings and lessons learned.
The New & Noteworthy section features an annotated list of papers, organizations, initiatives, and other resources related to this issue's theme of Evaluation for Continuous Improvement.
This issue of The Evaluation Exchange examines the use of evaluation for continuous improvement. It incorporates advice from well-known experts, such as Paul Light, Rosalie Torres, and Joe Wholey, outlines innovative evaluation practices, and provides insights into the evaluations of a wide range of initiatives.
Ann Dykman of MPR Associates illustrates that an organization's culture and mindset are important factors in the success of using evaluation for continuous improvement.
Rosalie T. Torres, Ph.D. is Director of Research, Evaluation, and Organizational Learning at the Developmental Studies Center in Oakland, California. Her 24-year career in evaluation has focused on researching, teaching, writing about, and practicing a learning approach to evaluation.
Joseph Wholey, Professor of Public Administration, explains that, contrary to popular belief, performance measurement and management systems make it possible to increase program equity without compromising program efficiency.
Charlie Schlegel of Citizen Schools explains how their evaluation strategy successfully balances the need to determine program impact with the need for continuous improvement.
An introduction to the issue on Continuous Improvement by HFRP's Founder & Director, Heather B. Weiss, Ed.D.
This brief offers an overview of how out-of-school time programs can evaluate their family involvement strategies and practices. It draws on findings from our OST Evaluation Database, interviews, and email correspondence.
To inform municipal leaders developing out-of-school time evaluations, HFRP scanned the city-level initiatives in its evaluation profiles database and prepared this short brief describing the evaluation approaches, methods, and performance measures that some cities are using.
This report surveys recent developments in the evaluation of public communication campaigns. It examines evaluation challenges, criticisms, and practice, and includes sections on relevant theory, outcomes, and useful methods for designing evaluations. It closes with opportunities for the road ahead.
This brief offers an in-depth look at the 21st Century Community Learning Center (21st CCLC) evaluation requirements (both performance measurement for accountability and program evaluation) and provides practical suggestions about how to implement 21st CCLC evaluation at the state and local level. It includes a checklist of issues to consider when designing state and local 21st CCLC evaluations.
Carl Dunst, Co-Director of the Orelena Hawks Puckett Institute, urges getting beyond the question of “what works” toward a more detailed scrutiny of the relationship among family support principles, program practice, and family outcomes.
Three experts in conducting Family Impact Seminars share their techniques for bringing research about families to legislators in a way that not only grabs their attention, but also supports policy change.