
The Harvard Family Research Project separated from the Harvard Graduate School of Education to become the Global Family Research Project as of January 1, 2017. It is no longer affiliated with Harvard University.


Selected Evaluation Terms

This resource provides definitions of evaluation terminology frequently used in the out-of-school time field. It also provides answers to frequently asked evaluation questions.

Priscilla M. D. Little (2002) Research Report

Evaluation's Role in Supporting Initiative Sustainability

This paper offers ideas for the roles evaluation can play in helping ensure that a discussion about sustainability begins early and is maintained throughout an initiative. The ideas in this paper draw on Harvard Family Research Project's two decades of experience with large-scale initiatives.

Heather B. Weiss (December 2002) Research Report

Learning From Logic Models in Out-of-School Time

This brief offers an in-depth review of logic models and how to construct them. A logic model can be a powerful tool for illustrating a program's theory of change to program staff, partners, funders, and evaluators. Moreover, a completed logic model provides a point of reference against which progress toward desired outcomes can be measured on an ongoing basis, through both performance measurement and evaluation.

Harvard Family Research Project (2002) Research Report

Documenting Progress and Demonstrating Results: Evaluating Local Out-of-School Time Programs

Produced in collaboration with the Finance Project, this brief provides practitioners of local out-of-school time programs with techniques, tools, and strategies for improving their programs and tracking their effectiveness over time.

Priscilla M. D. Little, Sharon DuPree, Sharon Deich (September 2002) Research Report

ProDES: A Continuous Improvement System for Juvenile Justice

Philip Harris and Lori Grubstein of the Crime and Justice Research Center describe the “bottom-up” development of ProDES, an outcome-based information system that tracks youth in the juvenile justice system.

Philip W. Harris (Fall 2002) Evaluation Exchange Article

Advice From the Experts on Nurturing Strong Full Service Schools

The following are excerpts from an evaluation panel at the conference, “Nurturing Strong Full Service Schools: Building Bridges with Communities,” that took place on May 20, 2002. It was the fifth in a series of national conferences about full service schools organized by Margot Welch and the Collaborative for Integrated School Services at the Harvard Graduate School of Education. Panelists shared their evaluation findings and lessons learned.

Harvard Family Research Project (Fall 2002) Evaluation Exchange Article

New & Noteworthy

The New & Noteworthy section features an annotated list of papers, organizations, initiatives, and other resources related to this issue's theme of Evaluation for Continuous Improvement.

Harvard Family Research Project (Fall 2002) Evaluation Exchange Article

Evaluation for Continuous Improvement

This issue of The Evaluation Exchange examines the use of evaluation for continuous improvement. It incorporates advice from well-known experts, such as Paul Light, Rosalie Torres, and Joe Wholey, outlines innovative evaluation practices, and provides insights into the evaluations of a wide range of initiatives.

Evaluation Exchange Issue

Mindset Matters

Ann Dykman of MPR Associates illustrates that an organization's culture and mindset are key to using evaluation successfully for continuous improvement.

Ann Dykman (Fall 2002) Evaluation Exchange Article

What Is a Learning Approach to Evaluation?

Rosalie T. Torres, Ph.D. is Director of Research, Evaluation, and Organizational Learning at the Developmental Studies Center in Oakland, California. Her 24-year career in evaluation has focused on researching, teaching, writing about, and practicing a learning approach to evaluation.

Rosalie Torres, Ph.D. (Fall 2002) Evaluation Exchange Article

Increase Equity Without Losing Efficiency

Joseph Wholey, Professor of Public Administration, explains that, contrary to popular thought, it is possible to increase program equity without compromising program efficiency through performance measurement and management systems.

Joseph S. Wholey (Fall 2002) Evaluation Exchange Article

Moving Ahead on Results Accountability: The Human Element

Sara Watson of the Pew Charitable Trusts explains that a results accountability system must extend beyond the purely technical to also address the management of people.

Sara Watson (Fall 2002) Evaluation Exchange Article

Helping Nonprofits Strive for High Performance

Amy Coates Madsen describes how, by setting best practices for nonprofits, the Standards for Excellence program both helps nonprofits to improve and increases public confidence in them.

Amy Coates Madsen (Fall 2002) Evaluation Exchange Article

Reflective Assessments: A Tool for Learning

Sharon Edwards and Ira Cutler of Cornerstone Consulting Group explain how organizations can use reflective assessments to gauge their progress and consider the choices ahead.

Sharon Edwards, Ira Cutler (Fall 2002) Evaluation Exchange Article

Risk

John Bare of the Knight Foundation shares his foundation's definition of the term “risk” when it comes to investing in initiatives, borrowing from the language of money managers.

John Bare, Ph.D. (Fall 2002) Evaluation Exchange Article

Participatory Evaluation for Continuous Improvement

Kim Sabo of Sabo Consulting and Dana Fusco of York College, CUNY, illustrate how they conducted a participatory evaluation of an after school literacy initiative to support its continuous improvement.

Kim Sabo, Ph.D., Dana Fusco, Ph.D. (Fall 2002) Evaluation Exchange Article

A Conversation With Paul Light

Paul Light is a Senior Fellow at the Brookings Institution in Washington, D.C., an instructor at the John F. Kennedy School of Government at Harvard University, and author of 14 books, including most recently Pathways to Nonprofit Excellence. Previously he was Director of the Public Policy Program at the Pew Charitable Trusts.

Julia Coffman, M. Elena Lopez (Fall 2002) Evaluation Exchange Article

Evaluating Citizen Schools

Charlie Schlegel of Citizen Schools explains how their evaluation strategy successfully balances the need to determine program impact with the need for continuous improvement.

Charlie Schlegel (Fall 2002) Evaluation Exchange Article

From the Director's Desk

An introduction to the issue on Continuous Improvement by HFRP's Founder & Director, Heather B. Weiss, Ed.D.

Heather B. Weiss, Ed.D. (Fall 2002) Evaluation Exchange Article

Beyond the Head Count: Evaluating Family Involvement in Out-of-School Time

This brief offers an overview of how out-of-school time programs can evaluate their family involvement strategies and practices. It draws on findings from our OST Evaluation Database, interviews, and email correspondence.

Margaret Caspe, Flora Traub, Priscilla M. D. Little (August 2002) Research Report

Evaluating Municipal Out-of-School Time Initiatives

To inform municipal leaders who are developing out-of-school time evaluations, HFRP scanned the city-level initiatives in its evaluation profiles database and prepared this short brief describing the evaluation approaches, methods, and performance measures that some cities are using.

Priscilla M. D. Little, Flora Traub (2002) Research Report

Public Communication Campaign Evaluation: An Environmental Scan of Challenges, Criticisms, Practice, and Opportunities

This report surveys recent developments in the field of public communication campaign evaluation. It examines evaluation challenges, criticisms, and practice and includes sections on relevant theory, outcomes, and useful methods for designing evaluations. It concludes with opportunities for the road ahead.

Julia Coffman (May 2002) Research Report

Evaluation of 21st Century Community Learning Center Programs: A Guide for State Education Agencies

This brief offers an in-depth look at the 21st Century Community Learning Center (21st CCLC) evaluation requirements (both performance measurement for accountability and program evaluation) and provides practical suggestions about how to implement 21st CCLC evaluation at the state and local level. It includes a checklist of issues to consider when designing state and local 21st CCLC evaluations.

Priscilla M. D. Little, Flora Traub, Karen Horsch (April 2002) Research Report

Family Support America: Supporting “Family Supportive” Evaluation

David Diehl of Family Support America outlines the organization's top evaluation projects: compiling an online national database of family support programs and developing new ways to measure the effectiveness of family support programs.

David Diehl, Ph.D. (Spring 2002) Evaluation Exchange Article

Strengthening Programs and Summative Evaluations Through Formative Evaluations

Two evaluators from SRI describe the benefits the Parent Institute for Quality Education realized by prefacing its summative evaluation with a formative evaluation.

Shari Golan, Ph.D., Dana Petersen, M.A., M.P.H. (Spring 2002) Evaluation Exchange Article

© 2016 Presidents and Fellows of Harvard College
Published by Harvard Family Research Project