
www.HFRP.org

The Harvard Family Research Project separated from the Harvard Graduate School of Education to become the Global Family Research Project as of January 1, 2017. It is no longer affiliated with Harvard University.



All Publications & Resources

Understanding Parental Views of School Climate: Frequently Asked Questions About a New Survey Tool

K–12 schools are the foundation for children’s learning, and students in schools with positive climates tend to do well academically. Read about and download a new survey tool that families and school reformers can use to measure parents’ perceptions of school climate.

Beth Schueler (December 11, 2014) Research Report

A Review of Activity Implementation in Out-of-School Time Programs

This Snapshot examines the range and scope of activities being implemented in current out-of-school time programs to set a context for understanding the links between program activities and positive outcomes for youth.

Suzanne Bouffard, Priscilla M. D. Little (August 2003) Research Report

Detangling Data Collection: Methods for Gathering Data

This Snapshot describes the common data collection methods used by current out-of-school time programs to evaluate their implementation and outcomes.

Suzanne Bouffard, Priscilla M. D. Little (August 2004) Research Report

Insights From Teaching a Graduate Evaluation and Improvement Science Course

In teaching Learning From Practice: Evaluation and Improvement Science at Harvard Graduate School of Education, Candice Bocala creates ample opportunities for students and partner organizations to work together as they explore the complexities of program evaluation. Discover the three insights Bocala has learned about program evaluation along the way.

Candice Bocala (November 12, 2015) Research Report

Why We Need to Slow Down When It Comes to Evaluation

Read about lessons HFRP has learned from supporting evaluation efforts in the field. This commentary highlights the value of investing time to carefully consider the theory behind a program for evaluation to yield usable and actionable information.

Carolina Buitrago (November 12, 2015) Research Report

Framing Program Evaluation: Why We Should Tinker With Theories of Change and Logic Models

While evaluation needs may vary, all organizations can benefit from utilizing theory-based evaluation tools to frame evaluation efforts. This article explores how three organizations developed their program’s theory of change and logic model.

Carolina Buitrago (November 19, 2015) Research Report

Learning for All: The Value of Field Experience in Training a New Generation of Program Evaluators

Field experience in evaluative inquiry is a promising approach to preparing the next generation of evaluators. Learn what one group of student consultants and organizations did to make a field experience in evaluative inquiry a positive one.

Carolina Buitrago with Sunindiya Bhalla, Nomi Davidson, Sarah Davila, Anairis Hinojosa, Babe Liberman, and Katie Tosh (December 3, 2015) Research Report

Beyond the Head Count: Evaluating Family Involvement in Out-of-School Time

This brief offers an overview of how out-of-school time programs can evaluate their family involvement strategies and practices. It draws on findings from our OST Evaluation Database, interviews, and email correspondence.

Margaret Caspe, Flora Traub, Priscilla M. D. Little (August 2002) Research Report

Learning From Logic Models: An Example of a Family/School Partnership Program

This brief offers a step-by-step approach for developing and using a logic model as a framework for a program or organization’s evaluation. Its purpose is to provide a tool to guide evaluation processes and to facilitate practitioner–evaluator partnerships. The brief is written primarily for program practitioners, but it is also relevant to and easily applied by evaluators.

Julia Coffman (January 1999) Tool for Evaluation

A User's Guide to Advocacy Evaluation Planning

A User's Guide to Advocacy Evaluation Planning was developed for advocates, evaluators, and funders who want guidance on how to evaluate advocacy and policy change efforts. This tool takes users through four basic steps that generate the core elements of an advocacy evaluation plan, including what will be measured and how.

Julia Coffman (Fall 2009) Tool for Evaluation

Grantmaking to School Districts: Lessons for Foundations

This brief offers lessons and best practices from foundations across the country on grantmaking to school districts. It offers advice to foundations that are considering school district investments for the first time. It also offers a useful "check" to more experienced foundations that want to examine their thinking and approaches against the lessons and practices of other foundations.

Julia Coffman, Heather Weiss, Erin Harris, Priscilla M. D. Little (September 2010) Research Report

Lessons in Evaluating Communications Campaigns: Five Case Studies

This paper examines how communication campaigns with different purposes (individual behavior change and policy change) have been evaluated. It offers a discussion of theories of change that can guide evaluation planning, along with five case studies of completed campaign evaluations. Each case study includes lessons from the evaluation, and the paper concludes with a set of cross-case lessons gleaned from these and other evaluations.

Julia Coffman (June 2003) Research Report

Combining Qualitative and Quantitative Methods in Social Inquiry

This book chapter discusses using mixed methodology in the social sciences. In B. Somekh & C. Lewin (Eds.), Research methods in the social sciences. Thousand Oaks, CA: Sage.

Jennifer C. Greene, Holly Kreider, Ellen Mayer (2004) Research Report

Engaging With Families in Out-of-School Time Learning

This Snapshot provides an overview of how researchers are evaluating out-of-school time programs’ engagement with families.

Erin Harris, Christopher Wimer (April 2004) Research Report

Research Spotlight: Get Started!—Resources on Using Evaluation for Continuous Improvement

This Research Spotlight, which follows up on our 2013 fall FINE Newsletter, has been compiled in response to our readers’ interest in using data for continuous improvement.

Harvard Family Research Project (April 2014) Research Report

Course Syllabus for Learning From Practice: Evaluation and Improvement Science

Learn how this course explores a variety of approaches to program evaluation through the readings and assignments outlined in this course syllabus designed by Candice Bocala, adjunct lecturer at Harvard Graduate School of Education.

Harvard Family Research Project (November 12, 2015) Research Report

Explore: Resources to Strengthen Program Evaluation

Interested in developing a logic model, learning more about improvement science, or advancing your program evaluation? This guide offers valuable resources practitioners can utilize to strengthen their evaluative work and develop more productive relationships with evaluators.

Harvard Family Research Project (November 19, 2015) Research Report

Indicators: Definition and Use in a Results-Based Accountability System

This brief defines and explores the role of indicators as an integral part of a results-based accountability system. The brief shows how indicators enable decision makers to assess progress toward the achievement of intended outputs, outcomes, goals, and objectives.

Karen Horsch (1997) Research Report

Evaluation Options for Family Resource Centers

This report examines different evaluation designs and their respective strengths and limitations. Using a realistic prototype of a child and family resource center, the authors present three alternative plans for evaluation.

Karen Horsch, Heather B. Weiss (1998) Research Report

Youth Involvement in Evaluation & Research

This brief draws on information collected from focus group interviews with representatives of 14 programs that involve youth in their evaluation and research efforts. It examines the elements of successful youth-involved research projects and offers short profiles of the 14 organizations included in the study.

Karen Horsch, Priscilla M. D. Little, Jennifer Chase Smith, Leslie Goodyear, Erin Harris (February 2002) Research Report

Family Strengthening Interventions: Evidence-Based Practices

The purpose of this class is to provide professional skills that will help students select, implement, and evaluate the effectiveness of evidence-based family strengthening interventions. Students will deepen their knowledge, skills, and expertise regarding the most up-to-date research on effective family strengthening interventions in their area of primary interest.

Karol Kumpfer (Spring 2006) Syllabus

Beyond School Hours VIII Annual Conference

Priscilla Little presented the workshop Learning What Works: An Evaluation Overview, which surveyed what we know about after school evaluation. The workshop examined how programs are collecting meaningful data for accountability and program improvement, and what they are finding.

Priscilla M. D. Little (February 16, 2005) Conferences and Presentations

21st Century Community Learning Centers Summer Institute

This workshop, Redefining After School Programs to Support Student Achievement, provides an overview of current evaluation research, describes elements of effective after school programs, and discusses a theory of change approach to designing and implementing effective after school programs.

Priscilla M. D. Little (July 27, 2004) Conferences and Presentations

Performance Measures in Out-of-School Time Evaluation

This Snapshot outlines the academic, youth development, and prevention performance measures currently being used by out-of-school time programs to assess their progress, and the corresponding data sources for these measures.

Priscilla M. D. Little, Erin Harris, Suzanne Bouffard (March 2004) Research Report

Evaluation of 21st Century Community Learning Center Programs: A Guide for State Education Agencies

This brief offers an in-depth look at the 21st Century Community Learning Center (21st CCLC) evaluation requirements (both performance measurement for accountability and program evaluation) and provides practical suggestions about how to implement 21st CCLC evaluation at the state and local level. It includes a checklist of issues to consider when designing state and local 21st CCLC evaluations.

Priscilla M. D. Little, Flora Traub, Karen Horsch (April 2002) Research Report

© 2016 Presidents and Fellows of Harvard College
Published by Harvard Family Research Project