The Harvard Family Research Project separated from the Harvard Graduate School of Education to become the Global Family Research Project as of January 1, 2017. It is no longer affiliated with Harvard University.
All Publications & Resources
This brief offers an overview of how out-of-school time programs can evaluate their family involvement strategies and practices. It draws on findings from our OST Evaluation Database, interviews, and email correspondence.
To inform municipal leaders who are developing out-of-school time evaluations, HFRP scanned the city-level initiatives in its evaluation profiles database and prepared this short brief that describes the evaluation approaches, methods, and performance measures that some cities are using for evaluation.
This brief offers an in-depth look at the 21st Century Community Learning Center (21st CCLC) evaluation requirements (both performance measurement for accountability and program evaluation) and provides practical suggestions about how to implement 21st CCLC evaluation at the state and local level. It includes a checklist of issues to consider when designing state and local 21st CCLC evaluations.
M. Elena Lopez, from Harvard Family Research Project, discusses expanding the role of family support to include helping families use information to improve their communities.
Kathleen McCartney and Eric Dearing from the Harvard Graduate School of Education provide an overview on effect size and what it reveals about the effectiveness of family support programs.
Carl Dunst, Co-Director of the Orelena Hawks Puckett Institute, urges getting beyond the question of “what works” toward a more detailed scrutiny of the relationship among family support principles, program practice, and family outcomes.
Director of an organizational development consulting practice, professor, and author, Michael Quinn Patton reveals historical and emerging trends in evaluation practice.
The Boston Parent Organizing Network (BPON) mobilizes parents, local organizations, and communities to improve the quality of education in the Boston Public Schools.
David Diehl of Family Support America outlines their top evaluation projects: compiling an online national database of family support programs and developing new ways to measure the effectiveness of family support programs.
Two evaluators from SRI describe the benefits realized by the Parent Institute for Quality Education when they prefaced their summative evaluation with a formative evaluation.
A grassroots network of families of children with special health care needs shares the lessons they learned about conducting research to improve the health care for their children.
The Spring 2002 issue looks at family support evaluations and their role in moving the field forward. This issue features a conversation with Michael Quinn Patton about historical and emerging trends in evaluation practice, descriptions of national and local evaluations that are underway, a discussion of using “effect size” to measure program effectiveness, advice on how to bring family research to legislators' attention, a look at how data can help parents assess schools, and much more.
Pablo Stansbery, Senior Research Associate at Harder+Company Community Research, describes the process of developing an evaluation design that addresses the unique challenges created by California’s Children and Families Act.
An in-depth look at the challenges presented by the evaluation of the Early Head Start program, an evaluation that required the cooperation of multiple layers of research and program partners.
This brief draws on information collected from focus group interviews with representatives of 14 programs that are involving youth in their evaluation and research efforts. It examines the elements of successful youth-involved research projects and offers short profiles of the 14 organizations included in the study.
Mark Dynarski and Mary Moore of Mathematica Policy Research reveal the challenges of evaluating a national program implemented in multiple locations with inherently different key elements.
Luis Carlos Greer and Tamara Martinez, youth living in Arizona, describe how they worked with a local community organization to make a change in their community.
JuNelle Harris of HFRP outlines the basics of designing logic models.
Olatokunbo (Toks) Fashola, Associate Research Scientist at the Johns Hopkins University Center for Research on the Education of Students Placed at Risk (CRESPAR), reveals the steps new programs can take to initiate evaluation.
The New & Noteworthy section features an annotated list of papers, organizations, initiatives, and other resources related to this issue's theme of Out-of-School Time.
Kathleen Hebbeler of SRI International describes the evaluation of CORAL, which seeks to help communities view academic achievement as the shared responsibility of multiple sectors of the community.
Jennifer Smith from HFRP writes about involving youth in evaluation and research.
Cindy McMahon of the YWCA of Asheville, North Carolina, shares how the YWCA as a whole, and her after-school program as part of it, used a logic model to show they make a difference for women and families.
Marielle Bohan-Baker, from Harvard Family Research Project, presents some of the challenges voiced by communications experts in interviews about the use and evaluation of mass media initiatives.
Jacqueline Dugery of the Pew Partnership for Civic Change offers some innovative ways to build on organizational learning to engage in strategic communications campaigns.