

The Harvard Family Research Project separated from the Harvard Graduate School of Education to become the Global Family Research Project as of January 1, 2017. It is no longer affiliated with Harvard University.




From the Director's Desk

An introduction to the issue on Democratic Evaluation by HFRP's Founder & Director, Heather B. Weiss, Ed.D.

Heather B. Weiss, Ed.D. (Fall 2005) Evaluation Exchange Article

New & Noteworthy

The New & Noteworthy section features an annotated list of papers, organizations, initiatives, and other resources related to the issue's theme of Democratic Evaluation.

Harvard Family Research Project (Fall 2005) Evaluation Exchange Article

Democratic Evaluation Approaches for Equity and Inclusion

Katherine Ryan, Associate Professor of Educational Psychology at the University of Illinois, describes three approaches to democratic evaluation and argues that they can provide field-tested methods for addressing equity and inclusion issues in evaluations of programs for children, youth, and families.

Katherine E. Ryan (Fall 2005) Evaluation Exchange Article

New & Noteworthy: Expanded Web-Only Version

This web-only version of the New & Noteworthy section features an expanded annotated list of papers, organizations, initiatives, and other resources related to the issue's theme of Democratic Evaluation.

Harvard Family Research Project (Fall 2005) Evaluation Exchange Article

Evaluating Evaluation Data

Kathleen McCartney and Heather Weiss of the Harvard Graduate School of Education describe the conditions under which evaluations can maintain scientific integrity and serve the public good in a politicized environment.

Kathleen McCartney, Heather Weiss (Fall 2005) Evaluation Exchange Article

Program Evaluation in a Democratic Society: The Vera Model

Tim Ross, Research Director at the Vera Institute of Justice, explains Vera's rigorous and multitiered data collection process and the benefits of partnerships with public programs.

Tim Ross (Fall 2005) Evaluation Exchange Article

Getting Creative in Holding Officials Accountable

Dennis Arroyo describes the performance-monitoring mechanisms that nongovernmental agencies use to hold public officials accountable to citizens.

Dennis Arroyo (Fall 2005) Evaluation Exchange Article

The Many Forms of Democratic Evaluation

Ernest House, Emeritus Professor at the University of Colorado, argues that democratic evaluation calls for more ingenuity than other forms of evaluation and that as a result its methods can take many forms.

Ernest House (Fall 2005) Evaluation Exchange Article

Combining Research Rigor and Participatory Evaluation

Anju Malhotra and Sanyukta Mathur from the International Center for Research on Women describe a study in Nepal that compared participatory and more traditional approaches to evaluating adolescent reproductive health interventions.

Anju Malhotra, Sanyukta Mathur (Fall 2005) Evaluation Exchange Article

Twists and Turns in the Journey: Youth Activists' Use of Research in Their Campaigns for Small Schools

Kristine Lewis shares Research for Action's experience with training youth to use social science research methods in their campaigns to improve their local high schools.

Kristine Lewis (Fall 2005) Evaluation Exchange Article

Democratic Evaluation

This issue of The Evaluation Exchange focuses on democratic evaluation. At the forefront of the discussion are equity and inclusion in the evaluation of programs for children, families, and communities, as well as evaluation to promote public accountability and transparency. Katherine Ryan leads off the issue by presenting major theoretical approaches to democratic evaluation. Several contributors examine these different strands, highlighting the importance of power sharing. Jennifer Greene emphasizes broad inclusion of stakeholder perspectives in evaluations, while Saville Kushner offers guidelines to help evaluation reposition itself as a collaborative effort with people and communities, and thereby begin to address the crisis in public trust between the professional bureaucracy and citizens. Kathleen McCartney and Heather Weiss focus on public accountability, especially how flagship evaluations can maintain scientific integrity while also serving the public good. Other contributors provide practical methods and tools to promote democratic evaluation, including the facilitation of dialogue, the training of youth researchers, the use of photovoice and cell phone technology, and access to interactive information through the Internet.

Evaluation Exchange Issue

Free. 20 Pages.

A Conversation With Jennifer Greene

Jennifer Greene of the University of Illinois talks about her efforts to advance the theory and practice of alternative forms of evaluation, including qualitative, participatory, and mixed-method evaluation.

M. Elena Lopez (Fall 2005) Evaluation Exchange Article

After School Evaluation Symposium

This 2-day meeting brought together the perspectives of diverse stakeholders to inspire new ideas and foster stronger links between research, practice, and policy. Participants discussed issues of access, quality, professional development, the role of evaluation research, and systems-building efforts.

Harvard Family Research Project (September 22, 2005) Conferences and Presentations

Free. Available online only.

Evaluating Complicated—and Complex—Programs Using Theory of Change

Patricia Rogers of the Royal Melbourne Institute of Technology describes how a theory of change can provide coherence in evaluating national initiatives that are both complicated and complex.

Patricia Rogers, Ph.D. (Summer 2005) Evaluation Exchange Article

The Knight Foundation's Approach to Cluster Evaluation

The John S. and James L. Knight Foundation and Wellsys Corporation describe how they plan to aggregate lessons learned across a "thematic cluster" of youth development investments.

Julie K. Kohler, Ph.D., Lizabeth Sklaroff, Denise Townsend, Ph.D., Susan Boland Butts (Summer 2005) Evaluation Exchange Article

Ten Strategies for Enhancing Multicultural Competency in Evaluation

Teresa Boyd Cowles of the Connecticut Department of Education offers self-reflective strategies evaluators can use to enhance their multicultural competency.

Teresa Boyd Cowles, Ph.D. (Summer 2005) Evaluation Exchange Article

The Evidence Base for Increasing High-Achieving Minority Undergraduates

Mehmet Öztürk discusses findings from a review of evaluations of programs at selective colleges and universities designed to improve undergraduate academic outcomes for underrepresented minority and disadvantaged students.

Mehmet Öztürk (Summer 2005) Evaluation Exchange Article

Building a Pipeline Program for Evaluators of Color

Rodney Hopson and Prisca Collins of Duquesne University describe a new graduate internship program designed to develop leaders in the evaluation field and improve evaluators' capacity to work responsively in diverse racial and ethnic communities.

Rodney Hopson, Prisca Collins (Summer 2005) Evaluation Exchange Article

The Retrospective Pretest: An Imperfect but Useful Tool

Theodore Lamb, of the Center for Research and Evaluation at Biological Sciences Curriculum Study, discusses retrospective pretests and their strengths and weaknesses.

Theodore Lamb (Summer 2005) Evaluation Exchange Article

New & Noteworthy

The New & Noteworthy section features an annotated list of papers, organizations, initiatives, and other resources related to the issue's theme of Evaluation Methodology.

Harvard Family Research Project (Summer 2005) Evaluation Exchange Article

From the Director's Desk

An introduction to the issue on Evaluation Methodology by HFRP's Founder & Director, Heather B. Weiss, Ed.D.

Heather B. Weiss, Ed.D. (Summer 2005) Evaluation Exchange Article

Evaluation Theory or What Are Evaluation Methods For?

Mel Mark, professor of psychology at the Pennsylvania State University and president-elect of the American Evaluation Association, discusses why theory is important to evaluation practice.

Mel Mark, Ph.D. (Summer 2005) Evaluation Exchange Article

Eight Outcome Models

Robert Penna and William Phillips from the Rensselaerville Institute’s Center for Outcomes describe eight models for applying outcome-based thinking.

Robert Penna, William Phillips (Summer 2005) Evaluation Exchange Article

Evaluation and the Sacred Bundle

John Bare of the Arthur M. Blank Family Foundation explains how nonprofits can learn about setting evaluation priorities based on storytelling and “sacred bundles.”

John Bare (Summer 2005) Evaluation Exchange Article

Assessing Nonprofit Organizational Capacity

Abby Weiss from HFRP describes the tool that the Marguerite Casey Foundation offers its nonprofit grantees to help them assess their organizational capacity.

Abby Weiss (Summer 2005) Evaluation Exchange Article
