

The Harvard Family Research Project separated from the Harvard Graduate School of Education to become the Global Family Research Project as of January 1, 2017. It is no longer affiliated with Harvard University.




Getting Creative in Holding Officials Accountable

Dennis Arroyo describes the performance-monitoring mechanisms that nongovernmental organizations use to hold public officials accountable to citizens.

Dennis Arroyo (Fall 2005) Evaluation Exchange Article

The Many Forms of Democratic Evaluation

Ernest House, Emeritus Professor at the University of Colorado, argues that democratic evaluation calls for more ingenuity than other forms of evaluation and that as a result its methods can take many forms.

Ernest House (Fall 2005) Evaluation Exchange Article

Combining Research Rigor and Participatory Evaluation

Anju Malhotra and Sanyukta Mathur from the International Center for Research on Women describe a study in Nepal that compared participatory and more traditional approaches to evaluating adolescent reproductive health interventions.

Anju Malhotra, Sanyukta Mathur (Fall 2005) Evaluation Exchange Article

Twists and Turns in the Journey: Youth Activists' Use of Research in Their Campaigns for Small Schools

Kristine Lewis shares Research for Action's experience with training youth to use social science research methods in their campaigns to improve their local high schools.

Kristine Lewis (Fall 2005) Evaluation Exchange Article

Democratic Evaluation

This issue of The Evaluation Exchange periodical focuses on democratic evaluation. At the forefront of the discussion are equity and inclusion in the evaluation of programs for children, families, and communities, as well as evaluation to promote public accountability and transparency. Katherine Ryan leads off the issue by presenting major theoretical approaches to democratic evaluation. Several contributors examine these different strands, highlighting the importance of power sharing. Jennifer Greene emphasizes the importance of broad inclusion of stakeholder perspectives in evaluations, while Saville Kushner offers guidelines to help people and communities reposition evaluation as a collaborative effort and thereby begin to address the crisis of public trust between the professional bureaucracy and citizens. Kathleen McCartney and Heather Weiss focus on public accountability, especially how flagship evaluations can maintain their scientific integrity while also serving the public good. Several contributors provide practical methods and tools to promote democratic evaluation, including the facilitation of dialogue, the training of youth researchers, the use of photovoice and cell phone technology, and access to interactive information through the Internet.

Evaluation Exchange Issue

Free. 20 Pages.

A Conversation With Jennifer Greene

Jennifer Greene of the University of Illinois talks about her efforts to advance the theory and practice of alternative forms of evaluation, including qualitative, participatory, and mixed-method evaluation.

M. Elena Lopez (Fall 2005) Evaluation Exchange Article

Democratic Evaluation in Practice

Cheryl MacNeil, an evaluation consultant, describes the asymmetries of power in evaluation and her efforts to make her evaluation practice more democratic.

Cheryl MacNeil, Ph.D. (Fall 2005) Evaluation Exchange Article

Social Capital in the Connected Society

Andrew Nachison, director of the Media Center, an organization that studies the intersection of media, technology, and society, writes about social capital and democratic processes in a digital society.

Andrew Nachison (Fall 2005) Evaluation Exchange Article

Using Democratic Evaluation Principles to Foster Citizen Engagement and Strengthen Neighborhoods

Arnold Love and Betty Muggah describe how Hamilton Community Foundation applied democratic evaluation principles to transform challenged neighborhoods into vibrant communities.

Arnold Love, Ph.D., Betty Muggah (Fall 2005) Evaluation Exchange Article

Untangling Logic Models and Indicators: Reflections on Engaging Stakeholders

Seema Shah, a researcher at the Institute for Education and Social Policy, shares her experience of engaging community organizing groups to develop a logic model on how community organizing leads to better student outcomes.

Seema Shah, Ph.D. (Fall 2005) Evaluation Exchange Article

The Trenton Central High School Obesity Prevention Project: Encouraging Democracy Through Inclusion

Katrina Bledsoe of The College of New Jersey writes about the inclusion of student voices in the evaluation of an obesity prevention program.

Katrina L. Bledsoe, Ph.D. (Fall 2005) Evaluation Exchange Article

How does evaluation create options for enhancing social justice?

Saville Kushner of the Centre for Research in Education and Democracy at the University of the West of England suggests ways that an evaluation's participants can make evaluations more democratic.

Saville Kushner (Fall 2005) Evaluation Exchange Article

After School Evaluation Symposium

This 2-day meeting brought together the perspectives of diverse stakeholders to inspire new ideas and foster stronger links between research, practice, and policy. Participants discussed issues of access, quality, professional development, the role of evaluation research, and systems-building efforts.

Harvard Family Research Project (September 22, 2005) Conferences and Presentations

Free. Available online only.

New & Noteworthy

The New & Noteworthy section features an annotated list of papers, organizations, initiatives, and other resources related to the issue's theme of Evaluation Methodology.

Harvard Family Research Project (Summer 2005) Evaluation Exchange Article

From the Director's Desk

An introduction to the issue on Evaluation Methodology by HFRP's Founder & Director, Heather B. Weiss, Ed.D.

Heather B. Weiss, Ed.D. (Summer 2005) Evaluation Exchange Article

Evaluation Theory, or What Are Evaluation Methods For?

Mel Mark, professor of psychology at the Pennsylvania State University and president-elect of the American Evaluation Association, discusses why theory is important to evaluation practice.

Mel Mark, Ph.D. (Summer 2005) Evaluation Exchange Article

Eight Outcome Models

Robert Penna and William Phillips from the Rensselaerville Institute’s Center for Outcomes describe eight models for applying outcome-based thinking.

Robert Penna, William Phillips (Summer 2005) Evaluation Exchange Article

Evaluation and the Sacred Bundle

John Bare of the Arthur M. Blank Family Foundation explains how nonprofits can learn about setting evaluation priorities based on storytelling and “sacred bundles.”

John Bare (Summer 2005) Evaluation Exchange Article

Assessing Nonprofit Organizational Capacity

Abby Weiss from HFRP describes the tool that the Marguerite Casey Foundation offers its nonprofit grantees to help them assess their organizational capacity.

Abby Weiss (Summer 2005) Evaluation Exchange Article

What is strategic learning and how do you develop an organizational culture that encourages it?

John A. Healy, Director of Strategic Learning and Evaluation at The Atlantic Philanthropies, shares ways to position learning as an organizational priority.

Julia Coffman, Erin Harris (Summer 2005) Evaluation Exchange Article

What is the Campbell Collaboration and how is it helping to identify “what works”?

Robert Boruch, a founder of the Campbell Collaboration and professor of education and statistics at the University of Pennsylvania, discusses how the Campbell Collaboration and randomized trials contribute to evidence-based policy.

Abby Weiss (Summer 2005) Evaluation Exchange Article

Evaluation Methodology

This issue of The Evaluation Exchange periodical focuses on evaluation methodology, covering topics in contemporary evaluation thinking, techniques, and tools. Mel Mark, president-elect of the American Evaluation Association, kicks off the issue with a discussion about the role that evaluation theory plays in our methodological choices. Other voices in the issue include Georgia State University evaluator Gary Henry, who makes the case for a paradigm shift in how we think about evaluation use and influence, and Robert Boruch, a Campbell Collaboration founder, who discusses the role of randomized trials in defining “what works.” Other contributors to the issue respond to various “how to” questions, such as how to foster strategic learning, how to find tools that assess nonprofit organizational capacity, how to select and use various outcome models, how to increase the number of evaluators of color, how to enhance multicultural competency in evaluation, and how to measure what we value so others value what we measure. Finally, the issue explores theory of change, cluster evaluation, and retrospective pretests—methodological approaches currently generating much interest and dialogue.

Evaluation Exchange Issue

Free. 20 Pages.

An Introduction to Theory of Change

Andrea Anderson is a research associate at the Aspen Institute Roundtable on Community Change, where she focuses on work related to planning and evaluating community initiatives.

Erin Harris (Summer 2005) Evaluation Exchange Article

A Conversation With Gary Henry

Gary Henry makes the case for a paradigm shift in how we think about evaluation use and influence.

Julia Coffman (Summer 2005) Evaluation Exchange Article

Evaluating Complicated—and Complex—Programs Using Theory of Change

Patricia Rogers of the Royal Melbourne Institute of Technology describes how a theory of change can provide coherence in evaluating national initiatives that are both complicated and complex.

Patricia Rogers, Ph.D. (Summer 2005) Evaluation Exchange Article

© 2016 Presidents and Fellows of Harvard College
Published by Harvard Family Research Project