

The Harvard Family Research Project separated from the Harvard Graduate School of Education to become the Global Family Research Project as of January 1, 2017. It is no longer affiliated with Harvard University.



This section features an annotated list of papers, organizations, initiatives, and other resources related to this issue's theme.

The Administration on Children, Youth and Families (ACYF), United States Department of Health and Human Services. (n.d.). The program manager's guide to evaluation. This guide explains the program evaluation process and provides background information on how to use it successfully. Chapters answer questions such as the following: Why evaluate? What is program evaluation? Who should conduct your evaluation? How do you hire and manage an outside evaluator? How do you prepare for an evaluation? What should you include in an evaluation? How do you get the information you need? How do you make sense of evaluation information? How can you report what you have learned? Companion handbooks address evaluation issues specific to the different ACYF program areas. www.acf.hhs.gov/programs/opre/other_resrch/pm_guide_eval/reports/pmguide/pmguide_toc.html

Bernard, H. R., et al. (Eds.). (1999). Field methods. Thousand Oaks, CA: AltaMira Press (a division of Sage Publications). Formerly CAM, the Cultural Anthropology Methods journal, this new peer-reviewed quarterly is supported by an international, interdisciplinary editorial board of scholars. Articles focus on the methods used by fieldworkers in the social and behavioral sciences and humanities for the collection, management, and analysis of data about human thought and human behavior in the natural world. The emphasis is on innovations and issues related to the methods used, rather than on the reporting of research or on theoretical-epistemological questions about research. For information about submitting an article reviewing a book or piece of software, contact the editor at ufruss@nersp.nerdc.ufl.edu. For subscription information, email the publisher at order@altamira.sagepub.com.

Bond, S. L., Boyd, S. E., & Rapp, K. A. (1997). Taking stock: A practical guide to evaluating your own programs. Chapel Hill, NC: Horizon Research. Produced at the request of the American Association for the Advancement of Science for its Science Linkages in the Community Initiative, this manual is a useful guide to program evaluation for community-based organizations (CBOs). Focusing on internal evaluation, the manual describes the evaluation process from beginning to end. Chapters address issues such as why evaluation is necessary, identifying goals and objectives, qualitative and quantitative data, data collection strategies, and interpreting and using data. The guide includes examples using fictional CBOs; appendices include sample reports using evaluation data.

Frechtling, J., & Sharp, L. (Eds.). (1997). User-friendly handbook for mixed method evaluations. Washington, DC: Directorate for Education and Human Resources, Division of Research, Evaluation and Communication, National Science Foundation. Chapters in this publication provide an introduction to mixed method evaluations, an overview of qualitative methods and analytic techniques, and approaches for designing and reporting mixed method evaluations. Also included are supplementary materials and exhibits that highlight the concepts discussed in the handbook. www.ehr.nsf.gov/EHR/REC/pubs/NSF97-153/start.htm

Patton, M. Q. (1997). Utilization-focused evaluation: The new century text (3rd ed.). Thousand Oaks, CA: Sage Publications. The latest edition of this important work, which won both the Alva and Gunnar Myrdal Award from the Evaluation Research Society and the Paul F. Lazarsfeld Award from the American Evaluation Association, includes numerous new thought-provoking topics. Added topics include using participatory evaluation processes to change a program's culture and build a learning organization; alternative evaluator roles connected to varying situations and diverse evaluation purposes; getting started by generating commitment to use; how evaluators can nurture results-oriented, reality-testing leadership in programs and organizations; and specific techniques for managing the power dynamics of working with primary intended users as well as evaluation stakeholders. New pedagogical features include more than 50 new exhibits for teaching and training use, as well as menus developed as special tools for working with stakeholders in selecting among evaluation decision options. Sage Publications, Inc., 2455 Teller Road, Thousand Oaks, CA 91320. Tel: 805-499-0721. Fax: 805-499-0871. www.sagepub.com

Rossi, P. H., Freeman, H. E., & Lipsey, M. W. (1999). Evaluation: A systematic approach (6th ed.). Thousand Oaks, CA: Sage Publications. Long considered a benchmark publication in evaluation, this book has been completely revised to include the latest techniques and approaches to evaluation, as well as guidelines for tailoring evaluations to fit programs and social contexts. The new edition includes content on assessing program theory, illustrating procedures that evaluators use to tease out theory when it is implicit in a program, and information on approaches to assessing the quality of program design and conceptualization. Another new chapter offers practical approaches to fashioning effective evaluation questions, guidelines for selecting an evaluation type, and tips for deciding what focus an evaluation should have. Sage Publications, Inc., 2455 Teller Road, Thousand Oaks, CA 91320. Tel: 805-499-0721. Fax: 805-499-0871. www.sagepub.com

Karen Horsch, Research Associate, HFRP


© 2016 Presidents and Fellows of Harvard College
Published by Harvard Family Research Project