Kim Sabo of Kim Sabo Consulting and Dana Fusco of York College, CUNY, describe how they conducted a participatory evaluation of an after school literacy initiative to support its continuous improvement.

In 1998, we were commissioned to evaluate a collaborative of 10 community-based after school programs in New York City that had just instituted a literacy initiative. Because the programs were small and interested in continuous improvement, we thought participatory evaluation (PE) would be the best approach.

Working with the collaborative’s literacy coordinator, we first created a logic model of the entire literacy initiative. We quickly learned that understandings and practices of literacy varied widely across the 10 after school programs. If PE was going to offer programs the opportunity for continuous improvement, we had to create strategies to evaluate the initiative while staying true to the uniqueness of each program within it.

One strategy was the Literacy Program Model (LPM). The LPM asked each program to identify three of its “best” literacy practices and then position them in relation to the program’s objective(s), resources, and assessment tools. In developing the LPM, staff began to recognize the areas within the program that could be strengthened, that is, the objectives that were not being met. Staff then defined a Plan of Action that specified next steps for adding or changing literacy activities, acquiring additional resources, and developing documentation and assessment strategies. In this way, programs used the LPM to guide the continuous improvement of program planning.
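As a rough sketch only (the article does not reproduce an actual LPM, so the layout and placeholder rows below are our assumption), a completed model might look like a simple grid, with the Plan of Action recording next steps wherever an objective is not yet being met:

  Literacy practice   Objective(s)   Resources   Assessment tools   Plan of Action
  Best practice 1     …              …           …                  …
  Best practice 2     …              …           …                  …
  Best practice 3     …              …           …                  …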

The extent to which each Plan of Action was implemented became a central focus of both the individual program evaluations and the evaluation of the overall initiative. Next, we needed methods to document the progress of the programs and of the initiative.

Participatory Evaluation


As an approach to organizational and program assessment, participatory evaluation (PE) involves a collaboration between evaluator and client. Participatory evaluation:

  • Increases the likelihood that evaluation results are accurate and relevant
  • Ensures that program staff will be motivated and prepared to use evaluation findings
  • Builds the program’s capacity to continue to design and conduct quality evaluations with diminished reliance on outside assistance
  • Stimulates deep thinking about programmatic issues, often leading to refinement in the program itself
  • Gives staff new tools for communicating their program to others

Two User-Friendly Data Collection Tools
Throughout the PE process, two data collection tools were especially helpful: the House Matrix and the House Portfolio.

The House Matrix is a program overview organized according to seven areas of potential change: philosophy/mission, staff development, facilities, literacy practice, assessment, parent involvement, and school/community collaborations. Each matrix includes specific program information (mission statement, number of staff workshops, child-staff ratio, academic and enrichment activities, assessment procedures, number of parent meetings and use of parent advisories, and school and community partnerships). Data for the House Matrices are gathered, reviewed, and revised in collaboration with staff. This organizational tool provides a contextualized snapshot of each program that helps both the evaluators and the staff understand the program better.
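The parenthetical list above appears to track the seven areas one to one. Read that way (a reconstruction; the article does not print an actual matrix), a House Matrix would pair each area with the information it captures:

  Area of potential change          Program information captured
  Philosophy/mission                Mission statement
  Staff development                 Number of staff workshops
  Facilities                        Child-staff ratio
  Literacy practice                 Academic and enrichment activities
  Assessment                        Assessment procedures
  Parent involvement                Number of parent meetings; use of parent advisories
  School/community collaborations   School and community partnerships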

The House Portfolio expands on the information in the House Matrix, with the seven areas of change sectioned off in a three-ring binder. All staff collect data for the portfolio and periodically come together to reflect on the findings. The portfolios help individual programs, the literacy coordinator, and the evaluation team understand a program’s progress and accomplishments over the course of a year. The portfolios also become a communication tool, allowing staff to share the program’s overall work and development with funders, parents, students, and other stakeholders.

The Results
By the end of the evaluation, all programs were able to collect data and were beginning to understand the usefulness of evaluation and reflection. Yet even when programs valued the long-term gains of PE, high staff turnover often made directors wary of investing in further staff development. It is thus unclear whether we could have fully built evaluation capacity within each of the programs. What we were able to do was address each of our evaluation questions while simultaneously helping programs gain a better understanding of the relationship between program planning and evaluation.

Kim Sabo, Ph.D.
Consultant
Kim Sabo Consulting
424 West 49th Street, #3A
New York, NY 10019
Tel: 212-307-1663
Email: kimsabo@aol.com

Dana Fusco, Ph.D.
Department of Teacher Education
York College, CUNY
94-20 Guy R. Brewer Boulevard
Jamaica, NY 11451
Tel: 718-262-2698
Email: fusco@york.cuny.edu

