

The Harvard Family Research Project separated from the Harvard Graduate School of Education to become the Global Family Research Project as of January 1, 2017. It is no longer affiliated with Harvard University.



The first four articles in this issue's Evaluations to Watch section spotlight the national Parental Information and Resource Centers (PIRC) program and the ways in which a new evaluation approach is helping it build the family involvement field. We begin with an overview of the evaluation strategy and continue with articles describing three PIRCs' evaluation plans and lessons learned.

Jerrell Cassady and Jackie Garvey illustrate how an ongoing, collaborative process between director and evaluator has informed and improved the Indiana State PIRC's programs to support family involvement.

For the past 5 years, we have served as the project evaluator and director for the Indiana State Parental Information and Resource Center (PIRC). In that time, we have found evaluation to be integral to establishing a progressive growth model for the organization. In addition to meeting federal and program mandates, effective external evaluation can provide PIRC staff with the knowledge necessary to be adaptive and flexible in meeting the needs of constituents. Our strong, collaborative relationship as director and evaluator has played a major role in helping the evaluation and, consequently, the PIRC to thrive.

Clarity and Consistency
Building this relationship has taken time and effort. During the first year of the PIRC evaluation, Jerrell—an experienced evaluator—had difficulty identifying the PIRC's core project goals and functions. From his perspective, the PIRC was "doing too many things." From the perspective of Jackie—the PIRC director—Jerrell "just didn't get it." We find this difference in perspective common in evaluations of PIRC programs.

After several meetings, we came to agree that there was, at least from an outside point of view, a lack of consensus about what the PIRC “did” and that, as a result, evaluation activities did not sufficiently connect the diverse components of the PIRC. Jackie responded to this issue by articulating the organization's framework—outlining the interconnections between the people and programs affiliated with the PIRC. The value of this process of clarification was twofold: Jerrell gained insight into the "big idea" of the PIRC, and the staff saw how their individual roles fit into a bigger picture, which, in turn, fostered greater synergy in their work processes.

Consistency was critical at this stage and remains a key strategy for success. Evaluator turnover would require repeating this year-long learning process and thus negatively impact evaluation quality.

Bridging the Paradigm Gap
Evaluators operate from a paradigm of evaluation design that places great importance on methodological processes and sampling concerns. Program staff often see these concerns as unimportant details, instead placing greater emphasis on program content and delivery. In our experience, successfully bridging the gap between the two paradigms occurs through a process of mutual respect, whereby both parties recognize the parameters underlying their joint activities and acknowledge that the other party is qualified to negotiate the barriers that arise between their different paradigms.

Trust
For the project director, trusting the evaluator is essential because the evaluator requests information that may not shine a wholly favorable light on the organization. Distrustful project directors have been known to withhold information that they believe will lead to a negative evaluative report. Directors must realize that only when all the information is laid out for the evaluator to examine can true improvement and change be attained—especially in evaluations with a strong formative component.

In our experience, focusing on short-term goals and submitting brief reports on specific evaluation questions enables the PIRC to respond to observed limitations in program efficacy and make gains within a program year. This trust does not mean that the evaluator is "sugarcoating" the PIRC's weaknesses and over-reporting strengths. The evaluator needs to maintain a critical eye and provide information to the PIRC in a timely fashion so that it can improve upon identified weaknesses.

Building Evaluation Plans Together
To ensure that a quality collaborative relationship produces a quality evaluation design, we have developed a three-step process for evaluation design:

1. Goal identification. The program director identifies a set of goals, research questions, or benchmarks that serve as the key focus for a given time period.

2. Clarification of evaluation needs. Once the goal or evaluation question has been articulated, the evaluator identifies the data sources and controls necessary to provide a confident and reliable conclusion.

3. Negotiation and problem solving. The evaluator and PIRC staff bring their own expertise to solve the problem. The evaluator communicates the requirements for a valid finding, while the staff highlight the realities of interacting with the parents, teachers, and school administrators. At the intersection of these two bodies of knowledge rests the optimal evaluation design for each PIRC goal.

This process helped the Indiana State PIRC more effectively design and deliver the Indiana Academy for Parent Leadership, which has been the central focus of the evaluation process for 3 of the evaluation's 5 years. In part as a result of our collaboration as PIRC director and evaluator, the Academy has grown in enrollment and refined services to reach more stakeholders. Now, sufficiently validated by evaluation, it serves as a central feature in a new parent engagement and leadership training program that provides the participants with university credit.

Jerrell C. Cassady, Ph.D.
Associate Professor of Psychology
Department of Educational Psychology
Ball State University, Muncie, IN 47306
Tel: 765-285-8522
Email: jccassady@bsu.edu

Jackie Garvey
Executive Director
The Indiana Partnerships Center
Parental Information and Resource Center
921 E. 86th Street, Suite 108
Indianapolis, IN 46240
Tel: 866-391-1039
Email: jgarvey@fscp.org
 


© 2016 Presidents and Fellows of Harvard College
Published by Harvard Family Research Project