
Increasingly, evaluators are engaging key stakeholders in the design and implementation of evaluations. The following three articles discuss experiences working with different sets of stakeholders: program staff, youth, and client families. The authors offer insights into training and support as well as some of the lessons learned in using this type of participatory approach to evaluation.

Use of Staff in Family Program Evaluations


Betty Cooke of the Minnesota Department of Children, Families, and Learning describes Minnesota’s experiences using program staff as data collectors.

Early Childhood Family Education (ECFE) is a voluntary public school program for all Minnesota families with children from birth to kindergarten entrance. The mission of ECFE is to strengthen families through the education and support of all parents in providing the best possible environment for the healthy growth and development of their children. During the 1996-97 school year, more than 280,000 young children and their parents participated. Funded by more than $35 million in state and local funds, ECFE is the nation's largest and oldest program of its kind.

Evaluation of ECFE has been a priority since the first pilot programs began in 1975. In the early years, evaluation was essential to document local acceptance of the program, help shape program development, and inform policymakers of ECFE progress. Over the years, evaluation has become integrated into ECFE practice and is viewed by most staff as a strategic learning opportunity. Both the process and products of evaluations are used to understand families and shape program effectiveness. Today many ECFE professionals are sophisticated evaluation users.

ECFE has developed and adopted a set of evaluation principles which underscore that ECFE evaluations are collaborative ventures and should be an integrated component of programs. The principles emphasize staff involvement in evaluation and use of information to enhance program effectiveness. They set clear parameters for making evaluation design decisions and allow staff to work as partners with evaluation consultants in all phases of an evaluation.

The most recent study (Mueller, 1996) included use of outside evaluation experts working collaboratively with state and local ECFE staff to determine the evaluation's purpose and design. Twenty-eight staff members from the 14 participating school district ECFE programs served as data collectors. All site evaluators were trained in interviewing, observation, videotaping, and analysis. Site evaluators also participated in the pilot phase of the evaluation and helped revise data collection strategies. Support available to these evaluators included detailed evaluation guides prepared for each round of data collection, access to evaluation consultants for technical assistance, and evaluation workshops held four times during the evaluation. Training and technical assistance for staff involved in the study were provided by the outside evaluator and the state-level program coordinator.

In addition to data collection and analysis, site evaluators maintained detailed technical notes on their work. Evaluators also responded to several surveys designed to monitor their reactions to the process and document preliminary conclusions about families and recommendations for program change. Site evaluators or their districts received a nominal honorarium.

Staff described several benefits of the study experience, including personal change, in-depth understanding of the families they serve, ideas for program change, better understanding of the evaluation process and information, and the advantage staff have over outsiders in engaging families in the evaluation. Challenges noted by the staff involved the complexity and demands of the evaluation, including the amount of time, energy, and organization required. Staff members found it difficult to complete evaluation duties while fulfilling their regular work with families.

Key lessons include:

  • Involving outside evaluation consultants who understand and value the importance of collaboration between evaluators and state and local evaluation users is essential.

  • Having a project manager—in this case, a state-level program coordinator—who works in partnership with evaluation consultants to communicate and monitor details of the evaluation and to act as a liaison among all involved provides a necessary link.

  • Planning training and technical assistance carefully (i.e., providing evaluation guides for each study phase, workshops when needed, and easy access to evaluation consultants) enhances the development of staff evaluation skills.

  • Approaching evaluation as a learning opportunity facilitates capacity building in organizations.

Discussion about new evaluation issues began before the study was completed. A new evaluation agenda is being implemented that responds to the recommendations of the most recent study and continues the tradition of the participatory approach to evaluation in ECFE.

Betty Cooke
Early Childhood and Family Initiatives Specialist
Minnesota Department of Children, Families and Learning
310 Capitol Square
550 Cedar Street
St. Paul, MN 55101
Tel: 612-296-6130
Email: betty.cooke@state.mn.us

High Stakes Stakeholders: Involving Youth in the Collection and Analysis of Evaluation Data


Stan Schneider and Berle Mirand Driscoll of Metis Associates write about using students as ethnographers in a study of a family resource center.

In 1988, the Georgia legislature created the Youth Futures Authority (YFA) to “develop a comprehensive plan for public and private organizations to deal with youth problems, help implement the plan, and contract with the appropriate agencies to provide direct services under that plan.” The YFA established the St. Pius X Family Resource Center (FRC) in 1994 to serve as a community-based center offering various social services and community activities that benefit area residents.

Metis Associates recently completed an evaluation of the FRC. We used a participatory approach in which local stakeholders were asked to play significant roles in designing and executing the evaluation, and in analyzing and utilizing the results. Consistent with this approach, and with the FRC's dual interests in youth development and community service, fifteen neighborhood youth (middle school and high school students) were recruited, hired, and trained to help design, administer, and analyze a youth survey about community conditions and needs. The survey also assessed whether the FRC's programs and activities corresponded to the interests and needs of neighborhood youth.

Our goals in involving youth were to:

  • Incorporate youth into the evaluation in a meaningful way
  • Ensure that the youth survey reflected the ideas, concerns, and colloquial language of young people
  • Provide participating youth with experiences that enhance critical-thinking and research skills
  • Introduce career pathways related to research
  • Reinforce an emerging sense of communitas, or community spirit, that is reflected in the FRC's approach
  • Develop a cadre of neighborhood youth to support ongoing and future local evaluation efforts.

Our approach included training and support for the “youth survey consultants” in reviewing and summarizing the survey findings, preparing tables and graphs, writing about findings, and preparing oral reports to groups such as the YFA. Through a series of workshops, we trained these young people to make presentations to homeroom classes in their schools and to distribute, collect, and tally surveys. Metis staff worked with the youth survey researchers to identify strategies they could use to obtain commitments and support from school administrators and staff for the survey effort. We wanted youth to explain the purpose of the survey to their schoolmates, describe their own involvement in its development, and identify strategies they could use to encourage high participation rates. This, we believed, would enable youth to learn about and overcome typical administrative challenges as well as improve the quantity and thoughtfulness of their peers' responses.

The youth survey researchers obtained almost 800 responses from students, and many achieved a response rate of 100 percent. Metis arranged for data entry and analysis of the survey. We met with the youth researchers to review and interpret the findings, and to help them write up selected findings. One student's article made the front page of The Savannah Herald. Two of the participating youth presented their findings at the Annie E. Casey Foundation's annual conference on research and evaluation. The survey uncovered a number of significant issues that are currently being addressed by the FRC.

Metis negotiated contracts with the youth participants specifying the activities in which they would provide assistance, and the terms of their employment. Stipends were provided based on the response rate for each class that was surveyed. To recognize and celebrate the students' accomplishment, Metis hosted a reception for the youth survey researchers and their families. At the reception, we presented the youth with Certificates of Achievement and letters of reference that they could use to help secure part-time or summer jobs.

We believe that the use of youth as evaluators and data collectors deepened the participatory evaluation approach by incorporating them into the assessment in a respectful and useful way. This methodology strengthened the FRC's understanding of and commitment to youth, and strengthened youths' connection with and commitment to their community. It also provided a means for Center staff to learn more about the interests and concerns of young people, and in turn, to develop a focused response that incorporated youths' perspectives into planning. Involving local youth in this type of evaluation supports them in developing skills needed to clearly define their interests, create plans or programs that enable them to meet their objectives, and advocate on their own behalf. The participatory evaluation process also left youth participants with enhanced interpersonal communication capabilities developed through training and use of interviewing skills that are transferable to other situations. Such a process helped increase their understanding of the importance of research, needs assessment and evaluation, and introduced possible career pathways. By providing youth with opportunities to participate directly in the assessment of their own community, we believe that one can strengthen the unity between learning and social life, and can promote the transfer of knowledge and social wealth from one generation to the next.

For further information, please contact Gaye Smith, Interim Executive Director, Chatham-Savannah Youth Futures Authority, 316 East Bay Street, Savannah, GA 31412. Tel: 912-651-6810. Metis Associates has developed a package of sample materials that can be readily adapted for use in a youth assessment activity. To obtain these, contact Berle Driscoll (see contact information below).

Stanley J. Schneider
Senior Vice President

Berle Mirand Driscoll
Senior Associate

Metis Associates, Inc.
80 Broad Street, Suite 1600
New York, NY 10004
Tel: 212-425-8833

Consumers Survey Each Other About Medicaid Managed Care


Cheryl Fish-Parcham of Families USA and Theresa Shivers of United Planning Organization/Head Start write about using client families in a study of managed health care.

In 1997, Families USA and the United Planning Organization's Head Start program in the District of Columbia began a project to involve parents in monitoring Medicaid managed care. A team of twelve Head Start parents surveyed 120 households regarding their Medicaid managed care experiences. The team reported their findings to District officials and health plans, successfully advocating for managed care improvements.

Parents described their own problems in managed care in early team meetings. Families USA used the team's suggestions and questions from other standardized surveys to develop a survey instrument. The resulting instrument included multiple choice and open-ended questions. After role playing and a pilot test, team members each received a list of randomly selected Head Start parents to survey face-to-face.

We learned several lessons about how to get maximum participation and improve the accuracy of survey results when using peer interviewers:

  • We needed to establish interviewers' legitimacy with respondents. We did this by providing letters of introduction signed by Head Start staff and by scheduling the face-to-face interviews at Head Start centers.

  • Face-to-face surveys were effective, but telephone surveys were not. Troubled by the time it was taking to schedule face-to-face appointments, we allowed parents to try a few interviews by phone. This did not work: respondents thought the interview might be a telephone solicitation or were otherwise suspicious, and many people in our randomly selected sample did not have working phones.

  • Staff follow-up on surveys was essential. We thought going into this project that peers would be especially effective in eliciting health experiences from respondents. In fact, peers were good initial screeners for managed care problems, but respondents sometimes told peer surveyors that their health problems were personal and asked that staff instead call them to learn about the details of a problem. We did this.

  • Ongoing meetings and education of the survey team helped the project succeed. The survey team needed to understand how managed care was supposed to work in order to probe for problems in the open-ended section of the interview. We discussed Medicaid beneficiaries' rights and case experiences in our monthly meetings and we developed a brochure, “Know Your Rights,” which addressed some frequently reported problems. Surveyors gave this brochure to respondents after they had completed a survey.

To keep the team together, the project paid members stipends to attend meetings and complete surveys; encouraged parents to speak at community functions; and convened monthly meetings. At these meetings, parents shared survey experiences with each other, and health officials provided updated information about managed care.

Overall, we found the peer survey an effective method of gathering information and of involving Medicaid consumers in advocacy. The survey team testified in city council hearings about their findings and met with the District's Medicaid officials. As a result, parents have seen changes in Medicaid managed care this year. The District has invested in enrollment education. Plans have revised their member handbooks to explain how to get mental health services and how to file complaints. The survey team has established an ongoing relationship with local officials. The team refers individual and systemic problems to the Medicaid agency and meets with officials to ensure a response.

The project's effect on team members is as important as its results for managed care. Participating parents have become empowered with information. Parents have shared their knowledge of managed care with other parents and have learned how to make changes in the system. Because they now know both how managed care is supposed to work and how consumers actually fare, team members are confident in expressing their needs and recommendations to government officials.

For further information, contact Cheryl Fish-Parcham (see contact information below).

Cheryl Fish-Parcham
Associate Director of Health Policy
Families USA
1334 G Street, N.W.
Washington, DC
Tel: 202-628-3030

Theresa Shivers
Chief of Health Maintenance and Special Needs Branch
United Planning Organization/Head Start

