


A hypothetical initiative of Family Resource Centers was developed by Pelonomi Taylor Khumoetsile, a Research Assistant at the Harvard Family Research Project (HFRP); she called it the Early Childhood Education and Family Support Center Initiative in Old Grove. This prototype was then evaluated by three evaluators or evaluation teams: Sally Ward, Kathleen McCartney, and Cynthia Duncan from the University of New Hampshire; Karen Pittman and Merita Irby of the International Youth Foundation and Michele Cahill of the Fund for the City of New York; and Becky Winslow of the Georgia Department of Human Resources. The case study of the prototype and the full versions of the three commissioned evaluations will be published together as an HFRP monograph this summer.

Old Grove's Early Childhood Education and Family Support Center Initiative

A coalition of representatives from Old Grove was brought together to develop a plan to improve children's long-range education, development, health, and well-being outcomes. The coalition included city council members, community and child welfare advocates, representatives from the local hospital and schools, state agencies, and business people. The central feature of the initiative's implementation is the development of neighborhood-based, community-managed family centers that incorporate and coordinate a range of child and family services, each at a site located near an elementary school.

The initiative is beginning the first year of a five-year funding period. Presently, one family support center (FSC), the Robinswood Early Childhood and Family Support Center, has been established. Depending on Robinswood's success, four other FSCs with similar program designs will be implemented over the next five years. The yearly FSC operating budget of $600,000 comes from state, federal, and private funds.

The centers will serve a diverse population of mostly white, African-American, and Puerto Rican families in communities where poor and working-class families predominate. FSC services are open to low-income and poor families with children ages 0 to 11 living in the targeted communities.

The FSC service delivery model was designed to provide coordinated services, with case management used to organize the delivery and referral of services. Mandated services for all FSCs include case management; home visits; parent education, adult literacy, and ESL classes; health and developmental screening for children; tutoring for school-age children; enrichment programs for preschool-age children; child care; toy lending; support groups; mental health, substance abuse, family planning, budgeting, and housing services; other appropriate family counseling services; father-focused parenting and support groups; and employment training.

An Executive Management Team (EMT) was set up to carry the project through the recruitment, hiring, and start-up phases. The EMT was intended as a transitional structure to be partially phased out and replaced with a Community Advisory Board (CAB). The EMT sets minimum service delivery parameters and approves funding allocations and spending for all FSCs. It is intended, however, that each center's CAB will jointly plan, implement, and evaluate services and procedures, and assume outcome responsibility for its particular center.

A foundation is funding one-half of the five-year FSC evaluation costs; another foundation is funding one-third, and the state is funding the remainder. The evaluation budget is approximately $100,000 per year for five years (a total of $500,000). Each funder has its own program performance reporting requirements.
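As a quick arithmetic check on these shares (not additional case study detail):

    \frac{1}{2} + \frac{1}{3} = \frac{5}{6}, \qquad
    \text{state share} = 1 - \frac{5}{6} = \frac{1}{6} \approx \$16{,}667 \text{ per year}, \qquad
    5 \times \$100{,}000 = \$500{,}000.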

Evaluators will have access to MIS data from the school district and from local social service and public health agencies. These data include family profile forms, adult educational and vocational training records, child care and education records, developmental screening and assessment forms, family and child service plans, family service contact forms, and referral forms.
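As an illustration of how such records might be linked for analysis, the following sketch assumes hypothetical CSV extracts keyed by a shared family ID; neither detail comes from the case study.

    import pandas as pd

    # Hypothetical MIS extracts; the file names and the family_id key are
    # assumptions for illustration, not part of the Old Grove case study.
    profiles = pd.read_csv("family_profile_forms.csv")      # one row per family
    contacts = pd.read_csv("family_service_contacts.csv")   # one row per contact
    referrals = pd.read_csv("referral_forms.csv")           # one row per referral

    # Count service contacts and referrals per family, then join onto profiles.
    contact_counts = contacts.groupby("family_id").size().reset_index(name="n_contacts")
    referral_counts = referrals.groupby("family_id").size().reset_index(name="n_referrals")

    analysis = (profiles
                .merge(contact_counts, on="family_id", how="left")
                .merge(referral_counts, on="family_id", how="left")
                .fillna({"n_contacts": 0, "n_referrals": 0}))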

Evaluating Old Grove's FSC Using an Experimental and Quasi-Experimental Approach

The first team of evaluators—Sally Ward, Kathleen McCartney, and Cynthia Duncan from the University of New Hampshire—used an experimental and quasi-experimental approach. According to the evaluators, the proposed evaluation is constrained by the available resources ($100,000 per year) and by the fact that some concrete findings must be reported within 18 months.

Their evaluation addressed both the short- and long-term goals of the Center by focusing on three sets of questions at three levels:

  1. At the institutional level, does the FSC deliver services in a more integrated and efficient way? How effective are staff training and development efforts that aim to improve coordination? This is the immediate, short-term goal of the FSC: to improve the delivery of services.
  2. At the individual child and family level, does a family's connection with the FSC improve children's development and all family members' educational achievement and social, physical, and mental health? Does the FSC connection improve access to the services the child and family need? Improving conditions for families and children is an important longer-term goal of the FSC.
  3. At the community level, have family conditions improved throughout Old Grove? Does the FSC build on family strengths through the provision of services to at-risk families? The FSC model presumes that, ultimately, the condition of the whole community will improve as conditions for its families improve. This is, then, an important long-term goal of the FSC.
Research Question One. To answer the first set of research questions, the evaluators proposed a quasi-experimental design in which the center's staff serves as its own control group in a simple pre- and posttest design. There will be two posttests, one at nine months and one at eighteen months, paralleling the observation times for the family and child data collection. Two types of data will be collected at each observation time.
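To make the design concrete, a minimal analysis sketch follows, using invented staff ratings; the actual instruments and scales are not specified in the case study.

    import numpy as np
    from scipy import stats

    # Illustrative only: invented ratings of service coordination on a 1-7
    # scale for 25 staff members; the instrument and scale are assumptions.
    rng = np.random.default_rng(0)
    baseline = rng.normal(4.0, 0.8, size=25)          # pretest
    month_9 = baseline + rng.normal(0.4, 0.5, 25)     # first posttest
    month_18 = baseline + rng.normal(0.7, 0.5, 25)    # second posttest

    # Staff serve as their own control, so paired comparisons are appropriate.
    for label, post in [("9 months", month_9), ("18 months", month_18)]:
        t, p = stats.ttest_rel(post, baseline)
        print(f"{label}: mean change = {(post - baseline).mean():.2f}, "
              f"t = {t:.2f}, p = {p:.3f}")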

Research Question Two. A simple pre- and posttest design is not strong enough for drawing conclusions about program outcomes. To answer questions about the impact of the FSC on individual children and families, the evaluators proposed a randomized controlled experiment with two treatment groups and one control group. The data will be collected through periodic interviews with the families.
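A minimal sketch of the random assignment and group comparison follows; the outcome measure, arm labels, and effect sizes are invented for illustration, since the case study does not define the two treatment arms.

    import numpy as np
    from scipy import stats

    # Illustrative only: randomize 150 enrolled families across two treatment
    # arms and a control group; arm labels and effects are invented.
    rng = np.random.default_rng(1)
    arm = rng.permutation(np.repeat(["control", "treatment_a", "treatment_b"], 50))

    # Hypothetical outcome from the periodic family interviews
    # (e.g., a composite child-development score).
    effect = {"control": 0.0, "treatment_a": 0.3, "treatment_b": 0.5}
    outcome = np.array([rng.normal(effect[a], 1.0) for a in arm])

    # One-way ANOVA comparing the three groups.
    groups = [outcome[arm == g] for g in ("control", "treatment_a", "treatment_b")]
    f, p = stats.f_oneway(*groups)
    print(f"F = {f:.2f}, p = {p:.3f}")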

Research Question Three. The community level question will be addressed with a quasi-experimental interrupted time series design. This is a powerful design for analyzing community-level trends before an intervention (in this case, the Family Support Center) and the change in the trend after the intervention. Aggregate data on several community indicators will be collected, including data on economic trends, education, community health, poverty and unemployment, and crime.
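Interrupted time series data of this kind are often analyzed with segmented regression; the sketch below uses an invented quarterly community indicator and intervention date, not Old Grove data.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Illustrative segmented regression for an interrupted time series.
    rng = np.random.default_rng(2)
    t = np.arange(40)                         # 10 years of quarterly observations
    post = (t >= 20).astype(int)              # FSC opens at quarter 20
    t_since = np.where(post == 1, t - 20, 0)  # quarters elapsed since opening

    rate = 50 - 0.1 * t - 2.0 * post - 0.3 * t_since + rng.normal(0, 1, 40)
    df = pd.DataFrame({"rate": rate, "t": t, "post": post, "t_since": t_since})

    # 'post' estimates the level shift at the intervention;
    # 't_since' estimates the change in trend afterward.
    fit = smf.ols("rate ~ t + post + t_since", data=df).fit()
    print(fit.params)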

The strength of the experimental aspect of the design lies in its ability to maximize the internal validity of the conclusions. Nonetheless, there are problems with such designs, particularly for social interventions like family support centers. One obvious issue is defining the treatment itself: the Family Support Center provides many services in many different forms. A second issue is that an experimental design is difficult to carry out in a field setting, as opposed to a laboratory; families move, drop out of the program, or otherwise remove themselves from the study.

The evaluation challenges are extensive. At the same time, the promise of the family resource movement, and of centers like the Robinswood Family Support Center, requires the strongest evaluation designs possible. The evaluators believe an experimental and quasi-experimental approach, as outlined here, would provide valuable data for program development and program support.

Mixing it Up: A Participatory Evaluation of Old Grove's FSC

The second team of evaluators—Karen Pittman and Merita Irby of the International Youth Foundation and Michele Cahill of the Fund for the City of New York—point out that researchers are increasingly recognizing the limitations of traditional evaluation. Researchers are joining with practitioners and stakeholders to seek methodologies that enable both systematic and independent assessment and encourage participant ownership of the goals, processes, and findings. Participatory evaluation is one such approach.

Their evaluation was written as an example of the type of preparatory document a participatory evaluation consultant team might produce for a program such as the Robinswood Family Support Center. It does not present an evaluation plan. Rather, it offers a framework to guide the Community Advisory Board (CAB) as it thinks through why and how it should plan and staff an evaluation and share findings with stakeholders. It offers the CAB a process for answering three broad sets of related questions:

  1. What are the goals and expectations for the Family Support Center?
  2. What questions should guide the planning and evaluation processes?
  3. How will the success of the Center be measured?
The team of evaluators recommends that the CAB consider a four-step process.

Step #1: This step involves a retreat to identify the goals, strategies, and expected services of the Robinswood Center. CAB members would define long-term individual outcome goals (e.g., improve job readiness), immediate strategic goals (e.g., engage the community, eliminate service fragmentation), process or implementation goals (e.g., use family advocates), specific service goals (e.g., offer child care), and concrete impact goals (e.g., serve 75 children). An evaluation committee would be formed to work with the evaluation consultants.

Step #2: Next, the consultants would work with the CAB to write a report laying out an evaluation framework in simple, straightforward terms and inviting specific comments and definitions of indicators. This report would be shared with all the stakeholders, including staff, participants, community leaders, other service providers, and funders. This is the crux of a participatory evaluation: involving the stakeholders in an interactive, “mixing it up” process (interviews, focus groups, and public meetings) that allows them to take ownership of the evaluation process.

Step #3: The CAB, or its evaluation committee, is then ready to work with the evaluation consultants to develop an evaluation plan. Again, this is an interactive process. This step allows stakeholders to prioritize the types of outcomes desired and to define specific criteria for success. The evaluation plan flows directly from the planning questions that have been raised.

Step #4: The final step involves developing a management information system to collect, analyze, and disseminate data and information. The management information system would include “hard data” (e.g., family profile forms and school records), as well as the information gathered during the interviews, focus groups, and public meetings. The findings would be disseminated frequently, for example, in the form of a newsletter.
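As one possible shape for such a record (the evaluators do not specify a schema), a hypothetical sketch in Python, pairing hard data with qualitative material:

    from dataclasses import dataclass, field

    # Hypothetical sketch of one MIS record; the schema, field names, and
    # sample values are assumptions, not details from the case study.
    @dataclass
    class FamilyRecord:
        family_id: str
        profile: dict                 # hard data: family profile form entries
        school_records: list          # hard data: linked school records
        interview_notes: list = field(default_factory=list)    # qualitative
        focus_group_notes: list = field(default_factory=list)  # qualitative

    record = FamilyRecord(
        family_id="F-001",
        profile={"children": 2, "enrolled": "2024-01-15"},
        school_records=["attendance.csv"],
    )
    record.interview_notes.append("Parent reports easier access to child care.")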

According to the evaluators, one of the most powerful aspects of a participatory evaluation approach, especially when implemented in the early stages of program development, is that planning, implementation, and evaluation can be intertwined. Programs and services can be added, deleted, or revised. Rather than taking statements of goals and objectives as givens to be tested, a participatory approach can be used to develop a dialogue between planning and evaluation processes that strengthens both.

Participatory evaluation substitutes multiple and varied “snapshots” of progress and process for a single, more elaborate but narrowly focused snapshot (hence the evaluators' strong recommendation for an on-site coordinator to synthesize these multiple perspectives). If done correctly, they believe, these snapshots will present a truer and more useful picture to guide planning, generate ownership, and ensure that an impact is made.

A User-Focused Evaluation of Old Grove's FSC

Becky Winslow of the Georgia Department of Human Resources uses a self-evaluation approach, one in which evaluation is wholly integrated with an ongoing strategic planning process. Her approach has three parts: getting started; organizing and managing the evaluation; and the evaluation plan itself. One of the primary goals of this approach is to develop an information system that supports ongoing data entry for evaluation as an integral part of service delivery and management, thereby reducing the burden of data collection and reporting.

The evaluation of the Robinswood FSC would be both formative and summative and would use multiple measures with multiple groups of subjects. The proposed plan would proceed in three phases over approximately two years, by the end of which the capacity to collect process, impact, and outcome data, and the ability to use those data, would be in place.


Phase 1: Defining the Evaluation Questions; Methods and Data to be Collected; Information and Reporting.

In collaboration with a steering group created early in the process, the evaluator aims to define the questions the evaluation is intended to answer and to design an evaluation process to get the answers. The proposed first phase of the evaluation involves understanding the context in which the Robinswood Family Support Center was conceived, developed, and initially implemented.

Data for the evaluation would be quantitative and qualitative and derived from a variety of sources. Phase 1 would produce two studies: a policy/context study of Old Grove and Robinswood, and an implementation study.

Phase 2: Methods and Data to be Collected; Information and Reporting.

The second phase would include process and impact studies. In this phase, baseline data on the Robinswood community and the city of Old Grove would be developed (e.g., the demography of the residents), and indicators would be selected, such as infant mortality, low birth weight, immunization status, teen pregnancy, and high school completion. Phase 2 would produce two reports, a Process Study Report and an Outcome Study Report.
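Indicators such as these are conventionally expressed as rates; the following sketch uses invented counts, not Old Grove data.

    # Illustrative only: the counts below are placeholders.
    def rate_per_1000(events: int, population: int) -> float:
        """Events per 1,000 members of the reference population."""
        return 1000 * events / population

    infant_mortality = rate_per_1000(events=9, population=1200)   # per 1,000 live births
    low_birth_weight = rate_per_1000(events=84, population=1200)  # per 1,000 live births
    print(f"Infant mortality: {infant_mortality:.1f} per 1,000 live births")
    print(f"Low birth weight: {low_birth_weight:.1f} per 1,000 live births")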

Phase 3: Methods and Data to be Collected; Information and Reporting.

The methods and data collected in Phase 3 are similar to those of Phase 2, and process and outcome study reports would again be produced. Databases for longitudinal analyses would be established, and local staff would continue to be trained in self-evaluation practice.

Many benefits of self-evaluation are noted in Becky Winslow's evaluation. One of the greatest, as she points out, is having sensitive, community-owned evaluative data for strategic use in ongoing decision making about policy, programs, and budgets. Other benefits include:

  • Ownership of the process and the results by users at all levels is possible with a self-evaluation approach, which can also build community when the successes the evaluation reveals are celebrated.

  • A self-evaluation approach can lead to improved management and regular “course corrections” of strategies based on informed decision making.

  • Self-evaluation makes possible local accountability, which encourages local program investment.

  • Evaluators become part of the team and are viewed as extensions of internal resources.

