The Harvard Family Research Project separated from the Harvard Graduate School of Education to become the Global Family Research Project as of January 1, 2017. It is no longer affiliated with Harvard University.
Volume II, Number 2, Fall 1996
Issue Topic: Family Resource Centers
Special Feature
Old Grove's Early Childhood Education and Family Support Center Initiative
A coalition of representatives from Old Grove was brought together to develop a plan to improve children's long-range education, development, health, and well-being outcomes. The coalition included city council members, community and child welfare advocates, representatives from the local hospital and schools, state agencies, and business people. The central feature for implementing the initiative is the development of neighborhood-based, community-managed family centers that incorporate and coordinate a range of child and family services at a site located near an elementary school.
The initiative is beginning the first year of a five-year funding period. Presently, one family support center, the Robinswood Early Childhood and Family Support Center, has been established. Depending on Robinswood's success, four other FSCs, with similar program designs, will be implemented over the next five years. The yearly FSC operating budget of $600,000 comes from state, federal, and private funds.
The centers will serve a diverse population of mostly white, African-American, and Puerto Rican families in communities where poor and working-class families predominate. FSC services are open to low-income and poor families with children ages 0 to 11 living in the targeted communities.
The FSC service delivery model was designed to provide coordinated services, with case management used to organize the delivery and referral of services. Mandated services for all FSCs include: case management; home visits; parent education classes; adult literacy classes; ESL classes; health and development screening for children; tutoring for school-age children; enrichment programs for preschool-age children; child care; toy lending; support groups; mental health, substance abuse, family planning, budgeting, and housing services; other appropriate family counseling services; father-focused parenting and support group services; and employment training.
An Executive Management Team (EMT) was set up to guide the project through the recruitment, hiring, and start-up phases. The EMT was intended as a transitional structure to be partially phased out and replaced with a Community Advisory Board (CAB). The EMT sets minimum service delivery parameters and approves funding allocations and spending for all FSCs. It is intended, however, that each center's CAB will jointly plan, implement, and evaluate services and procedures, and take on outcome responsibility for its particular center.
A foundation is funding one-half of the five-year FSC evaluation costs. Another foundation is funding one-third of the evaluation costs and the state is funding the remainder. The evaluation budget is approximately $100,000 per year for five years (total budget is $500,000). Each of the funders has its own program performance reporting requirements.
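The funding split above can be checked with simple fraction arithmetic. A minimal sketch, using only the shares and the $500,000 total stated in the text (the funder labels are placeholders, since the foundations are unnamed):

```python
from fractions import Fraction

TOTAL = 500_000  # five-year evaluation budget stated in the text

shares = {
    "foundation_a": Fraction(1, 2),  # funds one-half of the evaluation costs
    "foundation_b": Fraction(1, 3),  # funds one-third
}
# The state covers whatever remains after the two foundations.
shares["state"] = 1 - sum(shares.values())

dollars = {funder: share * TOTAL for funder, share in shares.items()}

assert sum(shares.values()) == 1          # the three shares cover the full budget
assert shares["state"] == Fraction(1, 6)  # the state's remainder is one-sixth
```

So the state's "remainder" works out to one-sixth of the total, roughly $83,000 over five years.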
Evaluators will have access to MIS data from the school district, local social service, and public health agencies. This data includes: family profile forms, adult educational and vocational training records, child care and education records, developmental screening and assessment forms, family and child service plans, family service contact forms, and referral forms.
Evaluating Old Grove's FSC Using an Experimental and Quasi-Experimental Approach
The first team of evaluators—Sally Ward, Kathleen McCartney, and Cynthia Duncan from the University of New Hampshire—used an experimental and quasi-experimental approach. According to the evaluators, the proposed evaluation is constrained by the available resources ($100,000 per year) and by the fact that some concrete findings must be reported within 18 months.
Their evaluation addressed both the short- and long-term goals of the Center by focusing on three sets of questions at three levels:
Research Question Two. A simple pre- and posttest design is not a strong design for drawing conclusions about program outcomes. To answer questions about the impact of the FSC on individual children and families, the evaluators proposed a randomized controlled experiment with two treatment groups and one control group. The data will be collected by periodic interviews with the families.
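The random assignment the evaluators propose could be carried out along the following lines. This is only an illustrative sketch: the group names, family IDs, and round-robin allocation are assumptions; the text specifies only two treatment groups and one control group.

```python
import random

def assign_groups(family_ids, seed=0):
    """Randomly assign families to two treatment groups and one control group."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible and auditable
    ids = list(family_ids)
    rng.shuffle(ids)
    groups = {"treatment_1": [], "treatment_2": [], "control": []}
    names = list(groups)
    for i, fam in enumerate(ids):
        # Round-robin over the shuffled list keeps the three arms balanced in size.
        groups[names[i % 3]].append(fam)
    return groups

# Hypothetical example: 90 enrolled families split 30/30/30.
groups = assign_groups(range(90))
```

A fixed seed is one simple way to document the assignment for later audit; a real trial would also record the assignment roster at enrollment.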
Research Question Three. The community level question will be addressed with a quasi-experimental interrupted time series design. This is a powerful design for analyzing community-level trends before an intervention (in this case, the Family Support Center) and the change in the trend after the intervention. Aggregate data on several community indicators will be collected, including data on economic trends, education, community health, poverty and unemployment, and crime.
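The logic of an interrupted time series design can be sketched by fitting separate trend lines before and after the intervention and comparing them. This is a deliberately simplified illustration with made-up data; a real analysis would use many more observations and model seasonality and autocorrelation.

```python
def fit_line(xs, ys):
    """Ordinary least squares for one predictor: returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

def trend_change(series, intervention_index):
    """Compare the pre- and post-intervention trends in a community indicator."""
    xs = list(range(len(series)))
    pre = fit_line(xs[:intervention_index], series[:intervention_index])
    post = fit_line(xs[intervention_index:], series[intervention_index:])
    return {"pre_slope": pre[1], "post_slope": post[1],
            "slope_change": post[1] - pre[1]}

# Hypothetical yearly indicator (e.g., a rising community health rate) that
# levels off after the center opens in year 5:
series = [50, 52, 54, 56, 58, 58, 58, 58, 58, 58]
result = trend_change(series, intervention_index=5)
```

Here the pre-intervention slope of +2 per year flattens to 0 afterward; in the actual evaluation, such a change in trend would be tested for statistical significance.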
The strength of the experimental aspect of the design lies in the ability to maximize the internal validity of the conclusions. Nonetheless, there are problems with such designs, particularly for social interventions like family support centers. One obvious issue is the definition of the treatment itself. The Family Support Center provides many services in many different forms, and it is thus difficult to define the treatment. A second issue is that it is difficult to carry out an experimental design in a field setting, as opposed to a laboratory. Families move, drop out of the program, or otherwise remove themselves from the study.
The evaluation challenges are extensive. At the same time, the promise of the family resource movement and centers like the Robinswood Family Support Center require the strongest evaluation designs possible. The evaluators believe an experimental and quasi-experimental approach, as outlined here, would provide valuable data for program development and program support.
Mixing it Up: A Participatory Evaluation of Old Grove's FSC
The second team of evaluators—Karen Pittman and Merita Irby of the International Youth Foundation and Michele Cahill of the Fund for the City of New York—point out that increasingly researchers are recognizing the limitations of traditional evaluation. Researchers are joining with practitioners and stakeholders to seek methodologies that enable both systematic and independent assessment and encourage participant ownership of the goals, processes, and findings. Participatory evaluation is one such approach.
Their evaluation was written as an example of the type of preparatory document a participatory evaluation consultant team might produce for a program such as the Robinswood Family Support Center. It does not present an evaluation plan. Rather it offers a framework to guide the Community Advisory Board (CAB) as they think through why and how they should plan and staff an evaluation and share findings with stakeholders. It offers the CAB a process for answering three broad sets of related questions:
Step #1: This step involves a retreat to identify the goals, strategies, and expected services of the Robinswood Center. CAB members would define long-term individual outcome goals (e.g., improve job readiness), immediate strategic goals (engage the community, eliminate service fragmentation), process or implementation goals (e.g., use family advocates), specific service goals (e.g., offer child care), and concrete impact goals (e.g., serve 75 children). An evaluation committee would be formed to work with the evaluation consultants.
Step #2: Next the consultants would work with the CAB to write a report, laying out an evaluation framework in simple, straightforward terms and inviting specific comments and definitions of indicators. This report would be shared with all the stakeholders, including staff, participants, community leaders, other service providers, and funders. This is the crux of a participatory evaluation: involving the stakeholders in an interactive, “mixing it up” process (interviews, focus groups, and public meetings) that allows them to take ownership of the evaluation process.
Step #3: The CAB members, or its evaluation team, are then ready to work with evaluation consultants to develop an evaluation plan. Again, this is an interactive process. This step allows stakeholders to prioritize the types of outcomes desired and define the specific criteria for success. The evaluation plan flows directly from the planning questions which have been raised.
Step #4: The final step involves developing a management information system to collect, analyze, and disseminate data and information. The management information system would include “hard data” (e.g., family profile forms and school records), as well as the information gathered during the interviews, focus groups, and public meetings. The findings would be disseminated frequently, for example, in the form of a newsletter.
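The kind of record such a management information system might hold can be sketched as a structure that pairs “hard data” with qualitative findings. All field names here are hypothetical; the text specifies only that both kinds of information would be combined and disseminated frequently.

```python
from dataclasses import dataclass, field

@dataclass
class FamilyRecord:
    """One MIS entry pairing 'hard data' with qualitative findings (fields hypothetical)."""
    family_id: str
    profile: dict                # e.g., family profile form fields
    school_records: list         # e.g., attendance, grades
    qualitative_notes: list = field(default_factory=list)  # interview/focus-group excerpts

def newsletter_summary(records):
    """A frequent, plain-language dissemination format, as Step #4 suggests."""
    n = len(records)
    noted = sum(1 for r in records if r.qualitative_notes)
    return f"{n} families tracked; qualitative input on file for {noted}."

records = [
    FamilyRecord("F-001", {"children": 2}, ["attendance: 95%"], ["parent interview, June"]),
    FamilyRecord("F-002", {"children": 1}, ["attendance: 88%"]),
]
summary = newsletter_summary(records)
```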
According to the evaluators, one of the most powerful aspects of a participatory evaluation approach, especially when implemented in the early stages of program development, is that planning, implementation, and evaluation can be intertwined. Programs and services can be added, deleted, or revised. Rather than taking statements of goals and objectives as givens to be tested, a participatory approach can be used to develop a dialogue between planning and evaluation processes that strengthens both.
Participatory evaluation substitutes multiple and varied “snapshots” of progress and process for a more elaborate but singly focused portrait. (Hence their strong recommendation for an on-site coordinator to synthesize these multiple perspectives.) If done well, they believe these snapshots will present a truer and more useful picture to guide planning, generate ownership, and ensure that an impact is made.
A User-Focused Evaluation of Old Grove's FSC
Becky Winslow of the Georgia Department of Human Resources uses a self-evaluation approach, one in which evaluation is wholly integrated with an ongoing strategic planning process. Her approach includes: getting started; organizing and managing the evaluation; and the evaluation plan. One of the primary goals of this approach is to develop an information system that supports ongoing data entry for evaluation as an integral part of service delivery and management, thus reducing the burden of data collection and reporting.
The evaluation of the Robinswood FSC would be both formative and summative and would use multiple measures with multiple groups of subjects. The proposed plan would proceed in three phases over approximately two years, by which time the capacity to collect process, impact, and outcome data would be in place, along with the ability to use the evaluative data.
Phase 1: Defining the Evaluation Questions; Methods and Data to be Collected; Information and Reporting.
In collaboration with a steering group created early in the process, the evaluator aims to define the questions the evaluation is intended to answer and to design an evaluation process to get the answers. The proposed first phase of the evaluation involves understanding the context in which the Robinswood Family Support Center was conceived, developed, and initially implemented.
Data for the evaluation would be quantitative and qualitative and derived from a variety of sources. Phase 1 would produce two studies, a policy/context study of Old Grove and Robinswood and an implementation study.
Phase 2: Methods and Data to be Collected; Information and Reporting.
The second phase would include process and impact studies. In this phase, baseline data on the Robinswood community and the city of Old Grove would be developed (e.g., the demography of the residents), and indicators would be selected, such as infant mortality, low birth weight, immunization status, teen pregnancy, and high school completion. Phase 2 would produce two reports, a Process Study Report and an Outcome Study Report.
Phase 3: Methods and Data to be Collected; Information and Reporting.
The methods and data collected in Phase 3 are similar to those in Phase 2, and process and outcome study reports similar to the Phase 2 reports would be produced. Databases for longitudinal analyses would be established, and local staff would continue to be trained in self-evaluation practice.
Many of the benefits of self-evaluation are noted in Becky Winslow's evaluation. One of the greatest, as she points out, is having sensitive, community-owned evaluative data available for strategic use in ongoing decision making about policy, programs, and budgets. Other benefits include: