
The Harvard Family Research Project separated from the Harvard Graduate School of Education to become the Global Family Research Project as of January 1, 2017. It is no longer affiliated with Harvard University.


Program Description

Overview The Walnut Street Elementary After School Program (WSEASP) was one component of the William Penn Initiative for Neighborhood Success (WINS), an initiative to provide youth in the William Penn School District (in the Philadelphia metropolitan area) with summer and after school services. WSEASP was one of three elementary school after school programs (operating alongside one middle school after school program) in the district funded by a 21st Century Community Learning Center (21st CCLC) grant and jointly operated with Foundations, Inc. The after school program was designed primarily to support improved youth academic outcomes through a supplemental curriculum, basic skills instruction, and homework assistance. All programs were discontinued after the 2002–2003 school year, when the grant ended.
Start Date fall 2000; completed spring 2003
Scope local
Type after school
Location urban
Setting public school
Participants elementary school students
Number of Sites/Grantees 1
Number Served 105 in 2002–2003
Components WSEASP operated from 3:00 p.m. until 6:00 p.m. Monday through Friday on days when school was in session. The majority of the program time was devoted to a supplemental curriculum, basic skills instruction, and homework assistance, delivered in partnership with Foundations, Inc., a local provider of out-of-school time programming. Fridays were typically reserved for clubs or other special events. According to program plans, on Monday through Thursday, youth would spend the first 45 minutes at recess and then eat a snack. Youth were then to be divided among four instructional groups based on age, with the remaining program time devoted to the academic support services. Each instructional group was to be led by two staff members (one program teacher and one assistant).

Once youth were in their instructional groups, program teachers were expected to implement the Foundations, Inc. curriculum, which was developed to reflect state standards in reading and math and was organized around themed units (e.g., Global Festivals). Although the curriculum was developed for third graders, program teachers were expected to modify the activities as appropriate for the age level of participants. Within the instructional groups, teachers were then to begin basic skills instruction, known as Foundations Achieves Math and English (FAME) activities. During the last 30 minutes of the program, youth were to be given time to work on and receive assistance with their homework. The short time devoted to homework relative to the other academic programming reflected Foundations, Inc.'s belief that homework should not be finished during the program, so that youth and their parents could work on it together in the evening. Finally, a Walnut Street Elementary classroom teacher was hired to provide tutoring to fifth-grade youth 3 days a week in preparation for statewide standardized testing.

Program teachers had several means of supporting their implementation of the planned programming. First, Foundations, Inc. offered professional development at the beginning of each school year, focused on the planned curriculum activities for the year. Second, the program had several means of identifying areas of youth academic success and weakness in order to better plan the basic skills instruction. For example, the Head Teacher had access to youth's monthly FAME test data and to biannual skills assessments administered by Foundations, Inc. These data were to be passed on to the program teachers, who then had the flexibility to decide how to use them to inform their FAME instruction. The school's interim principal also shared participants' PSSA and district-administered TerraNova test data with the Head Teacher, and youth were asked to show their report cards to program staff.
Funding Level Approximately $150,000 for WSEASP for 2002–2003
Funding Sources 21st Century Community Learning Centers program, parent fees


Overview An experimental impact study and process evaluation began at the WSEASP during the 2002–2003 school year and was to be expanded during the 2003–2004 school year to the other two elementary school after school programs funded by the 21st CCLC grant in the district. While the loss of the program in 2003 prevented the study’s expansion, it provided the opportunity to answer questions concerning response to program loss, such as how the loss of the program impacted families. Outcomes were collected for the initial experimental WSEASP study sample for 2 years (following the first year of program enrollment and then again after the program ended), and a descriptive study of parental response to the loss of programming was added during the 2003–2004 school year.
Evaluator Susan Goerlich Zief, University of Pennsylvania
Evaluations Profiled A Mixed-Methods Study of the Impacts and Processes of an After-School Program for Urban Elementary Youth
Evaluations Planned None
Report Availability Zief, S. G. (2005). A mixed-methods study of the impacts and processes of an after-school program for urban elementary youth. Unpublished doctoral dissertation, University of Pennsylvania.


Evaluation Susan Goerlich Zief, Ph.D.
Mathematica Policy Research
600 Alexander Park
Princeton, NJ 08540
Tel: 609-275-2291
Fax: 609-799-0005
Program Susan Goerlich Zief, Ph.D.
Mathematica Policy Research
600 Alexander Park
Princeton, NJ 08540
Tel: 609-275-2291
Fax: 609-799-0005
Profile Updated July 5, 2006

Evaluation: A Mixed-Methods Study of the Impacts and Processes of an After-School Program for Urban Elementary Youth

Evaluation Description

Evaluation Purpose To address four questions: What impact does WSEASP have on adult supervision and support of youth, youth activity participation, youth behavior, youth social and emotional development, youth academic outcomes, and parental outcomes? What is the demand for after school programming among WSES youth? What are the program’s key operational features and strategies? How does the loss of programming affect youth, their families, and the surrounding community?
Evaluation Design Experimental and Non-Experimental: There were two primary evaluation components: a randomized impact study designed to measure WSEASP’s impact on 35 outcomes and a non-experimental process evaluation of program activities and operations used to contextualize and understand the impact estimates. The randomized study consisted of 102 youth who had not previously participated in the program, while the non-experimental study included both new and returning participants, parents, staff, nonparticipants, and program observations.

All youth who participated in the program prior to the 2002–2003 school year were offered a program slot and invited to be part of the overall study for the purpose of helping evaluators understand participant interest in and response to the program. However, the sample of returning youth was not included in the random assignment impact analysis.

At the beginning of the school year, program slots were filled mostly by returning youth, but over the course of 6 months, enrollment was eventually offered to 40 youth selected at random from 118 new applicants. These spaces were made available after an additional teacher was hired by the program and after some returning youth stopped attending the program. The 40 youth who were selected to participate comprised the experimental group, and the remaining 62 applicants who consented to participate in the study but were not assigned to the program constituted the control group. There were no significant differences on any measured background characteristics between the program and control groups.

All program and control youth from the 2002–2003 experimental sample who were still living in the area and attending district elementary schools in 2003–2004 were included in the 2nd-year follow-up sample. The 2002–2003 experimental sample closely resembles the 2003–2004 experimental sample, allowing for a comparison of impact findings across the 2 years. Similarly, the experimental and control groups within the 2003–2004 sample are similar on all measured background characteristics except that a significantly greater percentage of program youth were receiving special education services (p = .02).

The control group response rates (youth: 80%, parents: 59%, teachers: 83%) were lower than the participant group response rates (youth: 96%, parents: 85%, teachers: 90%) for all surveys conducted in spring 2003. Similar patterns of response rates were obtained in the 2004 survey distribution (control group youth: 84%, control group parents: 71%, participant youth: 94%, participant parents: 100%). In 2003, parents who responded were more likely to have children who were eligible for free or reduced price lunch (p = .01) and were more likely to be parents of daughters (p = .02) than parents who did not respond. Also, respondents were more likely to be parents of children who were eligible for special education services (p = .12) than nonrespondents. These patterns of nonresponse were similar in both groups. Because of these patterns of differential survey response, these background characteristics were controlled for in the analyses, along with family background, youth demographics, and youth’s previous academic performance.

The process evaluation was designed to capture the range of programming experiences (academic support, recreational activities, and homework assistance), relationships between students and the staff, stakeholder response to the program, and response when programming unexpectedly ended for the 2003–2004 school year.

Data were also collected from 58 students enrolled in Walnut Street Elementary who had never applied to the program to examine if and how fall 2002 program applicants differed from other youth in the school.
Data Collection Methods Interviews/Focus Groups: The program’s Head Teacher was interviewed in January 2003, and then again before his departure from the program in April 2003. The new Head Teacher responded to informal questions immediately after her hire, and then a more formal interview was conducted with her at the end of the 2002–2003 school year. The research team was in frequent contact with the district’s program coordinator, and data were collected from these informal conversations. Frequent trips to the school provided opportunities to speak with the school’s interim principal on many occasions; a formal interview was then conducted at the end of the 2002–2003 school year.

A focus group was conducted with program teachers and assistants in spring 2003 to probe further on many of the survey questions the program teachers had recently completed, with a specific focus on suggestions for program improvement.

Observation: During the 2002–2003 school year, 20 separate program observations were made by two observers, and descriptive field notes were transcribed shortly after each observation. Observers paid particular attention to the program schedule, activities, and relationships between youth and staff.

Secondary Source/Data Review: Youth’s school records provided demographic data (including gender, grade in school, age, race, and special education status), school attendance/absence data, and academic outcome data (math and reading grades). These data were collected at baseline (following the 2001–2002 school year), and after 1 year of the impact study (summer 2003).

Surveys/Questionnaires: In spring 2003, surveys were administered to program and control group youth in grades 3 through 6, classroom teachers, and parents to understand youth supervision and support, youth participation in activities, youth behaviors, youth social and emotional development, youth academic engagement, and parental outcomes (e.g., ability to work, parent involvement in school). Follow-up surveys were distributed to youth and parents of the 2003–2004 experimental sample in winter 2004. Youth surveys included questions related to parental help with homework; availability of homework help; art/music/dance/drama lessons and tutoring received; participation in school activities and clubs; time spent hanging out with friends, watching television, and doing homework after school; deviant behaviors (skipping school, taking something without paying, smoking, drinking alcohol, using drugs); safety after school; college aspirations; school attachment; and self-esteem. Parent surveys included questions related to their child’s unsupervised time; whether they missed work because of childcare needs; and concerns about their child’s academics, safety after school, getting into trouble after school, friends, and substance use. Teacher surveys included questions related to youth’s academic focus and teachers’ use of discipline with the youth.

Surveys administered to program participants and their parents in spring 2003 also asked about key aspects of the program, the quality of program activities, and overall satisfaction with the program. Both treatment group members and other program participants who were not randomly assigned were surveyed. Program teachers completed a survey which asked them about the program’s organization and activities. Classroom teacher surveys asked about their knowledge of the program’s goals and activities and suggestions for improving the integration of the program with youth experiences during the normal school day.

In winter 2004, surveys were administered to former program enrollees and their parents to better understand their response in the wake of the loss of programming.

Tests/Assessments: Two observation tools were used to gauge program quality: the School-Age Care Environment Rating Scale (SACERS; Harms, Jacobs, & White, 1996) and the Quality Assurance System (QAS) developed by Foundations, Inc. The two program observers used the SACERS during one of the scheduled program observations. External consultants hired by Foundations, Inc. rated the site using the QAS in spring 2003. The SACERS was specifically designed to rate after school care arrangements for elementary youth, and is based on criteria for developmental appropriateness for school-age children. The ratings are based on a 7-point scale, including “inadequate” (1), “minimal” (3), “good” (5), and “excellent” (7). The QAS examines program space; health and safety; program materials and supplies; program structure; staff development; staff responsibilities, involvement, and interaction; parent responsibilities, involvement, and interaction; district staff and community partnership, involvement, and interaction; and program content. Ratings are based on a 4-point scale, including “non-existent” (0), “below standard” (1), “standard” (2), and “above standard” (3).

Harms, T., Jacobs, E., & White, D. (1996). School age care environment rating scale. New York, NY: Teachers College Press.

Foundations, Inc. Quality Assurance System.
Data Collection Timeframe Data were collected between 2002 and 2004.

Formative/Process Findings

Activity Implementation Interviews and observations revealed that WSEASP did not follow the planned academic program’s structure. Instead, the enacted program reflected the more relaxed goals of the staff, who wanted to create an environment where youth could enjoy themselves and “just be kids.” Reflective of these beliefs, recess and snack activities often comprised the majority of the program time. After a new Head Teacher arrived in April 2003, most program time was devoted to dealing with discipline issues.

According to program observations, on a typical program day, youth first gathered for recess. Recess activities were fairly similar, regardless of whether they occurred indoors or outside, and were not judged to be of the quality that would engage youth interest over the long term. Most boys played basketball or football, sometimes joined by a male program assistant. Girls generally jumped rope or talked in small groups. When recess was outdoors, some youth chose to play on the playground equipment. Because several program teachers could not be at the program when it began at 3:00, few staff were available to assist with recess and to organize recreational activities. There was little variation in activities, and staff generally did not initiate or supervise any organized games.

When the academic program was implemented, it was not implemented consistently across the instructional groups, and the overall quality was observed to be low. One of the program teachers brought enthusiasm and creativity to the curriculum instruction. In the other three instructional groups, however, the observed curriculum activities were mostly well below the youth's level, when they were implemented at all, and teachers rarely followed the curriculum activities with basic skills instruction. The research team noted that at least two of the instructional groups received very little valuable academic instruction during the entire program year.

The homework assistance was often unorganized and poorly monitored, and youth’s work was not usually checked for completion or understanding. Many youth, especially the older ones, spent little time on homework and instead turned to computer or board games.

Observations of the tutoring sessions revealed that they were offered to more youth than those originally targeted for standardized test preparation (the fifth-grade program participants), that the sessions were rowdy, and that they consisted mostly of math games rather than focused preparation for the upcoming PSSA.

While Fridays were usually devoted to clubs, evaluators noted a lack of available and interesting options. Program teachers were expected to arrive each Friday with a planned club activity for their group, but few did so. One program teacher often arrived with interesting ideas, but she was hampered by a lack of materials to implement them unless she brought her own. On a typical club day, a movie was usually shown to the entire group. The older youth, who were unenthusiastic about the movie choices, would go to the gymnasium for unorganized games instead.

The “one size fits all” program curriculum was unsuccessful with youth and staff alike. The Head Teacher explained that the staff and youth “despised” the curriculum, finding it “repetitive, very boring, and full of drawing pictures” and well below the achievement level of the older youth. Despite the curriculum developers' assumption that the lessons could be modified to meet youth's achievement levels, teachers found the lessons difficult to adapt or had little time to do so. As the Head Teacher explained, “The older kids think the curriculum is too low-level for them. It is really difficult to ‘scale up’ activities for the older kids—how do you scale up a lantern dance?”

Program teachers and assistants suggested that the curriculum could be improved with more “hands-on” lessons that asked youth to use their creativity. Also, they supposed that activities that had particular connections to the youth’s age and issues they frequently encountered in their immediate environment, like poverty and substance abuse, would have been engaging. The Head Teacher added that the older youth would have been more interested in activities that took them out of the school and into the community. As he explained, the older youth were especially resistant to doing “school after school,” and the program needed a more creative approach to engaging these youth.
Program Context/Infrastructure According to evaluator observations, the space available for the program’s academic and recreational activities was not optimal. Outdoor recess activities occurred on the school’s playground, which consisted of concrete play surfaces that were in need of repair and two pieces of playground equipment that were old, not well maintained, and on a patch of hard dirt. When outdoor play was not possible, recreational activities were held in the gymnasium, but after a flood warped the floor, activities in the gymnasium were limited.

Some of the program space was found not to be conducive to academic work. The Head Teacher reported that only two classroom teachers had volunteered their classrooms for program use; these spaces were given to the two youngest instructional groups. The two older instructional groups worked in the cafeteria. One group gathered on a stage, which housed few desks and had no board on which the teacher could write, while the other group worked at the cafeteria tables.

Program teachers identified the need for supplies to support recreational and academic activities, such as books, problem-solving games, educational computer software, and sports equipment. The lack of Internet connection was also cited as a drawback by program teachers, who suggested that youth’s interest in using the program’s computers could have been enhanced with the availability of online resources, while the staff may have used the Internet to find activities to supplement the curriculum and recreational programming.

Both the SACERS and QAS rubrics generally supported the observation and interview data, which suggested the limitations of the program’s space and materials.
Program–School Linkages While the program received several sources of youth data, including test scores and report cards, this information did not appear to be readily available to the program teachers, and there was little time for them to use any data they did receive. Program teachers reported that they had a general knowledge of who experienced disciplinary and general academic problems during the school day, but used very little additional information to plan programming or assist with homework. Program teachers recommended receiving periodic feedback from classroom teachers on the youth in their instructional groups. Some classroom teachers also recognized the lack of communication as a weakness and suggested “time to meet and discuss youth” and “more open dialogue about the youth participating in the program.”

The program teachers suggested that youth may have gained more of an academic boost if their after school program work had been more aligned with what they were learning in the classroom.
Recruitment/Participation Applicants in September 2002 and nonapplicants differed on three of the dimensions measured. First, parents of younger youth were more interested than parents of older youth in enrolling their child in the program (p = .01). Second, applicants had lower academic achievement during the previous (2001–2002) school year than nonapplicants (p = .01). Third, applicants included a lower proportion of White youth and a higher proportion of African American youth than did nonapplicants (p = .09).

Of the 40 youth randomly assigned to WSEASP, 22 youth actually attended. There were no measured differences between attendees and nonattendees, including the number of weeks a student was eligible for program participation.

Days of program attendance per youth ranged from 0 days (45% of the program group) to 132 days over the 2002–2003 program year. Only 35% of youth randomly assigned to the program attended for more than 60 days; the average participation rate of program enrollees was 61% of the days for which they were eligible to attend. The number of days of program attendance was not related to the number of weeks a youth was eligible for the program or to any other factors.

Youth who had not attended the program 2 or more days the week before survey distribution were asked to identify why their attendance had been low. While 40% of these youth did not identify a specific reason, 20% said they did not attend because the program was boring, and 40% said their parents restricted their involvement with the program the week before.

According to survey responses, parents were not as interested in the planned academic programming as they were in having dependable, affordable care while they worked (64% of respondents). In fact, parents did not expect that the program would improve youth’s academic achievement (0%), and few thought that the program primarily served the purpose of helping youth complete their homework (3%).
Satisfaction The majority of program participant survey respondents were not enthusiastic about their program experiences: 52% reported that the “program is OK” and another 22% reported that they did not like the program. Additionally, 45% reported that they would not recommend the program to a friend, and 35% did not look forward to going to the program. Yet, 74% of youth reported making four or more new friends at the program.

Youth also reported dissatisfaction with the program’s disciplinary techniques: 75% of survey respondents agreed or strongly agreed that there were too many rules to follow at the program, and 68% agreed or strongly agreed that program teachers punished kids without knowing what happened.

In contrast, a majority of parental survey respondents reported that their child enjoyed the program (97%), that they were satisfied with the overall program (91%), that they agreed or strongly agreed that they were pleased with the number and variety of program activities (88%), and that they found the program to be a positive and safe place for their child (80%). Additionally, 88% of parent respondents gave the program either an A or a B grade.
Staffing/Training Interviews revealed that program and school staff had a strong relationship. Without any formal agreement, classroom teachers and the Head Teacher knew each other and understood the need for communication about youth. The Head Teacher reported frequent conversations with the classroom teachers, which occurred by chance in common gathering areas throughout the school. Classroom teachers expressed a willingness to share data on the youth who enrolled in the program to support the academic services.

The research team observed many interactions and received survey responses that suggested that strong, positive relationships had developed between some program staff and families over the many years that the program had been available. The staff likened themselves to an extension of, or surrogate for, youth’s families and called the program “home life” for many youth. In many ways, the staff seemed to be aware of youth needs and preferences, providing the potential to create a program that was relevant to youth’s experiences and expectations. In fact, on the day the program was assessed using the QAS, the observer found that the program was “above standard” in all areas of staff supervision, involvement and interaction, giving the program 27 out of a possible 27 points. On the other hand, SACERS ratings implied that the relationships between youth and staff were inconsistent and varied by staff. While one program teacher was rated as “excellent” in interactions with youth, relationships between other staff and youth were rated as “inadequate” and “minimal.”

The program hired staff who were not available for the entire time youth were present. Three program teachers traveled to WSEASP from other schools where they taught during the school day, often arriving close to 4:00 p.m., almost 1 hour into the program day. Program teachers noted that youth may have been better able to focus on academic programming at the beginning of the program day rather than at the end, yet without sufficient staff at that time, the academic components could not be implemented earlier. Furthermore, one program teacher was only available to teach at the program 3 days a week; on the other 2 days, the assistant often took the youth to the gym to play basketball during the time other youth were in their instructional groups.

The quality of the professional development experiences available to the staff was rated poorly by program teachers and likely did not support improved instruction. Program teachers described the mandatory sessions at the beginning of the year as “repetitive and boring” and reported that they would have brought more enthusiasm to their implementation of the academic programming if they had had a larger repertoire of ideas and materials on which to draw. However, the Head Teacher implied that, as veteran classroom teachers, the program teachers did not need professional development.

The quality of the academic programming was not well monitored, despite recognition from program leadership that it was not well implemented.

In April 2003, the program's Head Teacher left the program. A replacement came to WSEASP with many years of experience in the after school field, in academically focused programs. Despite the new Head Teacher's familiarity with the program and staff, the transition in leadership was difficult. For example, on June 2003 surveys, 96% of participants reported that they were “happy” or “very happy” with the first Head Teacher, while only 38% were happy or very happy with the new Head Teacher. Program teachers and assistants reported liking the first Head Teacher’s flexibility and ability to adjust to the needs of the youth.

Summative/Outcome Findings

Academic No significant effects were found on academic outcomes.
Family No significant effects were found on parent outcomes.

When the program was no longer available during the 2003–2004 school year, program parents were significantly more concerned than control group parents about their child’s safety after school (p < .05). However, this difference was not due to an increase in concern among parents of program youth (there was only a slight rise, from 85% in 2003 to 90% in 2004). Instead, the finding reflects a decrease in the percentage of control group parents who were concerned about safety (from 86% in 2003 to 58% in 2004).

After the loss of the program, one third of the parents who had formerly enrolled their children reported either that the loss had no major impact or that they did not look for alternative programming. While 24% of parents reported that they eventually found after school care for their child, none identified an organized program that their child attended. Several parents identified a new activity at school; however, the principal reported that no organized programming had developed. Instead, the principal described former program participants loitering in the hallways until parents picked them up.

After the loss of the program, more than 25% of parents reported an impact on their work hours, and half of these respondents (12% of overall respondents) were losing pay or their employment as a result of efforts to supervise their child after school. According to parental reports, 12% subsequently left their child unsupervised after school.
Youth Development Program youth saw a tutor more frequently than did control group youth, and this difference was significant (p < .05).

Program youth spent time with friends in the neighborhood after school less frequently than did control group youth; this difference was significant (p < .05).

Program youth watched television in the afternoon on fewer days than control group youth (this difference was significant, p < .001), yet watched no less television in a given day than did their control group counterparts. These impacts did not persist at the time of the 2004 follow-up (after the program had been terminated).

No significant effects were found on social or emotional outcomes or on other behavioral outcomes.

© 2016 Presidents and Fellows of Harvard College
Published by Harvard Family Research Project