

The Harvard Family Research Project separated from the Harvard Graduate School of Education to become the Global Family Research Project as of January 1, 2017. It is no longer affiliated with Harvard University.



Program Description

Overview The Communities Organizing Resources to Advance Learning (CORAL) initiative in California worked to link communities, institutions, and residents around the common goal of improving youth academic achievement through the provision of structured literacy programming and enriching out-of-school time opportunities. Communities with low-income, low-achieving schools in five cities (Sacramento, San Jose, Fresno, Pasadena, and Long Beach) were part of the initiative. CORAL served youth in grades 1–8 (with the majority in grades 1–5).
Start Date The initiative launched in 1999, and programming started in 2001 in Long Beach and Pasadena, in 2002 in Fresno, and in 2003 in Sacramento and San Jose. The initiative ended in 2006.
Scope state
Type afterschool
Location urban
Setting community-based organizations and public schools
Participants elementary through middle school students
Number of Sites/Grantees 31 school-based sites and 6 community-based sites in 2004–2005; 30 school-based sites and 3 community-based sites in 2005–2006
Number Served Approximately 5,000 per year
Components

A lead agency (a local nonprofit organization) was selected by the initiative funder to plan and run CORAL in each city. The lead agency worked with schools and community-based organizations (CBOs) to provide afterschool programming for youth. CORAL cities initially implemented a variety of programming. The common components across sites were homework help and enrichment programming, which included activities such as art or dance, usually without an academic focus.

In the 2004–05 school year, CORAL began to provide more consistent academic programming as part of a new “balanced literacy” approach, focused on improving participants’ reading skills. CORAL provided its cities with information on two balanced literacy program models, Youth Education for Tomorrow (YET) and Kidzlit, but cities were free to implement other programs of their choice.

Cities were required to include balanced literacy programming 3–4 days per week for at least 75 minutes per day. Each lesson had to include an opportunity for staff to read aloud to youth and independent reading time with access to books organized by specific reading levels. The program also had to include other key balanced literacy strategies: book discussion, writing, vocabulary building, and “fun” activities to encourage literacy skills development. CORAL staff worked closely with school staff to select youth who could benefit most from the program, including those who were struggling academically or socially and/or who were English language learners.

Funding Level Each city received $2 million per year through 2003 from the James Irvine Foundation (Irvine). In 2004, Irvine reduced the annual funding to $1.6 million per year per city. In total, Irvine committed more than $58 million over 8 years (1999–2006).
Funding Sources James Irvine Foundation (all CORAL cities); U.S. Department of Education 21st Century Community Learning Centers (21st CCLC) program (four cities); and California Temporary Assistance for Needy Families (TANF) program (one city).

Evaluation

Overview The evaluation examined the implementation of program sites’ literacy strategies and the programs’ early effectiveness in retaining children and producing literacy gains during the first year after literacy programming was introduced (the 2004–05 school year). Following this phase of the evaluation, evaluators conducted a midcourse assessment of the program and then turned to focus on CORAL outcomes, lessons learned, and promising strategies for boosting student achievement through afterschool programming.
Evaluator Public/Private Ventures (P/PV)
Evaluations Profiled

Early Lessons from the CORAL Initiative

Findings from an Independent Evaluation of a Major After-School Initiative

Evaluations Planned None.
Report Availability

Arbreton, A. J. A., Goldsmith, J., & Sheldon, J. (2005). Launching literacy in after-school programs: Early lessons from the CORAL Initiative. Philadelphia: Public/Private Ventures. Available at: www.ppv.org/ppv/publications/assets/192_publication.pdf

Walker, G. (2007). Midcourse corrections to a major initiative: A report on the James Irvine Foundation’s CORAL experience. Philadelphia: Public/Private Ventures. Available at: www.irvine.org/assets/pdf/pubs/evaluation/Midcourse_Corrections.pdf

Arbreton, A. J. A., Sheldon, J., Bradshaw, M., & Goldsmith, J., with Jucovy, L., & Pepper, S. (2008). Advancing achievement: Findings from an independent evaluation of a major after-school initiative. Philadelphia: Public/Private Ventures. Available at: www.ppv.org/ppv/publications/assets/225_publication.pdf

Sheldon, J., Arbreton, A., Hopkins, L., & Grossman, J. B. (2010). Investing in success: Key strategies for building quality in after-school programs. American Journal of Community Psychology, 45(3–4), 394–404.

Contacts

Evaluation Amy J.A. Arbreton
Director of Research
Public/Private Ventures
Lake Merritt Plaza
1999 Harrison Street, Suite 1550
Oakland, CA 94612
Tel: 510-273-4600
Email: aarbreton@ppv.org
Program Anne Stanton
Youth Program Director
The James Irvine Foundation
575 Market Street, Suite 3400
San Francisco, CA 94105
Tel: 415-777-2244
Email: astanton@irvine.org
Profile Updated April 4, 2012


Evaluation 1: Launching Literacy in After-school Programs: Early Lessons from the CORAL Initiative



Evaluation Description

Evaluation Purpose

To address the following question: Can literacy activities be integrated into existing afterschool programs with sufficient quality to promote reading gains?

In order to answer this question, several additional issues were addressed:

  • Who participated in CORAL?
  • Was there early evidence of reading gains as a result of youth’s participation?
  • Did some youth benefit from CORAL more than others?
  • What strategies seemed to contribute most to reading gains?
  • What practices facilitated quality literacy instruction?
Evaluation Design

Quasi-Experimental and Non-Experimental: Across all 37 sites, surveys were completed in Spring 2005 by 412 CORAL staff members (a 73% response rate out of 564 staff members asked to complete surveys), including team leaders, site coordinators, volunteers, paraprofessionals, educational liaisons, city directors, literacy directors, and enrichment providers. Spring 2004 California Standards Test (CST) English Language Arts test scores were gathered from seven of the nine school districts serving CORAL participants in 33 of the 37 sites. Children’s behaviors were rated via a form completed by teachers for 288 of the 418 children (69%) attending the school-based CORAL sites. In addition, enrollment and attendance data were collected from 23 of the 37 sites (the other sites did not submit data in time to be included in the evaluation).

The remainder of the data collection focused on 4 or 5 research sites in each city, for a total of 23 sites. Sites were selected if they served elementary school youth and had been in operation for at least 1 year. Within these sites, the evaluation focused on third and fourth graders, because research suggests that this time period is crucial for reading skills development. Observations of balanced literacy activities were conducted at research sites with 56 groups of 12–20 third and fourth graders. Each group was observed 2–4 times during the year, with most observed 3 times (n = 48). Also at these sites, evaluators interviewed approximately 25 key informants per city (including CORAL staff such as team leaders, city directors, board members, and school staff such as principals) and conducted 19 parent focus groups with approximately 170 parents across groups.

Parents of 738 CORAL third and fourth graders at research sites agreed to allow their children to participate in the evaluation, out of 762 who completed permission forms, for a participation rate of 97%. A total of 635 of the youth with permission were randomly selected to be part of a youth survey and reading assessment cohort (approximately 125 per city), of which 515 completed a survey and 520 completed an informal reading inventory (IRI) from the Jerry L. Johns Basic Reading Inventory (Johns, 1997) in Fall 2004 (pretest). In Spring 2005 (posttest), the same IRI was administered to 383 youth still attending CORAL, for a follow-up rate of 74%.
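The cohort flow above can be checked with simple arithmetic. The sketch below is not part of the evaluation; it only reproduces the participation and follow-up rates from the figures reported in the text:

```python
# Year 1 reading-assessment cohort flow, using the counts reported above.
permission_forms = 762   # parents who completed permission forms
agreed = 738             # parents who agreed to participation
selected = 635           # youth randomly selected for the survey/IRI cohort
pretest_iri = 520        # IRIs completed in Fall 2004
posttest_iri = 383       # IRIs completed in Spring 2005

participation_rate = agreed / permission_forms   # reported as 97%
follow_up_rate = posttest_iri / pretest_iri      # reported as 74%

print(round(participation_rate * 100), round(follow_up_rate * 100))
```

Both rates round to the percentages the report gives (97% and 74%).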

Reference:
Johns, J. L. (1997). Basic Reading Inventory: Pre-Primer through Grade Twelve & Early Literacy Assessments (7th ed.). Dubuque, IA: Kendall/Hunt Publishing Co.

Data Collection Methods

Interviews/Focus Groups: Interviews with CORAL staff and stakeholders asked about staffing structures and training, participant recruitment and targeting strategies, lesson and activity planning, obstacles to implementation of the balanced literacy model, relationships with schools and other partners, and plans or goals for the future.

In focus groups, parents were asked three primary questions: Why did they choose to send their children to CORAL? How, if at all, had their children benefited by participating in CORAL? What was the quality of their interactions with CORAL staff?

Observation: Structured observations assessed the extent to which CORAL programming successfully

  • Incorporated the six balanced literacy strategies
  • Provided high-quality instruction (e.g., clearly presented and organized), group management, and connection-building strategies (e.g., related texts to youth’s experiences)
  • Offered examples of positive adult support (e.g., acted in responsive ways)

To measure implementation quality and link it to reading level changes, a Literacy Profile was developed that assigned a composite score of 1–5 to literacy activities experienced by each of the 56 groups of youth observed. The Profile considers the quality of the primary balanced literacy strategies and whether each strategy was observed during at least half of the observations. Profile 1 indicates a group that implemented read-alouds and independent reading during fewer than half of the observations or did so at a low quality. At the other end of the spectrum, Profile 5 indicates a group that implemented all six balanced literacy strategies at a rating of 4 during at least half of the observations.
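The profile assignment described above can be sketched in code. The report spells out the rules for Profiles 1, 2, 3, and 5 only; the Profile 4 rule and the exact rating thresholds below are illustrative assumptions, not the evaluators' actual scoring procedure:

```python
# Hypothetical sketch of the Literacy Profile assignment. Profiles 1, 2, 3,
# and 5 follow the descriptions in the text; the Profile 4 rule is assumed.

CORE = {"read_aloud", "independent_reading"}
ALL_STRATEGIES = CORE | {"book_discussion", "writing", "vocabulary", "fun_activities"}

def assign_profile(observations):
    """observations: list of dicts mapping strategy name -> quality rating (1-5).
    A strategy absent from a dict was not observed during that visit."""
    n = len(observations)

    def consistent(strategy, min_rating):
        # A strategy "counts" if rated at or above min_rating
        # during at least half of the observations.
        hits = sum(1 for obs in observations if obs.get(strategy, 0) >= min_rating)
        return hits * 2 >= n

    if not all(consistent(s, 3) for s in CORE):
        return 1  # core strategies missing or low quality (Profile 1)
    extras = [s for s in ALL_STRATEGIES - CORE if consistent(s, 3)]
    if all(consistent(s, 4) for s in ALL_STRATEGIES):
        return 5  # all six strategies at a rating of 4 (Profile 5)
    if len(extras) >= 3:
        return 4  # assumed rule: core plus three or more other strategies
    if len(extras) >= 1:
        return 3  # core plus one or two other strategies (Profile 3)
    return 2      # core strategies only (Profile 2)
```

For example, a group observed three times with solid read-alouds and independent reading but nothing else would fall into Profile 2 under this sketch.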

Secondary Source/Data Review: Demographic records from each school district and Management Information System (MIS) data on enrollment and attendance were gathered from each site.

Surveys/Questionnaires: Youth surveys included questions about program experiences and developmental outcomes, attitudes toward reading and school, and effort and interest in school.

The staff survey included questions about staff’s educational background, experience, training, time with CORAL, and responsibilities within CORAL.

The teacher assessment asked teachers to assess CORAL participants’ school-day behavior, including aggressiveness, positive social relationships, and cognitive concentration.

Test/Assessments: The CST English Language Arts test assesses youth’s progress on state standards for English language arts.

The Jerry L. Johns Basic Reading Inventory assigns a grade-level reading designation based on youth’s reading a series of graded word lists and paragraphs and their responses to comprehension questions after each paragraph.

Data Collection Timeframe Data were collected from October 2004 to June 2005.

 

Findings

Formative/Process Findings

Activity Implementation

CORAL programs tended to be open 4 days (in one city) or 5 days (in four cities) per week, for about 3 hours per day. Youth were expected to attend every day the program was open.

Participants were generally divided by grade into groups of 12–20 for all activities; these groups remained with the same team leader over the course of the program. Except for one city, where certified teachers led literacy activities, team leaders taught literacy-related lessons.

Youth who attended CORAL participated in balanced literacy activities, homework help, and cultural/academic enrichment activities. Balanced literacy activities were offered about 5 hours per week on average, divided across 3 or 4 days. Homework help varied from 5 minutes a day in a few sites to up to 60 minutes per day. Cultural/academic enrichment activities, such as art, science, dance, and cooking, generally lasted 60–90 minutes, occurred 2–5 days a week, and rotated on a 6–8-week basis so that youth participated in a variety of enrichment activities during the year.

During read-alouds, staff sometimes introduced stories by asking youth questions related to the topic or having them predict what would happen based on the pictures. Staff read a variety of books of varying lengths on diverse topics.

Read-alouds were sometimes followed by book discussions, which took a variety of formats. Staff sometimes asked youth to reflect on what they had just read or asked targeted questions about the books.

Writing exercises sometimes followed read-alouds or book discussions and were sometimes extensions of book discussions.

Throughout balanced literacy activities, staff sometimes incorporated vocabulary exercises in order to introduce or review words. Many staff devoted a space on the wall to vocabulary and added new words every day. Staff often introduced vocabulary as part of the read-aloud, either reviewing words before reading or pausing while reading to define a new word. Sometimes staff devoted more time to vocabulary—for example, having youth record words and their definitions in journals, or act out the meanings of new words.

Independent reading exercises usually occurred at the beginning or end of literacy activities and lasted an average of 15 minutes (ranging from less than 5 minutes to more than 30 minutes). In most cases, youth chose from books at individualized reading levels, mostly fiction but also some nonfiction and comic books.

Some balanced literacy lessons included skill development activities, which were opportunities for youth to practice particular literacy skills, such as learning to write in complete sentences and practicing specific letter sounds. These activities occurred less frequently than other balanced literacy activities.

Because parents and school staff in some cities saw homework help as a key benefit of CORAL, these sites devoted a large amount of non-literacy program time to homework. In most cases, though, cities devoted much of this time to enrichment.

Cities had varying philosophies on the relationship between balanced literacy and enrichment activities. Some cities intentionally provided enrichment with a strong literacy focus. In other cities, the emphasis was on connecting topics covered in balanced literacy lessons to the topics of enrichment activities.

Independent reading took some time to “get off the ground,” with an average quality rating of 2.97 in the fall and 3.14 in the spring, as sites faced challenges such as an insufficient number of books to match all youth’s reading levels, unclear procedures for sharing books across classrooms, too little time scheduled for the activity, and staff taking a passive role and failing to ensure that youth were engaged. Many of these issues, particularly those related to materials and scheduling, had improved by the time of later observations.

The following quality ratings improved from Fall 2004 to Spring 2005: read-alouds (from 3.35 to 3.58), book discussions (from 2.82 to 3.14), and writing (from 3.02 to 3.12). Ratings decreased for skill development (from 3.20 to 2.71) and vocabulary (from 2.61 to 2.03).

Read-alouds and independent reading were observed more often than other strategies: read-alouds occurred during 80% of observations and independent reading during 88%, compared with writing during 62% of observations and book discussions during 57%.

Staff felt that using a literacy model (either Kidzlit or YET) contributed to effective implementation of balanced literacy programming; models brought consistent, articulated goals and strategies and allowed training to be focused and concrete.

During the first year of balanced literacy programming, 59% of the groups (n = 33) were assigned to Literacy Profile 1, indicating that they failed to implement moderate-quality read-alouds and independent reading during at least half of the observations. Of the remaining groups, almost all (36% of the total, n = 20) fell into Profile 3, indicating that they implemented moderate-quality read-alouds, independent reading, and one or two other strategies at least half of the time. The remaining two fell into Profile 2, indicating that they implemented read-alouds and independent reading at a quality rating of at least 3 during at least half of the observations.

Recruitment/ Participation

Some parents noted in focus groups that they chose CORAL specifically because of the literacy focus, while others identified enrichment or homework help as a key attraction.

Total CORAL enrollment for the year was 5,321, ranging from 585 to 2,081 across cities.

Most participants were in elementary school, mainly first through fifth graders (81%), with a smaller proportion in middle school.

The racial/ethnicity breakdown of participants was 68% Hispanic, 14% African American, 10% Asian, 4% multiracial, 3% White, and 1% “other.” More than half of participants (53%) were English language learners and 89% received free or reduced-price lunch.

CST scores for Spring 2004 indicated that only 16% of CORAL children met the grade-level standards for proficiency on the English Language Arts portion of the test.

Of the CORAL third and fourth graders administered the IRI, 70% read below grade level in Fall 2004 (with 50% of these two or more grade levels below).

Teacher assessments indicated that, in their daytime classrooms, 94% of third and fourth graders in the CORAL research sites were never or rarely overly aggressive, 99% sometimes to almost always engaged in positive social relationships, and 90% sometimes to almost always displayed cognitive concentration.

Surveyed youth reported having positive adults in their lives who cared about them and were available for support (98%) and liking school (90%). They reported that they enjoyed reading (mean = 4.4 on a scale of 1–5) but were less comfortable with their ability to do well in reading; the mean for reading efficacy was 3.7 (on a scale of 1–5).

For the 2004–05 school year, the 1,120 CORAL third and fourth graders at research sites attended on average just under 3 days per week for just under 3 hours per day. The average number of days attended was 81, with 51% attending more than 75 days. The average attendance rate (days attended out of available days the program was open) was 62%.

Retention rates showed that 79% of youth remained in the program as of June 30, 2005.

More than half (53%) of the third and fourth graders who were administered the IRI in the fall and again in the spring attended literacy programming 3 or more days a week between IRI assessments; 36% attended 2 to fewer than 3 days per week during that period; 10% attended 1 to fewer than 2 days per week; and only 1% attended less than 1 day per week. These youth attended an average of 2.31 days per week of literacy activities.

Third and fourth graders attended literacy and homework help at almost equal rates, at an average of about 63 days over the school year, or roughly three quarters of the time they participated in CORAL. They attended enrichment programming slightly more than half of the days (45 days) they participated in CORAL. In addition, they attended physical education an average of 26 days, other academic activities an average of 10 days, and field trips an average of 5 days.

Staffing/Training

Average staff age was 27 years old, and three quarters of staff were female.

Forty-six percent of staff were Hispanic, 17% were African American, 16% were Asian or Pacific Islander, 13% were White, 6% were mixed race/ethnicity, and 2% were “other.”

Of direct service staff, 49% had a high school diploma but had not yet completed college; many were studying to be teachers.

Across cities, 65% of direct service staff reported speaking a language other than English sometimes or always with CORAL youth and their parents. If a staff member did not speak the same language as a parent, he or she often asked a colleague to act as a translator.

Site staff had worked an average of 17 months in the program, and 22% reported that they had worked in CORAL for longer than 2 years.

In all cities, enrichment was provided by a variety of staff: school-day teachers, CBO staff, community members, and team leaders.

Adult support was the strongest aspect of activities across all observations, with an average quality rating of 3.53 in the fall and 3.76 in the spring. In general, staff were observed to be skilled at responding effectively to youth and interacting with them in a warm and engaging manner. Many positive instances were observed of staff providing extra help for struggling youth, taking an interest in youth’s interests and thoughts, smiling and laughing with youth, and being responsive to youth’s individual needs.

Instructional and group management strategies were implemented at a moderate level (on average, instructional quality ratings were 3.03 in the fall and 3.08 in the spring; group management ratings were 3.46 in the fall and 3.54 in the spring), with large variations across groups. In the highest-rated groups, staff provided clear instructions, taught organized lessons, employed strategies to motivate and challenge youth, and had activities ready when youth finished a lesson. In other cases, staff did not use strategies effectively. For example, a few instructors struggled to manage youth behavior. In these cases, staff spent their time attempting to focus and quiet youth and prevent outbursts, sometimes to such an extent that they were unable to implement any balanced literacy strategies.

Successful connection-making strategies were observed least frequently of the quality areas observed, with an average rating of 2.61 in the fall and 2.71 in the spring. In the stronger cases, staff devoted more time to this element, such as having youth role-play scenes from read-aloud texts. Observations with low ratings generally fell into two categories: either the connection-making activity was so brief that it was unlikely to have much impact on youth, or it was well-intentioned but not strongly implemented.

A major adjustment needed in order to implement balanced literacy was hiring a literacy director (an expert with a solid literacy background, skills to train and support site staff in literacy programming, and ongoing time and authority to monitor quality). Two cities filled this position early in implementation, while the other three did not do so until later. Cities that hired early benefited from having a literacy director on board before literacy programming began. In those early months, the literacy director was able to develop and implement a balanced literacy training curriculum, oversee lesson-planning development, and provide early monitoring and feedback. The literacy director’s role in monitoring and quality control was a key factor contributing to higher-quality implementation in the early hiring cities. This monitoring tended to elevate program quality when the literacy director made frequent site visits to observe programming.

Staff survey results indicated that 83% of direct staff attended training in the balanced literacy model. Scheduling training was a challenge, because staff who provided literacy activities were employed part-time and had availability constraints. Moreover, in some cities, programming occurred 5 days per week, and any weekday afternoon time taken for training would reduce youth programming. Cities went to great lengths to work around these constraints, including holding training on evenings and weekends.


Summative/Outcome Findings

Academic

Third and fourth graders showed significant (p < .0001), but modest, gains between the fall IRI and the spring IRI. Overall, youth increased about one third of a grade level (0.31) in reading.

Youth who were two or more levels below their grade level at the fall IRI assessment (n = 189) improved approximately three quarters (0.78) of a grade level. Youth who were one level below their grade level (n = 81) showed an average gain of a little over one quarter of a grade level (0.27). Youth at or above their grade level in reading (n = 113) showed an average loss in reading levels over the time period (-0.43), but the average decline did not result in their falling below their grade levels.

Youth in Literacy Profile 3 groups showed greater reading gains (average gain = 0.45 grade levels) than youth in Literacy Profile 1 or 2 groups (average gains = 0.26 and 0.28 grade levels, respectively).

Youth who felt a strong sense of reading efficacy, as measured on the youth survey, showed significantly stronger reading gains than those who felt a lower sense of reading efficacy (p < .0001). No other subgroup differences in reading gains were found (i.e., gender, grade, race/ethnicity, English language proficiency, frequency of attendance, balanced literacy participation).

Of youth two or more grade levels behind, those in Literacy Profile 3 groups had the greatest average reading gains: 1.00 grade levels, compared to 0.49 in Profile 2 groups and 0.73 in Profile 1 groups. Youth who were one level below grade level showed the greatest average reading gains in Profile 2 groups (1.00 grade levels) and the smallest in Profile 1 groups (0.13 grade levels); youth in Profile 3 groups showed average gains in between (0.47 grade levels). Youth who started at or above grade level tended to drop over time, but those exposed to higher-quality literacy instruction dropped less: Profile 3 youth at or above grade level dropped only 0.17 grade levels on average, compared to 0.62 for Profile 2 and 0.58 for Profile 1.

Results suggest that how well instructors implemented independent reading strategies was an important predictor of youth’s reading-level gains. However, the strength and consistency of general instructional practices, group management practices, adult support for youth, and connections made for youth between their experiences and the text were not predictors of greater reading gains.

Several parents noted in focus groups that they were encouraged by improvements they saw in their children’s reading since enrolling. In addition, several noted changes in their children’s attitudes toward school and reading. Parents said their children were more structured and responsible in their study habits, developed an interest and joy in learning, and began to trust themselves and their ability to do schoolwork.

Youth Development

Some parents noted in focus groups that their children were more passionate about particular subjects—such as singing or drawing, which they first learned in CORAL—than they were before they participated in CORAL.

Parents in focus groups mentioned seeing improvements in their children’s social skills and believed that those changes helped their children become better students.

Parents in focus groups expressed their appreciation for the physical aspect of some enrichment activities, noting that it improved their children’s overall health.

Parents in focus groups spoke often about the positive effects of enrichment programming and expressed appreciation that CORAL exposed their children to activities that the parents could not otherwise have provided and opened up the world to their children.

 


Evaluation 2: Advancing Achievement: Findings from an Independent Evaluation of a Major After-School Initiative



Evaluation Description

Evaluation Purpose To understand changes in the quality of the literacy programming over CORAL’s last two years (2004–05 and 2005–06); the extent of children’s participation and engagement in CORAL; the relationship of quality, participation, and engagement to positive changes in children’s reading performance and attitudes; and program costs.
Evaluation Design

Quasi-Experimental and Non-Experimental: Across all 37 sites, surveys were completed in Spring 2005 by 412 CORAL staff members (73% response rate out of 564 staff members asked to complete surveys) including team leaders, site coordinators, volunteers, paraprofessionals, educational liaisons, city directors, literacy directors, and enrichment providers. In addition, enrollment and attendance data were collected from all sites, as well as spring California Standards Test (CST) English Language Arts test scores for both years on all CORAL participants.

The remainder of the data collection focused on 4 or 5 research sites in each city. There was a total of 23 research sites in Year 1, and 21 in Year 2. Sites were selected if they served elementary school youth and had been in operation for at least 1 year. Within these research sites, the evaluation focused on third and fourth graders in Year 1, who became fourth and fifth graders in Year 2.

At research sites, evaluators administered a youth survey to a sample of 762 CORAL third and fourth graders whose parents had signed permission forms for their children to take part in the evaluation. Of those, 738 parents (97%) ultimately agreed to allow their child to participate in the evaluation. Of those children with permission, 635 were randomly selected to be included in the youth survey and reading assessment cohort (approximately 125 per city). The final cohort of children surveyed in Year 1 included 515 youth: 280 third graders and 235 fourth graders. The survey was administered to the same group of children in Spring 2006, and children were contacted and asked to complete the survey whether or not they remained in the CORAL program, with the following exception: Of the 515 children in the initial survey cohort, 67 were not followed because three sites no longer ran CORAL programs in Year 2 of the evaluation, bringing the total that evaluators attempted to follow to 448. Of those 448 children, 379 (85%) completed follow-up surveys in Spring 2006. Reliable enrollment and attendance information was also available for a smaller sample of children who had both Year 1 and Year 2 surveys (the number ranges from just over 300 to just over 350, depending on the analysis). This group is the focus of the analysis linking engagement and participation to outcomes.

In Fall 2004 and Spring 2005, evaluators administered an informal reading inventory (IRI) from the Jerry L. Johns Basic Reading Inventory (Johns, 1997) to the sample of third and fourth graders at CORAL research sites who had also completed youth surveys (plus 5 children who had not completed the survey). In Fall 2004, IRIs were administered to 520 youth: 281 third graders and 239 fourth graders. In Spring 2005, IRIs were administered to 383 youth still attending CORAL, for a follow-up rate of 74%. These youth are the subjects for the analyses linking quality to reading gains in Year 1. In Fall 2005, evaluators reassessed 353 of the 448 children (79%) who initially completed IRIs in Fall 2004 and whose sites were still providing CORAL programming; 180 fourth and fifth graders were also added to the IRI sample in Fall 2005, for a total of 533 IRI assessments (children were added so that a larger sample would be available for linking quality observations to IRI gains). In Spring 2006, evaluators followed up with 379 of the initial cohort of 448 (85%) and with 142 (79%) of the added fourth and fifth graders, for a total of 521 assessments. From Fall 2005 to Spring 2006 (Year 2), 319 of the 448 children who completed IRIs and surveys had assessments at both points. For the analyses linking quality to reading gains in Year 2, the sample consists of 368 youth who had IRI assessments in Fall 2005 and Spring 2006 and for whom observation data had been collected on the group they were part of that year.

Evaluators undertook systematic observations of CORAL activities in Years 1 and 2. In Year 1, evaluators observed 56 groups of children in balanced literacy activities across the state (23 sites). Each group was observed between two and four times over the course of the year, with most observed three times (48 of 56, or 86%). In Year 2, evaluators observed 43 groups of children in balanced literacy activities across the state (21 sites). Each group was observed at least three times.

In Spring 2006, parents of children who had completed IRIs at any time during the evaluation period (with the exception of the children from the sites that were no longer part of CORAL in the 2005–06 school year) were contacted and asked to complete a survey, even if their children no longer participated in CORAL. Surveys were received from 501 of 610 parents (82%).

Evaluators conducted intensive research visits in Spring 2005 and again in Spring 2006. During these visits, researchers interviewed various CORAL staff members and stakeholders, including team leaders, city-level directors, board members, and principals of the school sites. During the Spring 2005 visits, evaluators also conducted 19 focus groups with a sample of CORAL parents; approximately 170 Spanish-, Hmong-, and English-speaking parents took part in these groups.

Reference:
Johns, J. L. (1997). Basic Reading Inventory: Pre-Primer through Grade Twelve & Early Literacy Assessments (7th ed.). Dubuque, IA: Kendall/Hunt Publishing Co.

Data Collection Methods

Interviews/Focus Groups: Staff interviews included questions on staffing structures, staff training, participant recruitment and targeting strategies, lesson and activity planning, goals for enrichment activities, obstacles to implementation of the balanced literacy model, relationships with schools and other partners, and plans or goals for the future.

Parents were asked three primary questions in focus groups: Why did you choose to send your children to CORAL? How, if at all, have your children benefited by participating in CORAL? What is the quality of your interactions with CORAL staff?

Observation: An observational tool was developed to measure ten dimensions of quality, falling within the two broad areas of general classroom practices (adult support, instructional strategies, group management, connection making) and balanced literacy strategies (read-alouds, book discussions, writing, independent reading, skill development activities, and vocabulary). Each dimension was scored on a scale of 1 to 5, where 1 indicated extremely negative behaviors and little to no positive strategies for working with youth, and 5 represented outstanding examples of those dimensions in the field, characterized by the consistent use of strong strategies.

Secondary Source/Data Review: The enrollment information from the MIS included background information on participants and daily attendance data, which indicated whether each youth was present or absent, the activities in which he or she participated (coded into literacy, homework help, academic support, enrichment, recreation, or other categories), and the times that youth arrived at and left the program on each day of attendance.

Surveys/Questionnaires: The youth survey questions covered participants’ sense of safety, social support from adults and peers, and interesting activities. Additionally, several questions addressed youth’s attitudes toward reading and school, including their enjoyment of reading, sense of efficacy as readers, and effort and interest in school.

The parent survey consisted of questions asking parents why they enrolled their children in CORAL, what their children did before enrollment in CORAL, the children’s experiences at CORAL, if their children had left CORAL and if so why they had left, and demographic characteristics of the family.

The staff survey contained questions about staff’s educational background, experience, training, time with CORAL, and responsibilities with CORAL.

Test/Assessments: The CST-ELA (California Standards Test in English–Language Arts) measures the grade-level proficiency of youth.

A number of constructs were created from series of questions asked on the youth survey; these constructs were adapted from previous studies of youth-serving organizations. For example, adult support, sense of belonging, and interest in activities were adapted from Gambone and Arbreton (1997). Scales for reading efficacy and liking reading were created by the Developmental Studies Center (2003) and used in evaluations of AfterSchool KidzLit. The constructs include:

  1. CORAL Adult Support—four statements about the number of adults at CORAL who are available (e.g., “How many adults [at CORAL] pay attention to what’s going on in your life?”)
  2. CORAL Belonging—three statements that assess whether the youth feels that he or she belongs and is engaged (e.g., “I feel like I belong here [at CORAL].”)
  3. CORAL Interesting Activities—four statements about whether the youth perceives the activities as interesting and new (e.g., “I get a chance to do a lot of new things.”)
  4. CORAL Positive Peers—three statements about whether the youth has friends at CORAL (e.g., “I get to know other kids really well here [at CORAL].”)
  5. School Liking—three statements that focus on how youth feel about school (e.g., “In general, I like school a lot.”)
  6. Reading Efficacy—five statements about how comfortable youth are with their ability to read (e.g., “I’m very good at reading.” and “Reading is easy for me.”)
  7. Reading Liking—six statements that examine the extent to which the youth enjoys reading (e.g., “Reading is one of my favorite things to do.”)

References:
Developmental Studies Center. (2003). AfterSchool KidzLit outcome study. Oakland, CA: Author.

Gambone, M. A., & Arbreton, A. J. A. (1997). Safe havens: The contributions of youth organizations to healthy adolescent development. Philadelphia: Public/Private Ventures.

Data Collection Timeframe Data were collected between Fall 2004 and Summer 2006.

 

Findings:
Formative/Process Findings

Activity Implementation

Adding literacy activities to programming was a challenge for many CORAL staff, most of whom did not have experience with literacy instruction. As a result, the exact offerings and types of activities varied from classroom to classroom, as team leaders implemented the components with which they were most comfortable.

Enrichment activities were provided by outside specialists and CORAL team leaders. The outside providers who worked with CORAL generally taught in 6–8-week cycles, switching to a different CORAL site after each cycle. The outside providers offered a breadth of activities ranging from non-literacy academic enrichment, such as Young Engineers (hands-on science activities), to flamenco dancing, yoga, and lacrosse.

Each city provided activities that offered the opportunity to integrate literacy into the program, advancing the “culture of readers” and building skills by including short stories, writing opportunities, and other literacy activities in the enrichment offerings. One team leader, for example, designed an obstacle course where some of the stations were physical obstacles and some involved solving vocabulary clues.

Staff came to think of literacy learning as something that could actively involve and interest children, and they developed specific strategies for creating fun within literacy activities. One of their first steps was to choose activities and strategies that they, as instructors, found interesting. They also learned to choose books that the children would like (scary stories were particularly successful) and to use strategies such as using funny voices to read a story. A key strategy used to make reading fun and interesting was to draw connections between the books and the children’s lives. Team leaders did this by asking questions as they read a book aloud, or during later discussions about the book, that encouraged the children to think about their own families and experiences.

By Year 2, instructors offered the four primary literacy strategies (read-alouds, book discussions, writing, and independent reading) during almost all of the lessons observed. For example, read-alouds were observed 77% of the time in Year 1 and 99% of the time in Year 2. At the same time, the quality of delivery of each strategy also improved. Read-alouds, for example, were rated 2.84 out of 5 on average in Year 1, and 3.59 out of 5 in Year 2.

One of the four classroom practices—instructional quality—was rated significantly (p < .05) lower in Year 2 than in Year 1. The observers rated similar levels of clarity and organization in both years, but tended to give slightly lower ratings in Year 2 on how successfully the instructors challenged and motivated youth. At the same time, instructors significantly increased their skill at creating connections between youth and the books they were reading during Year 2 (p < .001).

Overall quality of literacy activity implementation improved between Years 1 and 2. The percentage of groups that demonstrated low-quality or no read-alouds more than half of the times they were observed decreased from 59% in Year 1 to 12% in Year 2. Similarly, the percentage of groups that demonstrated at least moderate-quality literacy activities over half of the times they were observed increased from 36% in Year 1 to 88% in Year 2.

Interviews with program staff and partner agencies revealed several specific strategies that helped explain how CORAL staff were able to improve the quality of their literacy programming in Year 2. These strategies included having an effective literacy director; developing an integrated approach for providing monitoring, coaching, and training; and investing resources in strengthening the independent reading component of the balanced literacy lessons.

Costs/Revenues

In 2005–2006, funding from the James Irvine Foundation accounted for approximately 80% of CORAL’s support across the four cities. Though one city raised a small amount in private donations, almost all other funding came from public sources, generally through grants awarded to school districts and then distributed to CORAL for programming in the district. Two cities received funding through 21st Century Community Learning Centers (21st CCLC) grants; both of these cities also received funding from state sources. A third city required two of its schools that joined the initiative after startup to contribute $50,000 each toward the costs of running CORAL.

In-kind donations represented an important CORAL resource. All of the cities were successful in obtaining some relatively small in-kind donations, such as tickets to amusement parks or books to be distributed as gifts, that allowed them to offer “extras” at their programs that would not have fit in their budgets. Two cities obtained larger-scale donations, such as regular donations of snacks, in one case worth $200,000. One city partnered with the YMCA and AmeriCorps to provide site staff, enabling some team leader positions to be funded through these partnerships.

By far the most significant in-kind donation for all of the cities was the use of school facilities. In 2005–2006, all four CORAL cities operated their programs solely in schools, thus freeing them from buying or renting program space. All of the schools allowed CORAL to use classrooms and cafeterias free of charge; most also provided janitorial services to clean the spaces at the end of each afternoon, and some schools even provided transportation for CORAL participants at the end of the afternoon.

The CORAL cities each spent between $1,447,613 and $1,776,890 during the 2005–06 program year, an average of $1,612,055 per city. Including in-kind donations, average spending increased to $1,708,932 per city, ranging from $1,456,045 to $1,808,593. In 2005–2006, CORAL programs spent an average of $20 per child per day of attendance, or about $6.25 per hour, with the per-child daily cost ranging from a low of $11.39 to a high of $33.10 across cities.
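Taken together, the per-day and per-hour figures imply a program day of a little over three hours. A minimal sketch of that arithmetic, assuming both averages describe the same attended time (the report does not state program hours explicitly):

```python
# Illustrative arithmetic only; dollar figures are taken from the report above.
avg_cost_per_child_day = 20.00    # average spending per child per day attended
avg_cost_per_child_hour = 6.25    # average spending per child per hour

# If both averages cover the same attended time, the implied length of the
# program day is the per-day cost divided by the per-hour cost.
implied_hours_per_day = avg_cost_per_child_day / avg_cost_per_child_hour
print(implied_hours_per_day)  # → 3.2
```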

The majority of sites’ costs covered staff salaries and administrative expenses. More of CORAL cities’ budgets were devoted to city-level costs, such as executive and administrative salaries, than is common in other afterschool programs, largely because of CORAL’s investment in staff positions, such as the literacy director, and in other costs directly related to improving quality at each site. For example, employing literacy directors cost sites approximately $42,300 per year, and investments in staff time and materials for training and monitoring cost sites between $20,000 and $93,000 per year.

Sites spent an average of $41,680 on books for independent reading in Year 1, and supplemented this with an average of $6,856 on new books in Year 2 (plus $5,858 in donated books).

Developing a computerized management information system to track program data and training staff on using this system cost between $20,000 and $25,000 per city.

Approximately $140,000 supported technical assistance for CORAL programs each year (across all sites).

Approximately $60,000 was invested in researchers’ time each year to conduct evaluation and to provide continuous feedback to programs as they worked to improve quality.

Program Context/Infrastructure The majority of children indicated that they felt safe at CORAL (90%), that their relationships with their peers at CORAL were positive (80%), and that they felt a sense of belonging to CORAL (71%).
Program/School Linkages From the beginning of the initiative, CORAL staff in all of the cities worked closely with school guidance counselors, teachers, and principals to identify students who could most benefit from the program, including children who were struggling academically or socially, or who were English language learners. With the new focus on literacy in 2004, two of the CORAL cities began to work with the schools to intentionally target children who were far behind in reading based on their standardized test scores.
Recruitment/Participation

Most parents reported signing their children up for CORAL because they saw the program as an opportunity for their children to do better in school and get homework help. Approximately half of the parents also said they looked to CORAL to help improve their children’s English language skills and expose their children to books. In many cases, parents felt that, on their own, they were not able to provide the kind of help their children needed to succeed academically: more than half of parents (55%) had less than a high school education, and for a little more than two thirds of the sample, the primary language spoken at home was a language other than English. In addition, more than half (52%) of the parents enrolled their children because of the opportunities for enrichment activities such as art, music, and recreation.

During the 2005–06 school year, the CORAL programs were open an average of 151 days. Out of these total possible days, the children in the research sample attended CORAL an average of 110.3 days during the year (an overall average attendance rate of 73%), or an average of 3.0 days per week.

Children attended literacy instruction the most frequently of any activity (an average of 85.7 days) during the 2005–06 school year. On average, they participated in homework help nearly as often (81.4 days), but in three of the cities, they participated in this activity even more frequently than in the literacy programming. Children also regularly attended nonacademic enrichment activities (e.g., arts, dance, music, drama) and physical education, but did so with greater variation across the state, reflecting cities’ varying schedules in these areas.

Of the children in the sample who attended CORAL in 2004–2005, 69% of those eligible continued to attend in 2005–2006. In addition, 63% attended all four semesters of the period during which data were being collected, while only 2% of the children attended CORAL for just one semester. These retention rates were similar for all youth in the sample, regardless of their demographic characteristics (gender or ethnicity), their Fall 2004 grade-level reading performance, their English language learner status, or their Fall 2004 ratings of their attitudes toward school and reading.

Parent surveys suggested some reasons why children who left CORAL did so. Parents’ most frequent response (25%) for why their children no longer attended CORAL was that their children were no longer interested in the program. Other reasons were that the family moved (21%), the children had too much homework (16%), or the children changed schools (9%).

Satisfaction Children reported that they liked the CORAL literacy activities (72%), although higher proportions of children reported liking the other activities offered: the highest percentages of children reported enjoying the sports (94%) and arts (92%) activities. When asked about CORAL overall, 73% agreed that all of the program’s activities were interesting and challenging.
Staffing/Training

More than three quarters (82%) of the team leaders were 25 years old or younger. Most (84%) reported prior experience working with children, but fewer than half (43%) had previously provided literacy instruction. Many were college students. For example, of the 67 team leaders observed in 2005–2006, just under half had completed some college courses but did not have a degree. An additional 3% had earned a bachelor’s degree or higher; 26% had earned an associate’s degree; and 21% had completed high school but had not taken any college courses.

The largest proportion of team leaders (53%) identified themselves as Latino, similar to the proportion (68%) of children in CORAL. Moreover, to support children’s learning, more than two thirds of the team leaders reported that they used a language other than English sometimes (49%) or always (19%) when they were with the CORAL children.

Staff noted that creating strong relationships was important in order to ensure that the children knew they were cared for, to help build their confidence, and to make sure they felt safe at CORAL. Good relationships were also important to the success of some activities. Team leaders explained that when they had a relationship with the children, they were able to choose activities that would be interesting to them; they could connect lessons and discussions to issues in the children’s lives; and they were faced with fewer behavioral problems and could handle them more quickly when they occurred.

Though staff at all CORAL sites indicated that adult–youth relationships were important to them, they varied in the extent to which they integrated specific strategies for relationship building into their programs. A strategy common to all sites was to group one team leader with the same children for the entire afternoon, every day, as often as possible for the whole year.

Many staff indicated that their primary approach to working with English language learners was to help them feel comfortable and supported at CORAL, so that they would feel confident enough to participate, voice their opinions, and build their language skills. Staff also worked to help English language learners feel that their voices were valued by allowing them to speak in their native languages and encouraging bilingual children to translate for their peers. However, strategies aimed specifically at English language learners tended to be inconsistently implemented, and many team leaders indicated that they could use more ideas and training in how to work with these youth.

Almost all children (97%) reported that there was at least one supportive adult at CORAL (someone whom they felt they could talk to or go to if they needed help), and 73% indicated that there were two or more such supportive adults. When asked specifically about the staff who led the literacy activities, more than 85% of children agreed that the staff paid attention to and cared about them.


Summative/Outcome Findings

Academic

Small but significant gains, based on the individual reading assessments, were found in children’s grade-level reading scores over the course of each school year. Children gained an average of 0.31 grade levels in reading from Fall 2004 to Spring 2005 and an average of 0.44 grade levels from Fall 2005 to Spring 2006 (p < .0001). Evaluators found greater reading-level gains (p < .0001) for children who started further behind and for those designated English language learners.

The proportion of children deemed to have a positive outcome on their CST-ELA (defined as moving up a level from “far below” or “below” basic grade level, or staying at or above basic grade level) was 68% from Spring 2004 to Spring 2005 and 72% from Spring 2005 to Spring 2006.

In Year 1, children in groups that received more consistent and higher-quality implementation of the literacy strategies exhibited greater reading-level gains over the school year than children in groups where the literacy strategies were not consistently implemented. These findings held regardless of English language learner status or youth’s initial reading level. Average reading gains were 0.26 grade levels for children in the lowest-quality groups, but were 0.45 for children in groups where the literacy activities were implemented at a moderate level of quality. Additional Year 1 analyses found no relationship between the observed quality of the general classroom practices (adult support, instructional quality, group management, and connection making) and children’s reading-level gains. In Year 2, almost all groups (88%) displayed consistent and higher-quality implementation of the literacy strategies, making comparisons inappropriate because of the very small number of children in the lower-quality groups. The average reading gain of 0.44 grade levels over the second school year was comparable to the Year 1 average gain of 0.45 for children exposed to higher-quality literacy activities.

While 67% of youth in groups that scored below the median in overall program quality experienced a positive outcome on the CST-ELA, 73% of youth in groups scoring above the median in overall program quality experienced a positive outcome (p < .05).

Children’s sense of belonging to the program was significantly related to a positive change in 10 of the 13 attitude-related outcomes. These were paying attention and concentrating in class (p < .001), liking school (p < .001), feeling safe at school (p < .001), studying hard for a test in the last 30 days (p < .01), not getting into trouble in school in the last 30 days (p < .01), wanting to go to school in the last 30 days (p < .001), liking reading (p < .001), talking with someone about something read in the last four weeks (p < .001), amount of reading for pleasure after school (p < .05), and amount of reading books for pleasure in the last four weeks (p < .10). Duration of participation was not related to changes in reading-related attitudes and behaviors.

Neither duration of participation nor sense of belonging in CORAL was related to changes in reading performance.

Over 90% of parents reported they “agree” or “strongly agree” when asked whether CORAL had helped their children’s academics, school-related behavior and social skills, and English language abilities. In focus groups, parents explained that their children had developed more responsible study habits, taken greater interest in schoolwork, and become more confident in their abilities at school.

 

© 2016 Presidents and Fellows of Harvard College
Published by Harvard Family Research Project