
The Harvard Family Research Project separated from the Harvard Graduate School of Education to become the Global Family Research Project as of January 1, 2017. It is no longer affiliated with Harvard University.



Program Description

Overview The Communities Organizing Resources to Advance Learning (CORAL) initiative in California worked to link communities, institutions, and residents around the common goal of improving youth academic achievement through the provision of structured literacy programming and enriching out-of-school time opportunities. Communities with low-income, low-achieving schools in five cities (Sacramento, San Jose, Fresno, Pasadena, and Long Beach) were part of the initiative. CORAL served youth in grades 1–8 (with the majority in grades 1–5).
Start Date The initiative launched in 1999, and programming started in 2001 in Long Beach and Pasadena, in 2002 in Fresno, and in 2003 in Sacramento and San Jose. The initiative ended in 2006.
Scope state
Type afterschool
Location urban
Setting community-based organizations and public schools
Participants elementary through middle school students
Number of Sites/Grantees 31 school-based sites and 6 community-based sites in 2004–2005; 30 school-based sites and 3 community-based sites in 2005–2006
Number Served Approximately 5,000 per year
Components

A lead agency (a local nonprofit organization) was selected by the initiative funder to plan and run CORAL in each city. The lead agency worked with schools and community-based organizations (CBOs) to provide afterschool programming for youth. CORAL cities initially implemented a variety of programming. The common components across sites were homework help and enrichment programming, which included activities such as art or dance, usually without an academic focus.

In the 2004–05 school year, CORAL began to provide more consistent academic programming as part of a new “balanced literacy” approach, focused on improving participants’ reading skills. CORAL provided its cities with information on two balanced literacy program models, Youth Education for Tomorrow (YET) and Kidzlit, but cities were free to implement other programs of their choice.

Cities were required to include balanced literacy programming 3–4 days per week for at least 75 minutes per day. Each lesson had to include an opportunity for staff to read aloud to youth and independent reading time with access to books organized by specific reading levels. The program also had to include other key balanced literacy strategies: book discussion, writing, vocabulary building, and “fun” activities to encourage literacy skills development. CORAL staff worked closely with school staff to select youth who could benefit most from the program, including those who were struggling academically or socially and/or who were English language learners.

Funding Level Each city received $2 million per year through 2003 from the James Irvine Foundation (Irvine). In 2004, Irvine reduced the annual funding to $1.6 million per year per city. In total, Irvine committed more than $58 million over 8 years (1999–2006).
Funding Sources James Irvine Foundation (all CORAL cities); U.S. Department of Education 21st Century Community Learning Centers (21st CCLC) program (four cities); and California Temporary Assistance for Needy Families (TANF) program (one city).

Evaluation

Overview The evaluation examined sites' implementation of the literacy strategies and the program's early effectiveness in retaining children and producing literacy gains during the first year of literacy programming (the 2004–05 school year). Following this phase of the evaluation, evaluators conducted a midcourse assessment of the program, and then turned to focus on CORAL outcomes, lessons learned, and promising strategies for boosting student achievement through afterschool programming.
Evaluator Public/Private Ventures (P/PV)
Evaluations Profiled

Early Lessons from the CORAL Initiative

Findings from an Independent Evaluation of a Major After-School Initiative

Evaluations Planned None.
Report Availability

Arbreton, A. J. A., Goldsmith, J., & Sheldon, J. (2005). Launching literacy in after-school programs: Early lessons from the CORAL Initiative. Philadelphia: Public/Private Ventures. Available at: www.ppv.org/ppv/publications/assets/192_publication.pdf

Walker, G. (2007). Midcourse corrections to a major initiative: A report on the James Irvine Foundation’s CORAL experience. Philadelphia: Public/Private Ventures. Available at: www.irvine.org/assets/pdf/pubs/evaluation/Midcourse_Corrections.pdf

Arbreton, A. J. A., Sheldon, J., Bradshaw, M., & Goldsmith, J., with Jucovy, L., & Pepper, S. (2008). Advancing achievement: Findings from an independent evaluation of a major after-school initiative. Philadelphia: Public/Private Ventures. Available at: www.ppv.org/ppv/publications/assets/225_publication.pdf

Sheldon, J., Arbreton, A., Hopkins, L., & Grossman, J. B. (2010). Investing in success: Key strategies for building quality in after-school programs. American Journal of Community Psychology, 45(3–4), 394–404.

Contacts

Evaluation Amy J.A. Arbreton
Director of Research
Public/Private Ventures
Lake Merritt Plaza
1999 Harrison Street, Suite 1550
Oakland, CA 94612
Tel: 510-273-4600
Email: aarbreton@ppv.org
Program Anne Stanton
Youth Program Director
The James Irvine Foundation
575 Market Street, Suite 3400
San Francisco, CA 94105
Tel: 415-777-2244
Email: astanton@irvine.org
Profile Updated April 4, 2012


Evaluation 1: Launching Literacy in After-school Programs: Early Lessons from the CORAL Initiative



Evaluation Description

Evaluation Purpose

To address the following question: Can literacy activities be integrated into existing afterschool programs with sufficient quality to promote reading gains?

In order to answer this question, several additional issues were addressed:

  • Who participated in CORAL?
  • Was there early evidence of reading gains as a result of youth’s participation?
  • Did some youth benefit from CORAL more than others?
  • What strategies seemed to contribute most to reading gains?
  • What practices facilitated quality literacy instruction?
Evaluation Design

Quasi-Experimental and Non-Experimental: Across all 37 sites, surveys were completed in Spring 2005 by 412 CORAL staff members (a 73% response rate out of 564 staff members asked to complete surveys), including team leaders, site coordinators, volunteers, paraprofessionals, educational liaisons, city directors, literacy directors, and enrichment providers. Spring 2004 California Standards Test (CST) English Language Arts test scores were gathered from seven of the nine school districts serving CORAL participants in 33 of the 37 sites. Children’s behaviors were rated via a rating form completed by teachers for 288 of the 418 children (69%) attending the school-based CORAL sites. In addition, enrollment and attendance data were collected from 23 of the 37 sites (the other sites did not submit data in time to be included in the evaluation).

The remainder of the data collection focused on 4 or 5 research sites in each city, for a total of 23 sites. Sites were selected if they served elementary school youth and had been in operation for at least 1 year. Within these sites, the evaluation focused on third and fourth graders, because research suggests that this time period is crucial for reading skills development. Observations of balanced literacy activities were conducted at research sites with 56 groups of 12–20 third and fourth graders. Each group was observed 2–4 times during the year, with most observed 3 times (n = 48). Also at these sites, evaluators interviewed approximately 25 key informants per city (including CORAL staff such as team leaders, city directors, board members, and school staff such as principals) and conducted 19 parent focus groups with approximately 170 parents across groups.

Parents of 738 CORAL third and fourth graders at research sites agreed to allow their children to participate in the evaluation, out of 762 who completed permission forms, for a participation rate of 97%. A total of 635 of the youth with permission were randomly selected to be part of a youth survey and reading assessment cohort (approximately 125 per city), of which 515 completed a survey and 520 completed an informal reading inventory (IRI) from the Jerry L. Johns Basic Reading Inventory (Johns, 1997) in Fall 2004 (pretest). In Spring 2005 (posttest), the same IRI was administered to 383 youth still attending CORAL, for a follow-up rate of 74%.

Reference:
Johns, J. L. (1997). Basic Reading Inventory: Pre-Primer through Grade Twelve & Early Literacy Assessments (7th ed.). Dubuque, IA: Kendall/Hunt Publishing Co.

Data Collection Methods

Interviews/Focus Groups: Interviews with CORAL staff and stakeholders asked about staffing structures and training, participant recruitment and targeting strategies, lesson and activity planning, obstacles to implementation of the balanced literacy model, relationships with schools and other partners, and plans or goals for the future.

In focus groups, parents were asked three primary questions: Why did they choose to send their children to CORAL? How, if at all, had their children benefited by participating in CORAL? What was the quality of their interactions with CORAL staff?

Observation: Structured observations assessed the extent to which CORAL programming successfully

  • Incorporated the six balanced literacy strategies
  • Provided high-quality instruction (e.g., clearly presented and organized), group management, and connection-building strategies (e.g., related texts to youth’s experiences)
  • Offered examples of positive adult support (e.g., acted in responsive ways)

To measure implementation quality and link it to reading level changes, a Literacy Profile was developed that assigned a composite score of 1–5 to literacy activities experienced by each of the 56 groups of youth observed. The Profile considers the quality of the primary balanced literacy strategies and whether each strategy was observed during at least half of the observations. Profile 1 indicates a group that implemented read-alouds and independent reading during fewer than half of the observations or did so at a low quality. At the other end of the spectrum, Profile 5 indicates a group that implemented all six balanced literacy strategies at a rating of 4 during at least half of the observations.
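The profile-assignment logic described above can be sketched in code. This is an illustrative simplification, not P/PV's actual scoring procedure: the report excerpt defines only Profiles 1, 2, 3, and 5 explicitly, so the Profile 4 cutoff below (three or more additional strategies at moderate quality) is an assumption, and the strategy names are placeholders.

```python
# Illustrative sketch of the Literacy Profile rubric (assumptions noted above).
# Each observation is a dict mapping strategy -> quality rating (1-5);
# a strategy absent from the dict was not observed that day.

CORE = ("read_aloud", "independent_reading")
OTHER = ("book_discussion", "writing", "vocabulary", "skill_development")

def meets(observations, strategy, min_quality):
    """True if the strategy was implemented at min_quality or better
    during at least half of the group's observations."""
    hits = sum(
        1 for obs in observations
        if obs.get(strategy) is not None and obs[strategy] >= min_quality
    )
    return hits >= len(observations) / 2

def literacy_profile(observations):
    """Assign a composite Literacy Profile (1-5) to one group of youth."""
    # Profile 5: all six strategies at a rating of 4+ at least half the time.
    if all(meets(observations, s, 4) for s in CORE + OTHER):
        return 5
    # Profile 1: core strategies below moderate quality, or too infrequent.
    if not all(meets(observations, s, 3) for s in CORE):
        return 1
    # Count additional strategies implemented at moderate quality.
    n_other = sum(meets(observations, s, 3) for s in OTHER)
    if n_other == 0:
        return 2   # core strategies only
    if n_other <= 2:
        return 3   # core plus one or two other strategies
    return 4       # core plus three or more (cutoff assumed)
```

For example, a group observed four times with read-alouds at quality 4, independent reading at quality 3, and book discussion at quality 3 in every observation would fall into Profile 3.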

Secondary Source/Data Review: Demographic records from each school district and Management Information System (MIS) data on enrollment and attendance were gathered from each site.

Surveys/Questionnaires: Youth surveys included questions about program experiences and developmental outcomes, attitudes toward reading and school, and effort and interest in school.

The staff survey included questions about staff’s educational background, experience, training, time with CORAL, and responsibilities within CORAL.

The teacher assessment asked teachers to assess CORAL participants’ school-day behavior, including aggressiveness, positive social relationships, and cognitive concentration.

Test/Assessments: The CST English Language Arts test assesses youth’s progress on state standards for English language arts.

The Jerry L. Johns Basic Reading Inventory assigns a grade-level reading designation based on youth’s reading a series of graded word lists and paragraphs and their responses to comprehension questions after each paragraph.

Data Collection Timeframe Data were collected from October 2004 to June 2005.

 

Findings
Formative/Process Findings

Activity Implementation

CORAL programs tended to be open 4 days (in one city) or 5 days (in four cities) per week, for about 3 hours per day. Youth were expected to attend every day the program was open.

Participants were generally divided by grade into groups of 12–20 for all activities; these groups remained with the same team leader over the course of the program. Except for one city, where certified teachers led literacy activities, team leaders taught literacy-related lessons.

Youth who attended CORAL participated in balanced literacy activities, homework help, and cultural/academic enrichment activities. Balanced literacy activities were offered about 5 hours per week on average, divided across 3 or 4 days. Homework help varied from 5 minutes a day in a few sites to up to 60 minutes per day. Cultural/academic enrichment activities, such as art, science, dance, and cooking, generally lasted 60–90 minutes, occurred 2–5 days a week, and rotated on a 6–8-week basis so that youth participated in a variety of enrichment activities during the year.

During read-alouds, staff sometimes introduced stories by asking youth questions related to the topic or having them predict what would happen based on the pictures. Staff read a variety of books of varying lengths on diverse topics.

Read-alouds were sometimes followed by book discussions, which took a variety of formats. Staff sometimes asked youth to reflect on what they had just read or asked targeted questions about the books.

Writing exercises sometimes followed read-alouds or book discussions and were sometimes extensions of book discussions.

Throughout balanced literacy activities, staff sometimes incorporated vocabulary exercises in order to introduce or review words. Many staff devoted a space on the wall to vocabulary and added new words every day. Staff often introduced vocabulary as part of the read-aloud, either reviewing words before reading or pausing while reading to define a new word. Sometimes staff devoted more time to vocabulary—for example, having youth record words and their definitions in journals, or act out the meanings of new words.

Independent reading exercises usually occurred at the beginning or end of literacy activities and lasted an average of 15 minutes (ranging from less than 5 minutes to more than 30 minutes). In most cases, youth chose from books at individualized reading levels, mostly fiction but also some nonfiction and comic books.

Some balanced literacy lessons included skill development activities, which were opportunities for youth to practice particular literacy skills, such as learning to write in complete sentences and practicing specific letter sounds. These activities occurred less frequently than other balanced literacy activities.

Because parents and school staff in some cities saw homework help as a key benefit of CORAL, these sites devoted a large amount of non-literacy program time to homework. In most cases, though, cities devoted much of this time to enrichment.

Cities had varying philosophies on the relationship between balanced literacy and enrichment activities. Some cities intentionally provided enrichment with a strong literacy focus. In other cities, the emphasis was on connecting topics covered in balanced literacy lessons to the topics of enrichment activities.

Independent reading took some time to “get off the ground,” with an average quality rating of 2.97 in the fall and 3.14 in the spring, as sites faced challenges such as an insufficient number of books to meet all youth’s reading levels, unclear procedures for sharing books across classrooms, too little time scheduled for this activity, and staff taking a passive role and failing to ensure that youth were engaged. Many of these issues, particularly those related to materials and scheduling, improved by the time of later observations.

The following quality ratings improved from Fall 2004 to Spring 2005: read-alouds (from 3.35 to 3.58), book discussions (from 2.82 to 3.14), and writing (from 3.02 to 3.12). Ratings decreased for skill development (from 3.20 to 2.71) and vocabulary (from 2.61 to 2.03).

Read-alouds and independent readings were observed more often than other strategies. Read-alouds occurred during 80% of observations and independent reading during 88%, as opposed to writing during 62% of observations and book discussions during 57%.

Staff felt that using a literacy model (either Kidzlit or YET) contributed to effective implementation of balanced literacy programming; models brought consistent, articulated goals and strategies and allowed training to be focused and concrete.

During the first year of balanced literacy programming, 59% of the groups (n = 33) were assigned to Literacy Profile 1, indicating that they failed to implement moderate-quality read-alouds and independent reading during at least half of the observations. Of the remaining groups, almost all (36% of the total, n = 20) fell into Profile 3, indicating that they implemented moderate-quality read-alouds, independent reading, and one or two other strategies at least half of the time. The remaining two fell into Profile 2, indicating that they implemented read-alouds and independent reading at a quality rating of at least 3 during at least half of the observations.

Recruitment/Participation

Some parents noted in focus groups that they chose CORAL specifically because of the literacy focus, while others identified enrichment or homework help as a key attraction.

Total CORAL enrollment for the year was 5,321, ranging from 585 to 2,081 across cities.

Most participants were in elementary school, mainly first through fifth graders (81%), with a smaller proportion in middle school.

The racial/ethnicity breakdown of participants was 68% Hispanic, 14% African American, 10% Asian, 4% multiracial, 3% White, and 1% “other.” More than half of participants (53%) were English language learners and 89% received free or reduced-price lunch.

CST scores for Spring 2004 indicated that only 16% of CORAL children met the grade-level standards for proficiency on the English Language Arts portion of the test.

Of the CORAL third and fourth graders administered the IRI, 70% read below grade level in Fall 2004 (with 50% of these two or more grade levels below).

Teacher assessments indicated that, in their daytime classrooms, 94% of third and fourth graders in the CORAL research sites were never or rarely overly aggressive, 99% sometimes to almost always engaged in positive social relationships, and 90% sometimes to almost always displayed cognitive concentration.

Surveyed youth reported having positive adults in their lives who cared about them and were available for support (98%) and liking school (90%). They reported that they enjoyed reading (mean = 4.4 on a scale of 1–5) but were less comfortable with their ability to do well in reading; the mean for reading efficacy was 3.7 (on a scale of 1–5).

For the 2004–05 school year, the 1,120 CORAL third and fourth graders at research sites attended on average just under 3 days per week for just under 3 hours per day. The average number of days attended was 81, with 51% attending more than 75 days. The average attendance rate (days attended out of available days the program was open) was 62%.

Retention rates showed that 79% of youth remained in the program as of June 30, 2005.

More than half (53%) of the third and fourth graders who were administered the IRI in the fall and again in the spring attended literacy programming 3 or more days a week between IRI assessments; 36% attended 2 to fewer than 3 days per week during that period; 10% attended 1 to fewer than 2 days per week; and only 1% attended less than 1 day per week. These youth attended an average of 2.31 days per week of literacy activities.

Third and fourth graders attended literacy and homework help at almost equal rates, at an average of about 63 days over the school year, or roughly three quarters of the time they participated in CORAL. They attended enrichment programming slightly more than half of the days (45 days) they participated in CORAL. In addition, they attended physical education an average of 26 days, other academic activities an average of 10 days, and field trips an average of 5 days.

Staffing/Training

The average staff age was 27, and three quarters of staff were female.

Forty-six percent of staff were Hispanic, 17% were African American, 16% were Asian or Pacific Islander, 13% were White, 6% were mixed race/ethnicity, and 2% were “other.”

Of direct service staff, 49% had a high school diploma but had not yet completed college; many were studying to be teachers.

Across cities, 65% of direct service staff reported speaking a language other than English sometimes or always with CORAL youth and their parents. If a staff member did not speak the same language as a parent, he or she often asked a colleague to act as a translator.

Site staff had worked an average of 17 months in the program, and 22% reported that they had worked in CORAL for longer than 2 years.

In all cities, enrichment was provided by a variety of staff: school-day teachers, CBO staff, community members, and team leaders.

Adult support was the strongest aspect of activities across all observations, with an average quality rating of 3.53 in the fall and 3.76 in the spring. In general, staff were observed to be skilled at responding effectively to youth and interacting with them in a warm and engaging manner. Many positive instances were observed of staff providing extra help for struggling youth, taking an interest in youth’s interests and thoughts, smiling and laughing with youth, and being responsive to youth’s individual needs.

Instructional and group management strategies were implemented at a moderate level (on average, instructional quality ratings were 3.03 in the fall and 3.08 in the spring; group management ratings were 3.46 in the fall and 3.54 in the spring), with large variations across groups. In the highest-rated groups, staff provided clear instructions, taught organized lessons, employed strategies to motivate and challenge youth, and had activities ready when youth finished a lesson. In other cases, staff did not use strategies effectively. For example, a few instructors struggled to manage youth behavior. In these cases, staff spent their time attempting to focus and quiet youth and prevent outbursts, sometimes to such an extent that they were unable to implement any balanced literacy strategies.

Successful connection-making strategies were observed least frequently of the quality areas observed, with an average rating of 2.61 in the fall and 2.71 in the spring. In the stronger cases, staff devoted more time to this element, such as having youth role-play scenes from read-aloud texts. Observations with low ratings generally fell into two categories: either the connection-making activity was so brief that it was unlikely to have much impact on youth, or it was well-intentioned but not strongly implemented.

A major adjustment needed in order to implement balanced literacy was hiring a literacy director (an expert with a solid literacy background, skills to train and support site staff in literacy programming, and ongoing time and authority to monitor quality). Two cities filled this position early in implementation, while the other three did not do so until later. Cities that hired early benefited from having a literacy director on board before literacy programming began. In those early months, the literacy director was able to develop and implement a balanced literacy training curriculum, oversee lesson-planning development, and provide early monitoring and feedback. The literacy director’s role in monitoring and quality control was a key factor contributing to higher-quality implementation in the early hiring cities. This monitoring tended to elevate program quality when the literacy director made frequent site visits to observe programming.

Staff survey results indicated that 83% of direct staff attended training in the balanced literacy model. Scheduling training was a challenge, because staff who provided literacy activities were employed part-time and had availability constraints. Moreover, in some cities, programming occurred 5 days per week, and any weekday afternoon time taken for training would result in reduced youth programming. Cities went to great measures to compensate for these restrictions, including holding training on evenings and weekends.


Summative/Outcome Findings

Academic

Third and fourth graders showed significant (p < .0001), but modest, gains between the fall IRI and the spring IRI. Overall, youth increased about one third of a grade level (0.31) in reading.

Youth who were two or more levels below their grade level at the fall IRI assessment (n = 189) improved approximately three quarters (0.78) of a grade level. Youth who were one level below their grade level (n = 81) showed an average gain of a little over one quarter of a grade level (0.27). Youth at or above their grade level in reading (n = 113) showed an average loss in reading levels over the time period (-0.43), but the average decline did not result in their falling below their grade levels.

Youth in Literacy Profile 3 groups showed greater reading gains (average gain = 0.45 grade levels) than youth in Literacy Profile 1 or 2 groups (average gains = 0.26 and 0.28 grade levels, respectively).

Youth who felt a strong sense of reading efficacy, as measured on the youth survey, showed significantly stronger reading gains than those who felt a lower sense of reading efficacy (p < .0001). No other subgroup differences in reading gains were found (i.e., gender, grade, race/ethnicity, English language proficiency, frequency of attendance, balanced literacy participation).

Of youth two or more grade levels behind, those in Literacy Profile 3 groups had greater average reading gains: 1.0 grade levels compared to 0.49 in Profile 2 and 0.73 in Profile 1. Youth who were one grade below grade level showed the greatest average reading gains in Profile 2 groups (1.00 grade levels) and the least in Profile 1 groups (0.13 grade levels); youth in Profile 3 groups showed average gains somewhere in between (0.47 grade levels). Youth who started at or above grade level tended to drop over time but those exposed to higher-quality literacy instruction had a smaller drop than those exposed to lower-quality literacy programming: Literacy Profile 3 youth at or above grade level only dropped 0.17 grade levels on average, compared to 0.62 for Profile 2 and 0.58 for Profile 1.

Results suggest that how well instructors implemented independent reading strategies was an important predictor of youth’s reading-level gains. However, the strength and consistency of general instructional practices, group management practices, adult support for youth, and connections made for youth between their experiences and the text were not predictors of greater reading gains.

Several parents noted in focus groups that they were encouraged by improvements they saw in their children’s reading since enrolling. In addition, several noted changes in their children’s attitudes toward school and reading. Parents said their children were more structured and responsible in their study habits, developed an interest and joy in learning, and began to trust themselves and their ability to do schoolwork.

Youth Development

Some parents noted in focus groups that their children were more passionate about particular subjects—such as singing or drawing, which they first learned in CORAL—than they were before they participated in CORAL.

Parents in focus groups mentioned seeing improvements in their children’s social skills and believed that those changes helped their children become better students.

Parents in focus groups expressed their appreciation for the physical aspect of some enrichment activities, noting that it improved their children’s overall health.

Parents in focus groups spoke often about the positive effects of enrichment programming and expressed appreciation that CORAL exposed their children to activities that the parents could not otherwise have provided and opened up the world to their children.

 

© 2016 Presidents and Fellows of Harvard College
Published by Harvard Family Research Project