

The Harvard Family Research Project separated from the Harvard Graduate School of Education to become the Global Family Research Project as of January 1, 2017. It is no longer affiliated with Harvard University.



Program Description

Overview: Save the Children’s Model Literacy Initiative supports afterschool and in-school programming designed to improve reading skills among struggling students in grades K–6. Programs operate in rural locations in Alabama, Arizona, Arkansas, California, Colorado, Kentucky, Louisiana, Mississippi, New Mexico, Nevada, South Carolina, and Tennessee.
Start Date: Fall 2003
Scope: national
Type: afterschool
Location: rural
Setting: public school
Participants: kindergarten through elementary school students
Number of Sites/Grantees: 118 sites in 2008–2009; 147 sites in 2009–2010
Number Served: 12,001 youth across all 118 sites (2008–2009); 11,082 youth across the 122 study sites (2009–2010)
Components: The initiative includes two primary “Literacy Blocks”: Emergent Readers (ER) and Developing Readers (DR). ER serves children in kindergarten and first grade during and/or after school. It includes an extended read-aloud with developmentally appropriate follow-up activities, a reading-together period, and hands-on learning to support growth in phonemic awareness, letter recognition, sound–symbol correspondence, and beginning sight words. When assessments show that a child has mastered ER, the child moves on to DR, which targets children in grades 2–6 and takes place during and/or after school. The central component of DR is a guided independent reading program (GIRP), which uses Renaissance Learning’s Accelerated Reader books and software to give children regular opportunities to read books at a difficulty level matched to their reading skill. As part of the Accelerated Reader program, children can read a book independently or with a teacher, or listen to the book being read aloud. In addition to GIRP, the DR Literacy Block includes fluency-building support and read-alouds. Some schools also offer DR participants an in-school program of small-group tutorials targeting growth in phonics, sight words, vocabulary, and comprehension.
Funding Level: $13.5 million for 2009–10
Funding Sources: Various grantors to Save the Children
Other: Save the Children and Renaissance Learning provided literacy training, technical assistance, and ongoing support to program staff.


Evaluation

Overview: This evaluation examined the literacy programs’ implementation, as well as outcomes related to participants’ literacy gains.
Evaluators: Policy Studies Associates, Inc.
Evaluations Profiled: Results from the 2008–09 School Year

Results from the 2009–10 School Year

Results from the Comparative Pilot Study, 2009–10
Evaluations Planned: Annual evaluations continue to be conducted.
Report Availability: White, R. N., & Reisner, E. R. (2007). Model literacy programs of Save the Children: Evaluation findings from the 2005–06 school year. Washington, DC: Policy Studies Associates. http://www.eric.ed.gov/ERICWebPortal/detail?accno=ED498796

Palmiter, A. S., Arcaira, E. R., White, R. N., & Reisner, E. R. (2009). The literacy programs of Save the Children: Results from the 2008–09 school year. Washington, DC: Policy Studies Associates. www.eric.ed.gov/ERICWebPortal/detail?accno=ED508135

White, R. N., White, E. A., Palmiter, A. S., & Reisner, E. R. (2010). The literacy programs of Save the Children: Results from the 2009–10 school year. Washington, DC: Policy Studies Associates.

Romash, R. A., White, R. N., & Reisner, E. R. (2010). Save the Children Literacy Programs: Results from the comparative pilot study, 2009–10. Washington, DC: Policy Studies Associates.


Contacts

Evaluation: Andrea Palmiter
Research Analyst
Policy Studies Associates, Inc.
1718 Connecticut Avenue, NW, Suite 400
Washington, DC 20009
Tel: 202-939-5332
Fax: 202-939-5732
Email: apalmiter@policystudies.com
Program: John Farden
Director, Programs and Results
Save the Children
2000 L Street NW, Suite 500
Washington, DC 20036
Tel: 202-640-6614
Email: jfarden@savechildren.org
Profile Updated: March 31, 2011


Evaluation 1: Results from the 2008–09 School Year



Evaluation Description

Evaluation Purpose: To address the following questions: What opportunities were provided for targeted children to participate in literacy instruction and for staff to develop instructional skills? What were the program enrollment and attendance patterns? Did participants’ literacy skills improve? What factors were associated with changes in reading proficiency? What were participants’ other academic outcomes?
Evaluation Design: Quasi-experimental. Data were collected on all 118 sites operating during the 2008–09 school year. The majority of programs’ host schools served grades K–8, with enrollments ranging from under 100 students to over 800. Of the 118 sites, 63 offered an Emergent Readers (ER) Literacy Block, and 114 offered guided independent reading program (GIRP) activities through an afterschool Developing Readers (DR) Literacy Block and/or through an in-school tutorial program.

Data were collected on participants at the beginning, middle, and end of the school year through two reading assessments: STAR Early Literacy, administered to ER participants, and STAR Reading, administered to DR participants. Analysis of outcomes was limited to the 2,796 ER participants (46% of ER participants) who completed two or more STAR Early Literacy assessments during the year and the 9,090 DR participants (82% of DR participants) who completed two or more STAR Reading assessments at least 90 days apart. In addition, data were collected on program attendance for all participants, as well as on the number of books read by each DR participant (book counts were not available at one site). DR participant data were also collected from Accelerated Reader quizzes (number taken, percent passed, and quiz scores).
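As a rough illustration of this inclusion rule, the sketch below expresses the two-assessment requirement and the 90-day spacing used for DR participants; the function, field names, and data layout are assumptions made for the example, not the evaluators’ actual code.

    # Hypothetical sketch of the analysis-sample rule described above.
    from datetime import date

    def in_analysis_sample(assessment_dates, require_90_day_span=False):
        """Return True if a child meets the outcome-analysis inclusion rule.

        assessment_dates: dates on which the child took the STAR assessment.
        require_90_day_span: True for DR/STAR Reading, where the first and
        last assessments had to be at least 90 days apart.
        """
        if len(assessment_dates) < 2:
            return False
        if require_90_day_span:
            return (max(assessment_dates) - min(assessment_dates)).days >= 90
        return True

    # Example: a DR participant assessed in October and the following May
    print(in_analysis_sample([date(2008, 10, 1), date(2009, 5, 15)], True))  # True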
Data Collection Methods

Secondary Source/Data Review: Program attendance data were collected through a web-based data collection system. These data were used to calculate the number of days that each site provided services, the number of program days each participant attended during the year (those who attended at least 55 days were considered “regular” participants), and attendance rates (i.e., the number of days attended as a proportion of the number of days it was possible to attend).
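A minimal sketch of how these attendance measures fit together, assuming a simple per-child record; the 55-day threshold comes from the profile, while the function and field names are hypothetical.

    # Illustrative only: the attendance measures described above.
    REGULAR_THRESHOLD = 55  # days attended to count as a "regular" participant

    def attendance_summary(days_attended, days_possible):
        """Return the attendance rate and regular-participant flag for one child."""
        rate = days_attended / days_possible if days_possible else 0.0
        return {
            "days_attended": days_attended,
            "attendance_rate": rate,  # days attended / days possible to attend
            "regular": days_attended >= REGULAR_THRESHOLD,
        }

    # Example: a child who attended 67 of 92 possible program days
    print(attendance_summary(67, 92))  # rate is about 0.73; regular is True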

The number of books read was calculated for each DR participant. After completing a book, children took an Accelerated Reader quiz on that book’s content; participants who answered at least 60% of the questions correctly passed the quiz. Quiz results helped track changes in reading proficiency and identify additional books appropriate for the child’s skill level. The goal was for participants at each site to pass at least 85% of the quizzes they attempted and to read an average of 25 or more books during the school year.
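The quiz pass rule and the site-level goals described above can be summarized in a short sketch; the 60% pass threshold, the 85% quiz-pass goal, and the 25-book goal come from the profile, while the functions and data layout are illustrative assumptions.

    # Illustrative only: Accelerated Reader pass rule and site goals as described above.
    PASS_THRESHOLD = 0.60   # share of quiz items answered correctly to pass a quiz
    SITE_PASS_GOAL = 0.85   # goal: share of attempted quizzes passed, per site
    SITE_BOOKS_GOAL = 25    # goal: average books read per participant per year

    def quiz_passed(items_correct, items_total):
        return items_total > 0 and items_correct / items_total >= PASS_THRESHOLD

    def site_met_goals(quizzes_passed, quizzes_attempted, books_per_child):
        """books_per_child: list of yearly book counts, one entry per participant."""
        if quizzes_attempted == 0 or not books_per_child:
            return False
        pass_rate = quizzes_passed / quizzes_attempted
        avg_books = sum(books_per_child) / len(books_per_child)
        return pass_rate >= SITE_PASS_GOAL and avg_books >= SITE_BOOKS_GOAL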

Test/Assessments: Results of the STAR Early Literacy assessment are presented as scaled scores, literacy-skills classification levels (i.e., level of reading proficiency), and risk levels (i.e., the degree of risk that the child will not gain reading proficiency). To help identify specific strengths and weaknesses, scores are also reported for seven literacy domains:

  1. general readiness (e.g., differentiating words from letters)
  2. graphophonemic knowledge (e.g., using alphabetical order)
  3. phonemic awareness (e.g., identifying rhyming words)
  4. phonics (e.g., matching and recognizing short and long vowel sounds)
  5. comprehension (e.g., reading and understanding sentences)
  6. structural analysis (e.g., identifying compound words)
  7. vocabulary (e.g., identifying synonyms and antonyms)

The STAR Reading assessment measures reading proficiency. Results are presented as scaled scores, grade equivalents, percentiles, and normal curve equivalents (NCEs). An increase of more than 2 NCEs is considered a meaningful increase by Renaissance Learning, the publisher of the STAR Reading assessment. The child’s grade level and month of school within that grade are factored into the scores.
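For readers unfamiliar with NCEs: they are a standard rescaling of percentile ranks onto an equal-interval scale with a mean of 50 and a standard deviation of 21.06. The sketch below uses that general definition, not Renaissance Learning’s specific scoring tables.

    # Conventional percentile-to-NCE conversion: NCE = 50 + 21.06 * z, where z is
    # the standard normal deviate of the percentile rank (general definition only).
    from statistics import NormalDist

    def percentile_to_nce(percentile):
        z = NormalDist().inv_cdf(percentile / 100.0)
        return 50.0 + 21.06 * z

    # Example: moving from the 40th to the 45th percentile
    gain = percentile_to_nce(45) - percentile_to_nce(40)
    print(round(gain, 1))  # about 2.7 NCEs, above the 2-NCE "meaningful" threshold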

Data Collection Timeframe: Data were collected over the 2008–09 school year.


Findings:
Formative/Process Findings

Activity Implementation

At sites that offered ER, activities were provided on an average of 92 days after school and 86 days in school.

At sites that offered DR, activities were provided on an average of 109 days after school and 93 days in school over the course of the year.

Among the 10,602 participants who read at least one book, the number of books read averaged 64 for all participants and 76 for regular participants. On average, participants read one book for every 2.4 days of program attendance during the year. All participants averaged 25 or more books at 87% of sites with available data; regular participants averaged 25 or more books at 95% of those sites.

Recruitment/Participation: A total of 12,001 children enrolled in literacy programs at the 118 sites over the course of the year (2,796 in ER and 11,055 in DR, with some participating in both), with an average enrollment of 102 children per site and 7,062 children per month across sites.

Of children who enrolled in literacy programs, 8% were in kindergarten, 11% were in first grade, 18% were in second grade, 21% were in third grade, 18% were in fourth grade, 15% were in fifth grade, and 9% were in sixth grade.

An average of 4,939 children participated across sites each day over the school year, with an average of 56 children served per site on a typical day. Participants attended an average of 67 program days during the year and had an average attendance rate of 73%. In addition, 29% of participants attended 90% or more of the days possible to attend.
Staffing/Training: Across all sites, staff and volunteers involved in literacy instruction received an average of 23 hours of training and 37 hours of technical assistance from Save the Children, for a total of 60 hours. Staff at 12 sites also received an average of 4 hours of technical support and coaching from Renaissance Learning specialists.


Findings:
Summative/Outcome Findings

Academic: ER participants improved significantly in their STAR Early Literacy scores (average gain = 117 scaled-score points, p < .05).

The proportion of ER participants classified as transitional or probable readers, the two highest of the four reading level categories on the STAR Early Literacy assessment, increased significantly from 11% to 59% (p < .05).

ER participants exhibited significant average score gains on all seven literacy domains of the STAR Early Literacy assessment between the fall and spring (p < .05). The largest gain was in structural analysis (an average of 21 scaled-score points); the smallest was in general readiness (an average of 14 points).

The percentage of ER participants identified as being “on track” or “having low risk of academic failure” increased by a small but statistically significant amount, from 30% to 31% (p < .05).

DR participants showed a significant increase in STAR Reading scores on average (p < .05). Participants had an average gain of 5.8 NCEs, with 60% achieving a gain of 2 NCEs or greater.

The percentage of DR participants reading at grade level or above as measured by the STAR Reading assessment increased significantly, from 16% to 29% (p < .05).

On average, regular DR participants had significantly larger improvements in STAR Reading scores than those who did not attend regularly (improvements of 7.1 NCEs vs. 3.6 NCEs, p < .05). In addition, a significantly larger proportion of regular DR participants achieved a gain of at least 2 NCEs compared to those who did not attend regularly (63% vs. 53%, p < .05).

A significant and positive relationship was found between the number of books read during the year and DR participants’ improvements on the STAR Reading assessment (p < .05).

About four-fifths of DR participants (81%) passed at least 85% of their Accelerated Reader quizzes. On average, DR participants achieved a passing score on 92% of their Accelerated Reader quizzes.

The percentage of Accelerated Reader quizzes passed was significantly and positively related to gains on the STAR Reading assessment (p < .05). DR participants who passed at least 85% of their quizzes gained an average of 6.3 NCEs on the reading assessment, compared with 3.8 NCEs for DR participants who passed fewer than 85% of their quizzes.

DR participants with higher percentages of Accelerated Reader quiz items answered correctly made significantly larger gains on the STAR Reading assessment (p < .05). Participants who correctly answered 95% or more of the Accelerated Reader quiz items averaged gains of 8.5 NCEs on the reading assessment, while those answering fewer than 80% of items correctly averaged gains of 3.8 NCEs.

Among participants reading below grade level on their initial STAR Reading assessment, the average score increased significantly from 28 to 34 NCEs (p < .05). Of these participants, 63% increased their score by 2 NCEs or more, and 18% read at grade level or higher on the final assessment.
