Program Description

Overview: 21st Century Community Learning Centers (21st CCLC) programs provide homework assistance, targeted remediation, academics, arts, technology, and recreational activities in an effort to provide safe, enriching environments for youth during out-of-school hours. The Louisiana Department of Education has awarded 5-year grants to nonprofits to operate 21st CCLC programs for youth and their families in Louisiana.
Start Date: summer 2003
Scope: state
Type: after school, summer/vacation, before school, weekend, comprehensive services
Location: urban, suburban, rural
Setting: public school, private school, community-based organization, religious institution, private facility, recreation center
Participants: elementary and middle school students
Number of Sites/Grantees: 16 programs in 2003–2004
Number Served: 5,375 youth in 2003–2004
Components: Grants are given to schools and organizations that primarily serve youth attending schools with high concentrations of economically disadvantaged (e.g., Title I) students (at least 40% of the student body). Grantees are expected to provide tutorial services and academic enrichment activities designed to help youth meet local and state academic standards in subjects such as reading and math. The 21st CCLC programs also provide activities focusing on youth development, drug and violence prevention, technology education, art, music, recreation, counseling, and character education to enhance the academic component of the program. In addition, 21st CCLCs offer opportunities for literacy and related educational development to families of participating youth.
Funding Level: $4.2 million in 2003–2004
Funding Sources: U.S. Department of Education's 21st CCLC program


Evaluation

Overview: The evaluation examined academic impacts on participants.
Evaluator: Lynne Woodward Jenner, The Policy & Research Group
Evaluations Profiled: Academic Outcomes in Louisiana's 21st Century Community Learning Centers

Evaluations Planned: The 2004–2005 evaluation has been completed. A 2005–2006 evaluation is currently underway.
Report Availability: Jenner, E. J., & Jenner, L. W. (2004). Academic outcomes in Louisiana's 21st Century Community Learning Centers. Baton Rouge, LA: The Policy & Research Group.


Contacts

Evaluation: Lynne Woodward Jenner
Director of Projects
The Policy & Research Group
2561 Citiplace Court, #750-192
Baton Rouge, LA 70808
Tel: 225-235-1202
Email: ljenner@policyandresearch.com
Program: Andrala Walker
Section Leader, Division of School & Community Support
Louisiana Department of Education
1201 North Third Street, 4th Floor
Baton Rouge, LA 70804-9064
Tel: 225-342-4147
Fax: 225-219-4454
Email: andrala.walker@la.gov
Profile Updated: June 1, 2006

Evaluation: Academic Outcomes in Louisiana's 21st Century Community Learning Centers



Evaluation Description

Evaluation Purpose: To determine the program's academic impact by addressing the following questions: Did participants show improved test scores compared to nonparticipants? Did attendance intensity affect academic growth? Did particular participant groups (by race/ethnicity, gender, or baseline achievement level) show greater academic improvements? How did participating programs differ in academic impact?
Evaluation Design: Quasi-Experimental and Non-Experimental. For this study, four 21st CCLC programs were selected to participate: two in rural areas and two in urban areas. Youth in these programs were in grades 3 and 5. Site visits and interviews with key staff were conducted at all four sites.

Pretest and posttest outcome data were collected from program youth and a comparison group in the fall and spring. A total of 259 participants (defined by attendance of at least 30 program days) and 933 nonparticipants were in the study. The nonparticipants were in grades 3 and 5 in the same school systems. Of participants, 57% were female, 40% were in grade 3 (60% were in grade 5), 79% were minorities, 80% were eligible for free/reduced lunch (FRL), and 35% were from the two rural sites (65% were from the two urban sites). Of the comparison group, 46% were female, 44% were in grade 3 (56% were in grade 5), 54% were minorities, 65% were FRL eligible, and 56% were from the two rural sites (44% were from the two urban sites). Of all minorities in the sample, 99% were African American. Differences in pretest scores, race/ethnicity, gender, and FRL eligibility were statistically controlled for in the analyses.
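The report does not specify the estimation model behind these adjustments; one common approach is a covariate-adjusted regression of posttest scores on a participation indicator. A minimal sketch in Python, assuming a hypothetical data file and illustrative variable names:

# Hypothetical sketch of a covariate-adjusted participant/nonparticipant
# comparison; the data file, variable names, and model form are assumptions,
# not taken from the report.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("itbs_scores.csv")  # hypothetical file: one row per student

# participant = 1 if the youth attended at least 30 program days, else 0.
fit = smf.ols(
    "post_core_nce ~ participant + pre_core_nce + C(gender) + C(minority) + C(frl)",
    data=df,
).fit()

# The coefficient on `participant` estimates the adjusted difference in
# posttest NCEs, analogous to the report's "impact score."
print(fit.params["participant"])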

Evaluators examined pretest/posttest academic test score gains overall (participants vs. comparison youth), by intensity level (number of days attended over the year), and by subgroup (gender, race/ethnicity, prior academic achievement). To examine intensity level, evaluators compared youth who attended at least 30 days, at least 60 days, and at least 90 days during the year to see whether more attendance was related to larger gains (although evaluators noted that the 90-day attendees came primarily from one urban site, which limited generalization). To examine prior academic achievement, the sample was divided into three groups (low, medium, and high achieving) based on pretest scores. Because subgroups were not distributed evenly across grantees (e.g., minorities were concentrated at certain sites), subgroup gains were difficult to distinguish from site-level effects.
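The report does not say how these groups were built in software; a brief sketch, with illustrative column names, of the attendance thresholds and achievement groups described above:

# Illustrative construction of the attendance-threshold flags and
# prior-achievement groups; the data file and column names are assumptions.
import pandas as pd

df = pd.read_csv("itbs_scores.csv")  # hypothetical file from the sketch above

# Cumulative attendance thresholds: each flag marks youth at or above it.
for k in (30, 60, 90):
    df[f"attended_{k}plus"] = (df["days_attended"] >= k).astype(int)

# Low/medium/high prior achievement based on pretest scores; equal thirds
# (terciles) are one plausible split, since the report does not give cutoffs.
df["achievement_group"] = pd.qcut(
    df["pre_core_nce"], q=3, labels=["low", "medium", "high"]
)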
Data Collection Methods:

Interviews/Focus Groups: Project director interviews asked about program context and implementation; recruitment strategies and criteria for selecting participants; impressions of program goals; activity implementation and staffing issues, especially related to the academic component (e.g., whether youth could choose activities and whether they had to participate in academics, how the material covered was determined, the goals of the academic component, and who constituted the academic staff); and issues related to administering the pretest (their experience giving the test and whether/how they used the results).

Observation: Program observations focused on the academic component and covered the following areas: physical environment; number of youth; number of adults in the room (certified teachers, paraprofessionals, other); desk/table/room setup; whether the room was noisy or quiet and formal or informal; type of instruction; quality of youth–staff interaction; and the percentage of youth who appeared engaged, confused, attentive, bored, smiling, focused on academics, or acting up.

Secondary Source/Data Review: Program attendance data were collected at each program.

Tests/Assessments: Iowa Test of Basic Skills (ITBS) forms A and B were administered as the pretest and posttest measures of skills and achievement in math, reading, language arts, social studies, science, and information sources. Scores are reported as Normal Curve Equivalents (NCEs), which are scaled so that scores in a nationally representative population range from 1 to 99, with a mean of 50. “Core” scores (the average of the reading, language, and math scores) were examined for gains overall, by intensity level, and by subgroup. “Composite” scores (the average across all six subject areas) and individual subject-area scores were examined for overall gains. Youth at one rural site completed only the tests’ core subjects. Improvements were expressed as “impact scores,” the average number of NCEs by which one group outperformed another. Gender, race/ethnicity, and FRL eligibility were reported on posttest answer sheets.
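For reference, NCEs are an equal-interval rescaling of national percentile ranks; the standard conversion (not shown in the report) is NCE = 50 + 21.06 × z, where z is the normal deviate corresponding to the percentile rank. A quick check in Python:

# Standard percentile-rank-to-NCE conversion (mean 50, SD 21.06). This is the
# general definition of the NCE scale, not a calculation from the report.
from scipy.stats import norm

def percentile_to_nce(percentile: float) -> float:
    return 50 + 21.06 * norm.ppf(percentile / 100)

print(percentile_to_nce(50))  # 50.0: the 50th percentile maps to NCE 50
print(percentile_to_nce(99))  # ~99: the scale runs from about 1 to 99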
Data Collection Timeframe: Data were collected during the 2003–2004 program year.


Findings:
Formative/Process Findings

Activity Implementation: All four programs operated Monday through Thursday, with one also open on Friday. Programs began between 2:30 and 3:30 p.m., with three ending at 5:30 p.m. and the fourth ending at 5 p.m.
Program–School Linkages: Three programs offered academics 4 hours per week (two for 1 hour on each of 4 days, one for 2 hours twice a week). The fourth offered 45 minutes, 3 days per week.

Programs varied in how they structured their academic components. At one program, youth identified as academically at risk participated in academic clusters with certified teachers, while other youth participated in hands-on learning activities led by program activity staff. At another, academic time focused on homework help in the fall and standardized test preparation in the spring. At a third, homework assistance, group work, and direct instruction were provided based on teacher-identified needs. At the fourth, academics consisted of direct instruction and group work using a variety of prepared curricula.

Programs varied in the subjects on which their academic component focused. One focused on math/language arts or science/social studies depending on past test scores. In another, fall subjects were driven by homework, while the spring focus was language arts and math. In a third, teachers identified areas to cover. The fourth covered language arts and math.

Two programs required that staff prepare written daily academic lesson plans. At one, an educational consultant reviewed lesson plans weekly for alignment to benchmarks and standards. At the other, lesson plans were not regularly reviewed, but the project director had the authority to evaluate them.

Programs had different approaches to academic curriculum development. For one, some mapping for the year was done up front, with staff then submitting curriculum plans and weekly lesson plans. For another, staff determined what would be taught in the fall and were encouraged to use a creative approach; in the spring, standardized test instruction booklets were used. A third program had regular meetings to identify the focus of lessons, with schoolteachers helping to identify youth needs. At the fourth program, site coordinators got feedback from schoolteachers, while program teachers determined what was done on a daily basis to reflect youth needs.
Staffing/Training: All four programs hired schoolteachers as academic staff. One program also used activity staff as academic staff. Another made extensive use of volunteers (mostly university students) and other agencies to help with academics; for example, second graders reading below grade level were paired one-on-one with volunteers. Two programs used schoolteachers as site coordinators; the other two hired full-time site coordinators.

One program hired an educational consultant to direct and monitor the quality of academic instruction. Feedback from principals suggested that they found this invaluable in providing a quality extended-day academic program.


Summative/Outcome Findings

Academic

Participants (30 days or more) showed a significant improvement over nonparticipants on core ITBS scores, with an impact score of 2.21 NCEs (p < .01). Those who attended 60 or more days and 90 or more days experienced nearly the same academic impact over nonparticipants on core scores (2.44 and 2.68 NCEs, respectively; p < .01 for each).
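For scale: NCEs have a standard deviation of roughly 21.06 in a national norming population, so an impact score of 2.21 NCEs corresponds to an effect size of about 2.21 / 21.06 ≈ 0.10 standard deviations (a rough translation; the report does not express its results this way).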

In the one program with a sizable proportion of youth participating for over 90 days (n = 74), ITBS core impact scores increased as program attendance increased: 1.81 NCEs at 30 days (compared to those who attended fewer than 30 days, p < .05), 1.99 NCEs at 60 days (compared to those who attended fewer than 60 days, p < .05), and 2.56 NCEs at 90 days (compared to those who attended fewer than 90 days, p < .01). The evaluators noted that the 30- and 60-day samples were essentially the same youth (only 10 of those who attended at least 30 days did not attend at least 60).

Participants showed significantly more academic growth than nonparticipants on the ITBS composite test, with program youth outperforming nonparticipants by 1.30 NCEs (p < .05).

Participants showed significantly more academic growth than nonparticipants on the ITBS reading test (1.57 NCEs, p < .05). Moderate attendance (60 days) was related to a slightly larger impact score (1.63 NCEs, p < .05), and higher attendance (90 days) to a stronger one (3.38 NCEs, p < .01), relative to nonparticipants.

Participants improved significantly more than nonparticipants in language (1.62 NCEs, p < .01) and social studies (1.88 NCEs, p < .05). Improvements in math and science did not differ significantly between the two groups.

Both male and female participants evidenced significant growth on ITBS core scores compared to nonparticipants (2.11 NCEs, p < .01, for girls; 2.02 NCEs, p < .05, for boys).

Minority participants had significant core ITBS impact scores (2.49 NCEs, p < .01), while nonminority participants' improvements did not differ significantly from those of nonparticipants.

Participants entering the program with low or moderate achievement levels had significant core ITBS impact scores (2.21 NCEs, p < .05, and 2.47 NCEs, p < .01, respectively). High achievers did not show a significant improvement over nonparticipants.

Looking at core test scores by program, only participants at the two urban sites showed significant improvements relative to nonparticipants (1.75 NCEs, p < .05, and 4.7 NCEs, p < .01).
