
Program Description

Overview The Woodcraft Rangers (WR) Nvision afterschool program consists of school-based afterschool “clubs” for youth in Los Angeles, California, designed to promote academic, social, and physical development. WR’s goal is to extend schools’ capacities to provide safe and supportive environments beyond the school day and to help youth improve the social, behavioral, and learning skills that contribute to school achievement.
Start Date Fall 1999
Scope local
Type afterschool, summer
Location urban
Setting public school
Participants elementary through high school students (ages 6–18)
Number of Sites/Grantees 59 sites in 2010–2011 (40 elementary school sites, 16 middle school sites, and 3 high school sites)
Number Served 15,086 youth in 2010–2011
Components Clubs meet 3–5 days per week and include homework assistance, a fitness activity, a snack, and enrichment activities centered on a selected theme. Each club spans 8 weeks, during which time youth work on specific skills or techniques to achieve mastery. Themes are designed to reinforce classroom learning, be age/gender/school-appropriate, address youth interests, and utilize club staff’s talents. Examples include cooking, etiquette, jewelry making, drawing and painting, and computer skills. Youth are encouraged to join two clubs in each 8-week cycle to expose them to diverse experiences. Recognition events, to which parents, faculty, and other youth are invited to celebrate participants’ accomplishments, are held at the end of each cycle. These events may include an exhibit, team competition, performance, or awards ceremony. WR also provides field trips to educational, cultural, and recreational venues.
Funding Level $8,836,287 in 2010–2011
Funding Sources California Department of Education’s After School Education and Safety Program, the United States Department of Education’s 21st Century Community Learning Centers (21st CCLC) program, the City of Los Angeles, United Way, the City of Monterey Park, and private foundations.


Evaluation

Overview Earlier evaluations (through 2007) examined WR’s impact on youth. In 2008, WR began exploring the connections between afterschool site quality and youth outcomes. The 2010–2011 evaluation assesses WR’s impact on participant outcomes over time.
Evaluator

Lodestar Management/Research, Inc.

Harder+Company Community Research

EVALCORP Research & Consulting

Evaluations Profiled

Annual Evaluation Report for 2003–04: Findings for Elementary School Programs

Annual Evaluation Report for 2003–04: Findings for Middle School Programs

Assessment of Program Quality and Youth Outcomes

Evaluations Planned WR continues to examine the relationship between program quality and participant outcomes.
Report Availability

Kaiser, M., & Lyons, M. (2001). Woodcraft Rangers: State of California After School Learning and Safe Neighborhoods Partnerships Program with the Los Angeles Unified School District. Annual evaluation report, 1999–2000. Los Angeles, CA: Author.

Lodestar Management/Research. (2002). Woodcraft Rangers: State of California After School Learning and Safe Neighborhoods Partnerships Program with the Los Angeles Unified School District. Annual evaluation report, 2000–01. Los Angeles, CA: Author.

Lodestar Management/Research. (2003). Woodcraft Rangers: Los Angeles Unified School District After School Education and Safety Program annual evaluation report 2001–02. Los Angeles, CA: Author.

Lodestar Management/Research. (2004). Woodcraft Rangers: Los Angeles Unified School District After School Education and Safety Program annual evaluation report for 2002–03. Los Angeles, CA: Author.

Lodestar Management/Research. (2005). Woodcraft Rangers: Annual evaluation report for 2003–04. Los Angeles, CA: Author.

Lodestar Management/Research. (2006). Woodcraft Rangers After-School Program: Summary of program youth outcomes for middle school sites 2004–05. Los Angeles, CA: Author.

Lodestar Management/Research. (2006). Process evaluation report: Key factors related to program recruitment, retention, and outcomes. Los Angeles, CA: Author.

Lodestar Management/Research. (2007). Woodcraft Rangers: Annual evaluation report for 2005–06. Los Angeles, CA: Author.

Harder+Company Community Research. (2008). Woodcraft Rangers annual evaluation report 2006–2007: Middle school programs. Los Angeles, CA: Woodcraft Rangers.

EVALCORP Research & Consulting. (2011). Assessment of program quality and youth outcomes: A study of the Woodcraft Rangers’ Nvision After-School Program. Irvine, CA: Author.


Contacts

Evaluation Lisa Garbrecht
Research Associate
EVALCORP Research & Consulting
15615 Alton Pkwy., Suite 450
Irvine, CA 92618
Tel: 949-468-9849
Email: lgarbrecht@evalcorp.com
Program Pablo Garcia
Program Director
Woodcraft Rangers’ Main Office
1625 West Olympic Blvd. Ste 800
Los Angeles, CA 90015
Tel: 213-249-9293
Fax: 213-388-7088
Email: pgarcia@woodcraftrangers.org
Profile Updated April 3, 2012

Evaluation 3: Assessment of Program Quality and Youth Outcomes



Evaluation Description

Evaluation Purpose To explore the quality of implementation of the WR program model, examine whether and how quality is associated with youth outcomes, and inform further development of a way to monitor quality and develop improvement strategies.
Evaluation Design

Quasi-Experimental and Non-Experimental: Program data were collected from all WR sites.

WR site coordinators were surveyed to identify key components of program implementation. A total of 55 of the 57 elementary and middle school site coordinators completed the survey (42 of 43 elementary sites; 13 of 14 middle school sites). From these survey data, two methods, a factor analysis approach and a benchmark approach, were used to identify key components. The factor analysis approach used statistical methods to identify quality indicators. The benchmark approach used ratings from the site coordinator surveys to set a benchmark, or minimum criterion, for a “high quality” response to each item. For each item, if the site’s actual response met or exceeded the management team’s benchmark, the site earned a point for that item. If a benchmark category contained five items and the site met the high-quality criterion for four of them, the site’s score on that benchmark would be 80% (i.e., 4 divided by 5).
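The scoring rule lends itself to a simple computation. The sketch below is illustrative only: the item names, benchmark cutoffs, and responses are hypothetical (the actual survey items are not published in the source), but it reproduces the 80% example from the text.

```python
# Hypothetical sketch of the benchmark scoring described above.
# Item names, cutoffs, and responses are illustrative, not taken
# from the actual WR site coordinator survey.

def benchmark_score(responses, benchmarks):
    """Return the share of items on which a site's response met or
    exceeded the management team's benchmark for that item."""
    met = sum(
        1 for item, cutoff in benchmarks.items()
        if responses.get(item, 0) >= cutoff
    )
    return met / len(benchmarks)

# Mirrors the text: a five-item category where the site meets the
# high-quality criterion on four items scores 80%.
benchmarks = {"item_a": 4, "item_b": 4, "item_c": 3, "item_d": 5, "item_e": 4}
responses = {"item_a": 5, "item_b": 4, "item_c": 3, "item_d": 2, "item_e": 4}
print(f"{benchmark_score(responses, benchmarks):.0%}")  # prints: 80%
```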

The program quality factors identified in the factor analysis included:

  1. Core elements—the extent to which the program had a collaborative relationship with school administrators, how often it followed a site schedule, and the extent to which youth were involved in decisions that affected the program design
  2. Cycle plans—the extent to which the program implemented cycle plans (i.e., descriptions of the goals and objectives of program activities), how useful the cycle plans were during the implementation of club activities, and the extent to which the cycle plans’ objectives were met
  3. Value of ad hoc assistance—the value provided to staff by WR parent volunteers, youth volunteers, and WR traveling specialists (i.e., professionals who run clubs related to their field of expertise)
  4. Connections—how many activities promoted youth involvement in the community, how involved parents were in the program, and how much access the program had to school facilities to implement program activities
  5. Educational supports—the value provided to staff by WR teacher liaisons and school personnel and the extent to which the program used academic concepts in activities.

Only the first three factors, however, were retained when relating quality factors to youth outcomes, as connections and educational supports did not have sufficient reliability to be considered stable factors using the current survey measurement tool.
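The source describes the retention criterion only as sufficient reliability; internal consistency of this kind is commonly checked with Cronbach’s alpha. The sketch below is illustrative, using hypothetical coordinator ratings and the conventional 0.70 cutoff, which the report does not state.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) array:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical ratings from five site coordinators on a 3-item factor.
ratings = np.array([
    [4, 5, 4],
    [3, 3, 4],
    [5, 5, 5],
    [2, 3, 2],
    [4, 4, 5],
])
alpha = cronbach_alpha(ratings)
# A factor might be retained only if alpha clears a chosen cutoff,
# e.g. the conventional 0.70 (the source does not state its cutoff).
print(f"alpha = {alpha:.2f}, retain: {alpha >= 0.70}")
```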

The quality indicators gleaned from the benchmark analysis were:

  1. Cycle plans—the extent to which cycle plans (i.e., descriptions of goals and objectives of program activities) were used and useful in activity implementation
  2. Time distribution—the extent to which students were involved for the expected number of minutes in the required components of homework, fitness, nutrition/snack, interest-based club activities, and closing activities/all-together time
  3. Club activities—activities offered in six areas: sports, performing arts, visual arts, recreation activities, computers or technology/multi-media, and leadership opportunities/youth development
  4. Student engagement—extent of youth engagement in additional youth development activities
  5. Club contributors—who is involved in club activities, including teacher liaisons, activity consultants, traveling specialists, and school personnel
  6. Ad hoc staff value—the extent to which club contributors have appropriately important roles
  7. Club selection—whether staff help to select club activities; “staff” includes regional managers, site coordinators, club leaders, youth councils, youth participants, and principals/school administrators
  8. Student involvement—the extent of choice and leadership opportunities in club activities
  9. Parent involvement—the extent to which parents are involved in club activities, including communicating with staff and attending events and activities
  10. School collaboration—the extent to which the clubs work with schools, including engaging with school staff in specific communications and activities
  11. Facilities access—how much access the program had to school facilities to implement program activities
  12. Site coordinator qualities—characteristics of site coordinators, including management style, organizational skills, and staff communication
  13. Program staff qualities—characteristics of program staff, including leadership, organization, and student interactions.

To examine youth outcomes, WR participants were administered a survey when they first joined the program (baseline) and again at the end of the program year. In total, 2,304 elementary youth and 406 middle school youth had matched baseline and year-end surveys. Academic test score data were also collected on participants for the 2007–08 and 2008–09 school years; only those youth with data from both years (N = approximately 3,300 elementary school youth and 4,900 middle school youth) were included in analyses. Lastly, school attendance data were collected for approximately 4,700 youth in both the elementary and middle school samples.
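The report does not describe its record-matching procedure; as a rough illustration, pairing baseline and year-end surveys of the kind described above can be done with an inner join on a student identifier. All column names and values below are hypothetical.

```python
import pandas as pd

# Hypothetical layout; the actual WR survey data structure is not published.
baseline = pd.DataFrame({
    "student_id": [101, 102, 103, 104],
    "efficacy_pre": [3.2, 2.8, 4.0, 3.5],
})
year_end = pd.DataFrame({
    "student_id": [102, 103, 104, 105],  # 101 left; 105 joined late
    "efficacy_post": [3.1, 4.2, 3.4, 2.9],
})

# Keep only youth with surveys at both time points (matched pairs),
# mirroring the matched baseline/year-end samples in the analysis.
matched = baseline.merge(year_end, on="student_id", how="inner")
matched["efficacy_change"] = matched["efficacy_post"] - matched["efficacy_pre"]
print(matched)
```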

Quality indicators were examined to see whether and how they were related to youth outcomes. As noted above, two of the factors identified through factor analysis (connections and educational supports) were excluded from the quality-outcome analysis because their internal consistency was too weak for them to be treated as stable factors. In addition, two covariates, program attendance days and the demographics of the local communities in which WR sites were located, were included to account for contributions to youth outcomes beyond program quality. Analyses were performed separately for elementary and middle schools.
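The report does not specify the statistical model used to relate quality to outcomes while accounting for attendance and community demographics; one plausible form is an ordinary least squares regression with those covariates as controls. The sketch below uses synthetic data and is not the evaluators’ actual model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sites = 55  # number of sites with completed coordinator surveys

# Synthetic site-level data standing in for the real variables: a
# quality indicator, program attendance days, a community demographic
# index, and a youth outcome (e.g., mean change in CST score).
quality = rng.uniform(0.2, 1.0, n_sites)
attendance = rng.normal(100, 20, n_sites)
demo_index = rng.normal(0, 1, n_sites)
outcome = (0.5 * quality + 0.01 * attendance + 0.2 * demo_index
           + rng.normal(0, 0.3, n_sites))

# OLS of outcome on quality, controlling for attendance and
# demographics (intercept in the first column of the design matrix).
X = np.column_stack([np.ones(n_sites), quality, attendance, demo_index])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
print(f"estimated quality effect, net of controls: {coef[1]:.2f}")
```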

Data Collection Methods

Secondary Source/Data Review: School attendance data for the 2007–08 and 2008–09 school years were collected from the district.

Program attendance data for the 2008–09 program year were collected from the WR data system.

Demographic data from the 2000 U.S. Census were collected for the communities in which the WR schools were located. Indicators used include median family income as well as the percentages of families in poverty, persons (aged 16 and older) in the labor force, persons with a high school degree or higher, and persons who speak a language other than English at home.

Surveys/Questionnaires: The site coordinator survey consisted of more than 150 quantitative and qualitative questions about how the site operated, focusing on staffing, site structure and activities, daily club activities, youth engagement and leadership, school collaboration, parent involvement, and community involvement.

Youth surveys measured changes in attitudes, academic and social skills, sense of efficacy, and risk-related activities. The year-end surveys also included questions about youths’ experience of and satisfaction with the program.

Tests/Assessments: Academic data included scores on the California Standards Test (CST) in English/Language Arts (ELA) and in Mathematics. Proficiency scores on the CST range from 1 (Far Below Basic) to 5 (Advanced).

Data Collection Timeframe Data were collected between 2006 and 2009.


Findings
Formative/Process Findings

Program Context/Infrastructure

For the factor analysis indicators, sites’ average scores were on the high end of the possible range for two of the five indicators: core elements and cycle plans (10.9 out of 12 and 10 out of 11, respectively).

There were no significant differences in the quality factor scores between programs at elementary schools and programs at middle schools.

According to the benchmark indicators—which represent the degree to which sites were able to reach the quality benchmark, with the goal to reach 100% of the benchmark—quality tended to be high (i.e., reaching a 75% or higher quality rating) across sites in 3 of the 13 benchmark areas for both elementary and middle school programs: site coordinator qualities, school collaboration, and program staff qualities. In addition, the majority of elementary sites reached at least a 75% quality rating in cycle plans and student engagement, while the majority of middle school sites reached at least a 75% quality rating in club activities and club contributors.

According to the benchmark indicators, the range of quality across sites was quite large for almost all the areas. For example, the average quality score for student involvement was high at 74% for elementary sites, yet this score ranged from as low as 20% to the highest possible score of 100%.

Only two benchmark indicators had quality scores that were consistently lower than the WR model’s expectations. For time distribution, the average percentage of the benchmark reached was 44% for elementary sites and 38% for middle school sites. This indicates that most sites did not spend the same amount of time on each of the daily required components as expected by the WR model. For ad hoc staff value, the average percentage of the benchmark reached was 46% in elementary sites and 60% in middle school sites.

Average overall benchmark quality scores ranged from 60% to 74% across elementary school sites and from 48% to 80% across middle school sites. The overall quality average was 67% for elementary sites and 68% for middle schools.


Summative/Outcome Findings

Youth Development

Of the 16 quality indicators included in the outcome analysis, 5 were favorably related to a subset of the youth outcomes in elementary sites: value of ad hoc assistance, student engagement, club contributors, access to facilities, and site coordinator qualities. That is, higher program quality in each of these areas was associated with more positive results on at least one of the desired youth outcomes. Of the remaining 11 indicators, 7 had a combination of negative and positive relationships with outcomes (time distribution, club activities, ad hoc value, club selection, student involvement, school collaboration, program staff qualities); 3 had only negative relationships (cycle plans [factor analysis], cycle plans [benchmark analysis], parent involvement); and 1 had no relationship (core elements). The overall quality indicator was generally unrelated to youth outcomes in elementary sites.

Of the 16 quality indicators included in the outcome analysis, 7 were favorably related to a subset of youth outcomes in middle school sites: cycle plans (factor analysis), cycle plans (benchmark analysis), student involvement, parent involvement, school collaboration, facilities access, and site coordinator qualities. That is, higher program quality in each of these areas was associated with more positive results on at least one of the desired youth outcomes. Of the remaining 9 indicators, 6 had a combination of negative and positive relationships with outcomes (core elements, time distribution, club contributors, ad hoc value, club selection, and program staff qualities), 2 had only negative relationships (club activities and student engagement), and 1 had no relationship (ad hoc assistance).

While no single quality indicator was favorably associated with all 13 middle school outcomes, five indicators were associated with at least half of the outcomes, which was not the case in elementary sites. Also unlike elementary sites, overall middle school program quality was related to three outcomes: increased CST ELA scores, decreased problem behaviors, and a more positive attitude toward school.
