



Research Description

Overview and Components: The Cost of Quality Out-of-School Time Programs Study assesses characteristics of various types of out-of-school time (OST) programs, their requirements, and their associated costs. The study includes OST programs across six cities: Boston, Massachusetts; Charlotte, North Carolina; Chicago, Illinois; Denver, Colorado; New York, New York; and Seattle, Washington.
Start Date: 2006
Scope: national
Type: afterschool, weekend, summer/vacation
Location: urban
Setting: public school, community-based organization
Participants: elementary through high school students
Number of Sites/Grantees: 111 programs across 6 cities, including 70 school-year programs targeting elementary and middle school students (younger children), 41 school-year programs targeting middle and high school students (teens), 45 summer programs targeting younger children, and 26 summer programs targeting teens (categories are not mutually exclusive: some programs fall into multiple categories).
Number Served: Programs targeting younger children served an average of 107 children each day during the school year and 93 during the summer. Teen programs served an average of 70 teens each day during the school year and 55 during the summer.
Study Details: This study focuses on the costs associated with operating mature, high-capacity school-based and community organization-based OST programs.
Funding Level: Not available
Funding Sources: The Wallace Foundation
Other: The study included an online calculator for assessing the costs of a variety of options for high-quality OST programs: http://www.wallacefoundation.org/cost-of-quality/cost-calculator/Pages/cost-calculator.aspx.
Researchers: Public/Private Ventures and The Finance Project
Research Profiled:

The Cost of Quality Out-of-School Time Programs

Investments in Building Citywide Out-of-School Time Systems

Research Planned: None
Report Availability: Lind, C., Relave, N., Deich, S., Grossman, J., & Gersick, A. (2006). The costs of out-of-school-time programs: A review of the available evidence. Philadelphia & Washington, DC: Public/Private Ventures & The Finance Project. Available at: www.wallacefoundation.org/knowledge-center/after-school/quality-and-cost/Pages/Costs-of-Out-of-School-Time-Programs.aspx and at www.financeproject.org/publications/litreview.pdf

Grossman, J. B., Lind, C., Hayes, C., McMaken, J., & Gersick, A. (2009). The cost of quality out-of-school time programs. Philadelphia & Washington, DC: Public/Private Ventures & The Finance Project. Available at: www.wallacefoundation.org/KnowledgeCenter/KnowledgeTopics/CurrentAreasofFocus/Out-Of-SchoolLearning/Pages/The-Cost-of-Quality-Out-of-School-Time-Programs.aspx and at www.financeproject.org/publications/CostofQualityOSTPrograms.pdf

Hayes, C., Lind, C., Grossman, J. B., Stewart, N., Deich, S., Gersick, A., McMaken, J., & Campbell, M. (2009). Investments in building citywide out-of-school-time systems: A six-city study. Philadelphia & Washington, DC: Public/Private Ventures & The Finance Project. Available at: www.wallacefoundation.org/knowledge-center/after-school/coordinating-after-school-resources/Pages/Investments-in-Building-Citywide-Out-of-School-Time-Six-City-Study.aspx and at www.financeproject.org/publications/InvestmentsInBuildingOSTSystems.pdf


Contacts

Research

Jean B. Grossman
Senior Research Fellow
Public/Private Ventures
2000 Market St.
Suite 550
Philadelphia, PA 19103
Tel: 215-557-4400
Fax: 215-557-4469
Email: jgrossma@princeton.edu

Cheryl D. Hayes
President & CEO
The Finance Project
1401 New York Avenue, NW
Suite 800
Washington, DC  20005
Tel: 202-628-4200
Fax: 202-628-4205
Email: chayes@financeproject.org

Profile Updated: April 4, 2012

Research Study 1: The Cost of Quality Out-of-School Time Programs



Research Description

Research Purpose: To answer two questions:
  • What do quality OST programs cost?
  • How do costs vary in different types of OST programs?
Research Design

Non-Experimental: The study team solicited recommendations of highly regarded OST programs from key informants across the six study cities. This request yielded an initial pool of over 600 programs. The programs were categorized into 36 program types according to a typology of relevant program characteristics:

  • Age group of participants (i.e., elementary and middle school students [younger children] and middle and high school students [teens])
  • Location (i.e., school-based and community-based sites)
  • Operator (i.e., school-based and school-run; school-based and run by a community-based organization [CBO]; or community-based and CBO-run)
  • Program content (i.e., academic focus, nonacademic focus, and multiple focus)
  • Schedule of operation (i.e., school year only and full year including summer)

The goal was to have a relatively even distribution of programs in each city that would reflect the full range of relevant OST characteristics.
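
The count of 36 program types follows from crossing the typology dimensions. A minimal Python sketch of that cross is shown below; it assumes the location dimension is subsumed by the three operator categories (which already specify school-based versus community-based sites), an inference from the count of 36 rather than a detail stated in the study:

  # Illustrative sketch (not the study team's procedure): enumerating the
  # program typology by crossing its dimensions. Folding location into the
  # operator categories is an assumption.
  from itertools import product

  age_groups = ["younger children", "teens"]
  operators = ["school-based, school-run",
               "school-based, CBO-run",
               "community-based, CBO-run"]
  content = ["academic focus", "nonacademic focus", "multiple focus"]
  schedule = ["school year only", "full year including summer"]

  program_types = list(product(age_groups, operators, content, schedule))
  print(len(program_types))  # 2 * 3 * 3 * 2 = 36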

The pool was narrowed to ensure a sample of quality OST programs by stipulating that programs meet three criteria:

  • Staff/youth ratios could not exceed 1:20 for younger children or 1:25 for teens.
  • Programs for younger children were required to have a minimum of three quarters of their participants attend the activities, while programs serving teens could operate on a drop-in basis.
  • Programs were required to have been in operation for at least 2 years.

Then, within each city, programs were randomly selected from each of the 36 types to be included in the study. Interviews were then conducted with executive directors of selected programs to confirm program characteristics.

Surveys on program costs were sent to executive directors or designated staff once programs passed the screening criteria. Of the 494 programs contacted, 215 met the three quality programming criteria and 111 completed the survey. Follow-up interviews were then conducted with key program staff (usually the executive director and/or financial manager) to confirm information provided on the surveys. This information was compared to data in program budgets and annual reports.

Data Collection Methods

Document Review: Program budgets, annual reports, and documentation on the valuation of in-kind contributions were collected from programs.

Interviews/Focus Groups: Initial program director interviews served to confirm program characteristics, assess selection criteria, and collect information about program quality attributes (e.g., a clear organizational mission; small group sizes; adequate space and materials; formal orientation, staff training, and performance reviews; regular staff meetings; and formal feedback from participating youth and parents). Follow-up interviews with key staff included questions to verify cost data, probe for hidden costs (especially those related to in-kind contributions), and double-check staff salaries and hours.

Surveys/Questionnaires: Surveys examined program costs across seven areas: staff salaries, benefits, space and utilities, administration, transportation, student stipends, and other costs (e.g., meals, staff training).
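
To illustrate how the seven surveyed cost areas could roll up into a full annual cost and a per-slot figure, here is a minimal sketch with entirely hypothetical dollar amounts and a hypothetical slot count; it does not reproduce the study's actual cost accounting (which also valued in-kind contributions):

  # Hypothetical sketch: summing the seven surveyed cost areas into a full
  # annual cost and dividing by daily slots. All figures are invented for
  # illustration; this is not the study's accounting procedure.
  cost_areas = {
      "staff salaries": 180_000,
      "benefits": 36_000,
      "space and utilities": 40_000,
      "administration": 25_000,
      "transportation": 10_000,
      "student stipends": 0,  # teen programs only
      "other (meals, staff training, etc.)": 15_000,
  }

  full_cost = sum(cost_areas.values())
  daily_slots = 100  # hypothetical daily capacity
  cost_per_slot = full_cost / daily_slots
  print(f"Full cost: ${full_cost:,}; cost per slot: ${cost_per_slot:,.0f}")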

Data Collection Timeframe: Data were collected in 2006 and 2007.


Findings:
Formative/Process Findings

Activity Implementation

Sixty percent of programs focused on both academic and nonacademic activities, another 20% focused exclusively on academic activities, and the remainder had a nonacademic focus (e.g., drama, arts, music, sports, technology, leadership development, life skills).

Nearly two thirds (64%) of programs operated on a year-round basis (i.e., the school year and summer). The others operated during the school year only.

On average, programs serving younger children ran for 3.7 hours per day during the school year and 8.7 hours per day in the summer. Teen programs ran for 3.8 hours per day during the school year and 6.4 hours per day in the summer.

On average, programs serving younger children ran for 181 days during the school year and 44 days in the summer. Teen programs ran for 150 days during the school year and 35 days in the summer.

Among teen programs, 74% provided leadership opportunities to teens, ranging from holding volunteer or paid staff positions to leading activities or teams of their peers.

Costs/Revenues

Out-of-pocket expenses comprised over 80% of total costs for all programs. In-kind contributions made up the rest of the costs.

The average annual full cost per slot was $4,320 during the school year and $1,330 in the summer for programs serving younger children, and $4,580 during the school year and $1,420 in the summer for teen programs.

Because programs typically enrolled more children than the number of daily slots (since not all children attend every day), the average cost per enrollee was lower than the average cost per slot. For programs serving younger children, the cost per enrollee was 61% of the per-slot cost during the school year, and 75% of the per-slot cost in the summer. For teen programs, cost per enrollee was 41% of the per-slot cost for the school year and 56% of the per-slot cost in the summer.
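
The two measures differ only in the denominator (slots filled on an average day versus total enrollment). The back-of-the-envelope Python check below uses averages reported elsewhere in this profile; because the published percentages were computed program by program, this aggregate calculation only approximates them:

  # Per-slot vs. per-enrollee costs for school-year programs serving younger
  # children, using averages reported in this profile. The aggregate ratio
  # only approximates the published program-by-program figure (61%).
  full_cost_per_slot = 4_320   # average annual full cost per slot (school year)
  avg_daily_attendance = 107   # average children served per day (slots filled)
  avg_enrollment = 193         # average school-year enrollment

  total_cost = full_cost_per_slot * avg_daily_attendance
  cost_per_enrollee = total_cost / avg_enrollment
  print(round(cost_per_enrollee))                           # ~2,395
  print(round(cost_per_enrollee / full_cost_per_slot, 2))   # ~0.55 vs. reported 0.61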

Because they operated for more hours each day, summer programs were more costly than school-year programs on a daily basis and less costly on an hourly basis. For programs serving younger children, daily costs were $24 for the school year vs. $32 for the summer, while hourly costs were $7.40 for the school year vs. $4.10 for the summer. For teen programs, daily costs were $33 for the school year vs. $44 for the summer, while hourly costs were $10.30 for the school year vs. $8.40 for the summer.
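
Daily and hourly per-slot costs are linked through the average hours of operation per day reported above. A rough consistency check for programs serving younger children, again approximate because ratios of averages differ from averages of per-program ratios:

  # Rough check: hourly cost ~= daily cost / hours of operation per day,
  # for programs serving younger children. Results only approximate the
  # published hourly figures ($7.40 school year, $4.10 summer).
  periods = {
      "school year": {"daily_cost": 24, "hours_per_day": 3.7},
      "summer": {"daily_cost": 32, "hours_per_day": 8.7},
  }
  for label, p in periods.items():
      print(label, round(p["daily_cost"] / p["hours_per_day"], 2))
  # school year ~6.49, summer ~3.68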

Staff salaries and benefits were the primary cost driver for programs, accounting for about two thirds of total operating costs. Space/utilities and “other” program costs made up about another quarter of costs on average, with the remaining costs consisting of administrative, transportation, and student stipend (for teen programs only) costs.

For programs serving younger children, on average, the school-year portion of year-round programs was less costly per hour per slot than school-year-only programs ($6.20 vs. $9.60). However, because year-round programs typically operated for more hours per day, the daily costs of the two program types were the same. For teen programs, the school-year portion of year-round programs was slightly cheaper than school-year-only programs, both per hour per slot ($9.50 vs. $10.80) and per day ($29 vs. $36).

Programs typically relied on three to five funding sources to cover their full costs. Those serving younger children were supported by three to four funding sources, with 73% receiving resources from four or fewer sources. Teen programs relied on an average of four funding sources, with 71% using five or fewer sources. Approximately one third of total program resources came from public sources (federal, state, or local). In-kind contributions (e.g., volunteers and donated space) represented approximately 20% of resources, and private sources (e.g., foundation grants, corporate donations, individual donations, United Way contributions, loans, contributions from civic organizations and churches, earned income, fundraising income) and parent fees supplied the remainder (approximately 50%).

For programs serving younger children, the majority received in-kind contributions (87%), public dollars (80%), and foundation grants (51%). Programs in all cities except Charlotte used parent fees to cover some of their costs. In Seattle, parent fees, which were publicly subsidized by child care vouchers, made up 69% of funds, compared with 25% in the other cities. When a program received public funds, these funds constituted approximately half of the program’s resources. Most programs used both in-kind contributions and public funding.

For teen programs, in-kind contributions were the most common source of funds and covered 22% of program needs. Public funds, the second most common source, covered on average half of program needs. For teen programs that received foundation or United Way grants, those funds covered a larger fraction of program needs than they did for programs serving younger children.

Costs varied by city beyond differences in the cost of living. For school-year programs serving younger children, Boston, Charlotte, New York City, and Seattle had similar average hourly costs per slot, ranging from $4 to $6, while hourly per-slot costs were $8 in Chicago and $5–$15 in Denver. The average hourly per-slot costs of school-year teen programs in Boston, Chicago, and New York City were $6–$9, while in Denver and Seattle, where programs spent more on staff than programs in the other three cities, costs were $16 and $18 per hour per slot, respectively.

For school-year programs for younger children, academic programs had the highest hourly cost per slot ($12.50), followed by nonacademic programs ($9), and multiple-focus programs ($5.70). The higher hourly cost for academic programs was driven primarily by additional staff and material costs. On a daily per-slot basis, multiple-focus programs were still the least costly ($22), but nonacademic programs were more costly ($31) than academic programs ($27), which operated fewer hours per day than other programs.

For summer programs serving younger children, on an hourly basis, the three program types were similar. However, daily costs per slot differed by program focus: multiple-focus programs cost $34, academic programs cost $30, and nonacademic programs cost $26. These variations were mostly due to differences in the number of hours per day that programs operated. In addition, academic programs leveraged more in-kind contributions than the other two types of programs. Median daily costs were more similar across categories, ranging from $25 for nonacademic programs to $29 for multiple-focus programs.

For teen school-year programs, multiple-focus and academic programs had similar costs ($28 per day or $7.70 per hour versus $28 per day or $9.70 per hour, respectively). Single-focus nonacademic teen programs were more costly ($44 per day or $14.60 per hour) than the other two types of programs due to higher staff and material costs. (Data were not available for summer programs serving teens).

For programs serving younger children, full costs varied by location and type of organization. During the school year, school-based, school-run programs were the least resource intensive, costing $16 a day. Compared to other program types, these programs operated with fewer staff per youth, employed fewer teachers or certified staff, were more likely to be multiple-focus, and received more donated administrative services (copying, office materials, etc.). School-based programs operated by CBOs used more resources, with per-slot costs of $21 per day or $7 per hour, while CBO-run programs in community facilities cost $30 per day or $8 per hour. During the summer, hourly costs were $3–$4 for all three program types. Daily summer costs were $32 for CBO-operated programs in both locations and $28 for school-based, school-run programs. Staff in community-based, CBO-run programs leveraged the most in-kind contributions in the summer.

For school-year teen programs (data were not available for teen summer programs), the costs of school-based and community-based programs were nearly the same on a daily basis ($33 and $34 per day) and on an hourly basis ($10.90 and $9.90 per hour).

As programs increased enrollment, their daily costs declined until they reached a critical threshold where they needed to hire additional core staff, causing daily costs to increase. For school-year programs serving younger children, this critical point was 100 slots, while for summer programs serving younger children and school-year teen programs, the critical threshold was closer to 150 slots (data were not available for summer programs for teens).
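
The decline-then-jump pattern described here is the familiar shape of a step cost: per-slot cost falls as fixed and staffing costs are spread over more slots, then rises when one more participant forces the hiring of another core staff member. The toy Python model below uses entirely hypothetical fixed costs, staff costs, and group sizes to illustrate the shape; it is not a model published by the study:

  # Toy step-cost model (hypothetical numbers throughout) showing why daily
  # cost per slot can decline with enrollment and then jump when another
  # core staff member must be hired.
  import math

  FIXED_DAILY_COST = 400   # space, administration, etc. (hypothetical)
  STAFF_DAILY_COST = 150   # per core staff member per day (hypothetical)
  YOUTH_PER_STAFF = 20     # maximum group size (hypothetical)

  def daily_cost_per_slot(slots: int) -> float:
      staff_needed = math.ceil(slots / YOUTH_PER_STAFF)
      return (FIXED_DAILY_COST + staff_needed * STAFF_DAILY_COST) / slots

  for n in (40, 80, 100, 101, 140):
      print(n, round(daily_cost_per_slot(n), 2))
  # Per-slot cost falls toward 100 slots, ticks up at 101 when a sixth staff
  # member is needed, then resumes falling as enrollment grows.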

Programs serving multiple age groups were more expensive than those serving a single age group. During the school year, programs serving only elementary school students cost approximately $21 per day, while programs serving elementary and middle school students cost $24 per day. In addition, it cost $31 per day to serve only high school students and $32 per day to serve only middle school students, but it cost $37 per day to serve both age groups. Serving all three age groups cost $35 per day. In the summer, the average daily slot cost was $29 for programs serving elementary school children; $31 for programs serving elementary and middle school children; and $35 for those serving all three age groups (data were not available for summer programs serving middle and high school youth). Programs serving multiple age groups had relatively lower out-of-pocket expenditures in the summer than during the school year because staff were more successful at finding in-kind contributions.

On average, management staff (e.g., executive/associate director, site coordinators) had the highest hourly wages, ranging from a high of $25.71 for school-year programs serving younger children to a low of $23.21 for teen school-year programs. Activity leaders’ average hourly wages ranged from a high of $18.48 for school-year teen programs to a low of $12.75 for summer programs serving younger children. Administrative/support staff hourly wages ranged from a high of $18.39 for summer teen programs to a low of $14.36 for summer programs serving younger children.

Program Context/Infrastructure

Ninety percent of programs were run by CBOs, whether they were located in a school or community facility. These CBOs encompassed a wide range of private and nonprofit entities, including Y’s, Boys & Girls Clubs, parks and recreation centers, childcare centers, and faith-based organizations. Only 8 of the 111 programs, however, were part of national organizations, such as the YWCA or Boys & Girls Clubs of America.

One quarter of the programs had been in operation for less than 5 years at the time of selection; however, 44% had existed for more than 10 years.

Recruitment/Participation

Approximately 63% of programs served primarily younger children, while 37% served teens.

Programs ranged in total enrollment size from 15–1,800 participants. Although the majority of programs (52%) served fewer than 100 participants, a substantial portion (33%) served more than 200 participants. In terms of slots, programs had the capacity to serve as few as 12 to as many as 1,350 youth per day.

Programs serving younger children enrolled an average of 193 children during the school year and 128 during the summer. Teen programs enrolled an average of 297 teens during the school year and 282 teens during the summer.

Programs serving younger children served an average of 107 children each day during the school year and 93 during the summer. Teen programs served 70 youth each day during the school year and 55 during the summer.

On average, 79% of participants in programs serving younger children attended all the time. Among teen programs, 64% of participants attended all or most of the time.

Staffing/Training

At programs serving younger children, 67% of staff who led activities had a 2- or 4-year college degree, and 24% were teachers or certified specialists. At teen programs, 84% had a 2- or 4-year college degree, and 31% were teachers or certified specialists.

Three quarters or more of programs had staff training requirements, with staff receiving approximately 30 hours of training per year.

During the school year, staff/youth ratios averaged 1:8.3 for programs serving younger children and 1:9.3 for teen programs.

In terms of full-time-equivalent staff, on average, programs serving younger children had 7.6 paid staff and 0.6 volunteers during the school year, and 12.3 paid staff and 1.2 volunteers in the summer. Teen programs had an average of 5.5 paid staff and 0.6 volunteers during the school year, and 7 paid staff and 0.5 volunteers in the summer.

The majority of programs’ staff salaries went toward activity leaders (53–65% of total salaries across programs), followed by management (e.g., executive/associate director, site coordinator; 29–39% of total salaries across programs), with the smallest percentage going toward administrative/support staff (6–8% of total salaries across programs).



Research Study 2: Investments in Building Citywide Out-of-School-Time Systems: A Six-City Study



Research Description

Research Purpose: To explore strategies and activities pursued in building citywide OST systems, monetary and in-kind investments associated with these efforts, variations in investments from city to city, and options for financing these efforts.

Research Design

Non-Experimental: Researchers made site visits to each of the six cities to gather data on system investments in four major areas (providing community leadership and vision, improving program quality, expanding access and participation in quality programs, and financing and sustaining quality programs). These four areas were identified from the Finance Project and Public/Private Ventures’ prior research on afterschool systems. In planning for the site visits, the researchers worked closely with key informants, which included representatives from a variety of afterschool organizations and government agencies, to help identify the relevant system components in each city and the individuals and organizations most involved with and knowledgeable about these components. While on site, the researchers interviewed informants about funding and collected funding/budget documents. Follow-up phone interviews were also conducted with informants.
Data Collection Methods

Document Review: Program budgets, annual reports, and documentation on the valuation of in-kind contributions were collected from programs.

Interviews/Focus Groups: On-site interviews served to assess system-building activities, strategies, investments, and financing. Follow-up phone interviews served to verify data, probe for hidden costs (especially those related to in-kind contributions), and gather additional information as needed.

Data Collection Timeframe: Data were collected between October and December of 2007.


Findings:
Formative/Process Findings

Costs/Revenues

The cities spent an average of 14% of their OST system-building investments on leadership, ranging from 1%–35% per city. The size of these investments seemed to be most closely related to the availability of funding from a third-party source, with larger percentages devoted to leadership investments in cities that had funding from third-party sources.

On average, the cities devoted 43% of their OST system-building investments to improving program quality, ranging from 4%–69% per city. These investments consisted primarily of cash contributions and a small amount of in-kind donations. Out-of-pocket contributions largely covered staff members, consultants, facilities and equipment, materials development, and other direct costs.

Investments in expanding access to and participation in OST programs averaged 38% of cities’ OST system-building investments, ranging from 3%–71% per city. An important determinant of the scale of access and participation investments seemed to be whether funding was allocated specifically for these investments—cities with more dedicated funding for this purpose had higher funding levels in this area.

Systemic Infrastructure

Researchers identified five general strategies for nurturing OST system leadership:

  1. Leadership of mayors to focus attention on OST programs; bring people together; mobilize resources; and develop mechanisms for providing guidance, management, and support.
  2. Citywide governing bodies to lead, advise, and monitor system-building efforts.
  3. OST intermediaries to foster collaboration and coordination among stakeholders and mobilize resources.
  4. Partnerships and collaborations to pool knowledge and resources.
  5. Business planning to identify needs, priorities, strategies, and activities to be pursued.

Leadership investments were primarily directed toward governing bodies in Chicago and New York City, toward OST intermediaries in Charlotte and Seattle, and toward partnerships and collaborative relationships in Boston and Denver.

Investments in leadership included the costs of leaders’ time incurred to participate in governing bodies, intermediaries, and other collaborative relationships. These costs varied depending on the participants and their role in the decision-making process.

Leadership activities overseen by intermediary organizations were supported to a greater extent with dedicated staff time, while leadership activities that occurred in cities without intermediaries were supported more by time volunteered by leaders from several organizations. Investments in OST intermediaries were often provided by private foundations and public agencies interested in advancing system-building efforts.

Investments in quality generally took one of four forms:

  1. Staff technical assistance and professional development
  2. Alignment of programming with school district curricula
  3. Adherence to quality standards to evaluate and assess effectiveness
  4. Data management systems to compile and organize information on programs and indicators of effectiveness

Investments in quality improvements primarily consisted of ongoing training and technical assistance expenditures, which were typically funded with a blend of public and private resources. Cities offering staff credentialing or degree programs through existing training organizations had much larger professional development investments than cities that relied on consultants and outside trainers for professional development. Quality standards and data management systems also accounted for a substantial percentage of total investments in Denver and Chicago.

Several cities were involved in developing program quality standards. Local leaders used lessons they learned from state initiatives and national professional organizations to help shape the design and implementation of quality-rating tools. Developing these tools did not appear to require substantial monetary investments, but researchers noted that they had little information on what was required to successfully implement quality standards.

Three cities launched large-scale data management systems to collect participation and outcome data across local OST programs. Local leaders reported that foundation support was critical to covering the software development, training, system maintenance, and user support necessary to get these systems up and running.

Investments in expanding access and participation were largely one-time allocations for start-up and development, which generally took one of five forms:

  1. Resource and referral systems
  2. Market research to understand local needs and preferences
  3. Outreach
  4. Pilots and program innovations designed to attract and better serve diverse youth
  5. Building facilities and securing rent-free space throughout the city

Four of the six cities heavily concentrated their expanding access and participation investments in particular strategies: Denver spent 100% of its investments on resource and referral systems, Boston and Charlotte spent 97%–100% on pilots and program innovations, and Seattle spent 100% on building facilities. By contrast, in New York City and Chicago, these investments were spread across multiple strategies.

Expanding access and participation investments covered staff salaries and benefits, consultants, contracted services provided by communications and market research firms, printing, and administrative resources related to maintaining data systems and conducting data analysis. Researchers found little evidence of in-kind contributions in this area.

