




Claudia Weisburd and Rhe McLaughlin of Foundations, Inc., describe their Quality Assurance System for program improvement.

Since 1992, Foundations, Inc., has operated extended-day enrichment programs and provided technical assistance to schools, school districts, and other education and community organizations. Working primarily with children from low-income communities and those who serve them, Foundations seeks to improve program performance and enhance student achievement in school and during non-school hours.

In 1999, after a rapid expansion in our after school operations, we at Foundations found ourselves confronted with what is now a widespread challenge: maintaining program quality across states, sites, and highly varied on-the-ground conditions. Snapshot evaluations or one-time ratings were clearly insufficient. We needed a system that would establish the highest standards and at the same time help staff attain and maintain those standards through a process of tailored, targeted, and continuous improvement. After three years of piloting and revising, Foundations developed the Quality Assurance System (QAS) evaluation tool.

The QAS is an assessment tool that can be tailored to individual programs to build a comprehensive picture for planning program improvement. It is designed to show change over time, rather than merely where a program happens to be on assessment day. Using the QAS, programs begin with an initial assessment. Based on the results, assessors and program staff identify areas in need of improvement and develop specific improvement strategies. The program then conducts a follow-up assessment, paying particular attention to the elements targeted for improvement. The QAS can be used either for self-assessment or by outside evaluators. (Outside assessment offers the opportunity for program-specific expert help. The reports generated through outside assessment compare programs across sites, creating a picture of how well programs measure up against one another; these reports may also be particularly useful for formal reporting requirements.)

Offering Flexibility While Ensuring High Standards
The design of the QAS is based on our belief that if staff are to embrace assessment-based improvement plans, they must see the process as relevant, contextual, and potentially helpful. A hallmark of the QAS, therefore, is its flexibility. The tool starts “from where programs are,” assessing their missions, content, and resources. At the same time, it emphasizes best practices for after school programs. This combination leads to high-quality, realistic, and site-specific improvement planning.

The two-part program profile assesses the context of the program. Part I addresses items essential to all after school programs, including staffing, facilities, and health and safety. These “program-basics building blocks” are examined for all programs regardless of particular program content.

Part II narrows the lens to analyze “program-focus building blocks,” such as program mission, target population, and activities, with specific sections applicable to academic, recreation, and youth development programs. The assessor completes only those sections corresponding to a program’s specific goals. This program-focus section allows providers to tailor the assessment to the actual goals and activities of their programs, so that programs are not penalized for omitting content that was never part of their mission. For example, a recreation program will not receive a lower rating for scoring poorly on academic content if academics were not among its goals.

While the QAS deliberately accommodates a range of program types, it establishes clear, high standards through the close assessment of building blocks. Each building block is divided into composite elements that are evaluated on a scale of 1 to 4. Element scores are summed to provide a total score for each building block. Within the staffing building block, for example, the assessor looks at elements such as qualifications, attendance, and staff-child ratios. The level of detail provided by these elements and building blocks allows staff to easily identify both program strengths and weaknesses. This information can then guide targeted improvement planning.
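To make the roll-up concrete, the sketch below shows one way the scoring described above could be tallied; the building blocks, element names, scores, and the weak-element threshold are hypothetical illustrations, not the QAS instrument itself, which defines its own elements and rating criteria.

    # Minimal sketch of the scoring roll-up described above (hypothetical data).
    # Each element within a building block is rated on a scale of 1 to 4.
    initial_assessment = {
        "Staffing": {"qualifications": 3, "attendance": 2, "staff_child_ratio": 4},
        "Health and Safety": {"facilities": 3, "emergency_procedures": 2},
    }

    def building_block_totals(assessment):
        """Sum element ratings to get a total score for each building block."""
        return {block: sum(elements.values()) for block, elements in assessment.items()}

    def flag_weak_elements(assessment, threshold=2):
        """List elements rated at or below the threshold to guide improvement planning."""
        return [
            (block, element, score)
            for block, elements in assessment.items()
            for element, score in elements.items()
            if score <= threshold
        ]

    print(building_block_totals(initial_assessment))
    # {'Staffing': 9, 'Health and Safety': 5}
    print(flag_weak_elements(initial_assessment))
    # [('Staffing', 'attendance', 2), ('Health and Safety', 'emergency_procedures', 2)]

A follow-up assessment scored the same way would show, element by element, whether the targeted areas have improved since the initial assessment.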

Current and Future Progress
Since the debut of the QAS in April 2003, over 150 programs have used it, involving approximately 1,000 staff members in evaluation, reflection, and program improvement. With the launch of a web-based version of the tool (slated for spring 2004), sites will be able to collect data, analyze findings, plan change, and produce reports online. By blending meaningful assessment with continuous improvement, the QAS has encouraged and will continue to support the development and maintenance of high quality after school programs across diverse sites.

For more information about evaluation of the Foundations, Inc., after school programs, visit the HFRP Out-of-School Time Program Evaluation Database.

Claudia Weisburd, Ph.D.
Director of Special Projects
Email: cweisburd@foundationsinc.org

Rhe McLaughlin
Director of Evaluation
Email: rmclaughlin@foundationsinc.org

Moorestown West Corporate Center
2 Executive Drive, Ste. 1
Moorestown, NJ 08057-4245
Tel: 856-533-1600
Website: www.foundationsinc.org

 

