
Jennifer Buher-Kane, Nancy Peter, and Susan Kinnevy of the Center for Research on Youth and Social Policy at the University of Pennsylvania share their experience of creating a tool kit designed specifically for those who provide professional development to out-of-school time program staff.

The need for effective professional development for out-of-school time (OST) staff is well documented.1 Researchers and practitioners have shown increasing interest in finding and using methods that effectively evaluate professional development for this population. In the summer of 2004, we at the Out-of-School Time Resource Center (OSTRC), housed within the Center for Research on Youth and Social Policy (CRYSP) at the University of Pennsylvania, attempted to locate research-based survey instruments used to evaluate OST professional development.

After an extensive literature review and conversations with key stakeholders, we determined that no such instruments existed. Seeing a need for these tools, we implemented a mixed-method pilot study to design and test survey instruments that could be used in OST workshop and conference settings.

As part of the planning process, we at the OSTRC reviewed literature on effective implementation and evaluation of professional development, including models of evaluation from the professional development researchers Guskey, Killion, Kirkpatrick, and others.2 These models have common elements; in particular, each defines various “levels” of evaluating professional development, such as participants' satisfaction, learning, application, and results.3 To adapt these education-based models to OST, we added another evaluation level: extension. Extension refers to adapting knowledge to suit a particular program and/or sharing this knowledge with others such as OST staff, programs, or youth.4

Using the theoretical frameworks described in the literature, we next developed instruments to measure knowledge, skills, and attitudes, as well as intended versus actual application, at several points in time during the professional development process:

  • Preworkshop surveys primarily gather baseline data regarding knowledge, skills, and attitudes.
  • Postworkshop surveys gather the following information: comparative data regarding knowledge, skills, and attitudes; reactions to workshops and presenters; and baseline data regarding intended application.
  • Follow-up workshop surveys gather comparative data regarding actual application, benefit to students, and organizational support.
  • Presenter self-assessments gather information regarding reactions to workshops that can be compared with participant responses.
  • Overall conference evaluations gather information regarding reactions to conference components.

The preworkshop and postworkshop surveys are administered immediately before and after the workshops. The follow-up workshop survey is administered one month after the workshops; the inclusion of this longer-term follow-up distinguishes the OSTRC design from those used in formal and early childhood education.

We tested the new surveys at several conferences. The first pilot test was conducted in November 2004 with 339 staff at one OST conference in Philadelphia and yielded 1,174 surveys. After the conference, we conducted a series of five focus groups with 50 OST staff in Philadelphia to determine how the surveys could be revised. We analyzed the qualitative data and incorporated the results and feedback into the survey questions.

A second pilot test was conducted in April and May of 2005 at two conferences. One conference hosted OST staff from across Pennsylvania, while the other hosted OST staff from multiple states in the mid-Atlantic region. Together, these two conferences included 740 staff and yielded 3,540 surveys. OSTRC and CRYSP staff performed a comprehensive data analysis, after which the surveys were further revised and sent to a survey design expert at the University of Pennsylvania for review. The results of this review are still pending and will be used to make further revisions if necessary.

We at the OSTRC plan to publish these survey instruments as part of an evaluation tool kit that will be available, free of charge, for those who design or provide professional development to OST staff. We will share the data collected through these surveys and maintain a data storehouse that will track their use over time on a national level. We are also conducting further research on evaluating alternative forms of professional development and accurately measuring changes in OST student outcomes that result from professional development.

1 Halpern, R. (1999). After-school programs for low-income children: Promise and challenges. The Future of Children, 9, 81–93; Lauver, S. (2004). Attracting and sustaining youth participation in after school programs. The Evaluation Exchange, 10(1), 4–5, 31; Shortt, J. (2002). Out-of-school time programs: At a critical juncture. New Directions for Youth Development, 2002(94), 119–123.
2 Gardner, H. (1983). Frames of mind: The theory of multiple intelligences. New York: Basic Books; Guskey, T. (2000). Evaluating professional development. Thousand Oaks, CA: Corwin Press; Killion, J. (2002). Assessing impact: Evaluating staff development. Oxford, OH: National Staff Development Council; Kirkpatrick, D. L. (1994). Evaluating training programs: The four levels. San Francisco, CA: Berrett-Koehler.
3 To read more about the levels of professional development evaluation, see Questions and Answers with Thomas Guskey.
4 Peter, N. (2004). Out-of-school time (OST) professional development workshops: An evaluation framework. Retrieved December 6, 2005, from http://www.sp2.upenn.edu/ostrc/pdf/OSTWorkshopEvaluation.pdf

Jennifer Buher-Kane
Senior Research Coordinator
Out-of-School Time Resource Center
Email: jbuher@sp2.upenn.edu

Nancy Peter, M.Ed.
Director
Out-of-School Time Resource Center
Email: npeter@sp2.upenn.edu

Susan Kinnevy, Ph.D.
Research Director
Center for Research on Youth and Social Policy
Email: kinnevy@sp2.upenn.edu

Center for Research on Youth and Social Policy
University of Pennsylvania
3815 Walnut Street, 3rd floor
Philadelphia, PA 19104
Tel: 215-898-2505
Fax: 215-573-2791
Website: www.sp2.upenn.edu/ostrc
