Volume XI, Number 4, Winter 2005/2006
Issue Topic: Professional Development
Promising Practices
Ila Deshmukh Towery and Rachel Oliveri offer lessons for engaging teacher and student stakeholders in the evaluation of a professional development program.
The Gender Equity Model Sites initiative (GEMS) is a pilot application of the Seeking Educational Equity and Diversity model (SEED) in a Boston-area middle school and high school. GEMS is a peer-led professional development program that aims to facilitate school-wide change by fostering in teachers a greater awareness of how gender, race/ethnicity, and class impact their teaching practices and their understanding of and interactions with students. The initiative seeks to encourage self-reflection and raise consciousness around issues of inequity in schools in order to create more equitable and safe school environments.
Our goals in evaluating GEMS are to document program implementation, identify any changes in teaching practices and teacher awareness around issues of equity and diversity, and provide feedback to the schools for program improvement. Our evaluation employs a mixed-method approach through student and teacher surveys, student research groups, semistructured teacher interviews, and student focus groups; we use the latter three methods to actively engage stakeholder voices. The first 2 years of our 3-year evaluation of GEMS offer the following lessons for engaging stakeholders, including youth, in professional development and its evaluation.
1. A conceptual model for evaluation helps prioritize stakeholder engagement. The evaluation of the GEMS initiative uses a conceptual model in which integrating the voices of stakeholders is critical to understanding program impact. The model, Jacobs' five-tiered approach to evaluation, organizes evaluation activities at five levels, moving from generating descriptive and process-oriented information to determining program effects and outcomes.¹ Our evaluation emphasizes understanding program implementation and participants' experiences of the program. Stakeholder voices matter for both of these goals. Early in our evaluation, we identified the program's key stakeholders as both teachers and students.
2. Program buy-in facilitates stakeholder engagement in evaluation. Thus far, teacher and student buy-in to the GEMS initiative seems to facilitate their buy-in to our evaluation. For example, when we conducted semistructured, open-ended interviews with program participants to better understand teachers' experiences of GEMS, their enthusiasm for GEMS was apparent in their eagerness to share their personal experiences with us. It may also be that teachers' commitment to the program's ideals of social justice has enabled them to become more deeply engaged in the learning opportunities created by the evaluation.
3. Stakeholders can be instrumental in engaging other groups. We partnered with teachers to conduct student focus groups that grouped students according to their racial, ethnic, sexual, or gender identities. These groupings were meant to provide a safe forum for frank expression of students' personal experiences at school with regard to their identities. Teachers were eager to hear about students' experiences and to share students' concerns with their colleagues as part of their work toward school equity, and they played an active role in recruiting students and leading the focus group discussions. The combination of teacher involvement, stipends for student participation, incentives for teacher participation, and students' interest in sharing their personal experiences contributed to both teachers' and students' successful engagement in these groups and gave us valuable insights into their school experiences.
4. Direct program involvement and depth of contact may matter for engaging youth stakeholders. In an attempt to further include student voices in our evaluation, we designed student research groups in which students were to assess their school climates through photo documentation. It was our intention to develop a researcher–student collaboration in which we, as researchers, would offer students the opportunity to gain research skills, while the students provided us with their perspectives on school climate.
Attaining student buy-in to these groups proved challenging, however, because most students were not directly involved in any GEMS activities and thus were unaware of the program and how it pertained to them. Moreover, the only way the school could accommodate these groups was through a two-part guest visit during a single teacher's class period. As a result, we did not have the opportunity to develop relationships with the students, and our inability to train students in research methods in such a short time weakened students' contributions to the evaluation data and analysis. However, as noted earlier, teacher recruitment and the use of incentives helped engage students in some of our other evaluation activities.
As evaluators, we are discovering that it is important that both program and evaluation design take into account the varying levels of stakeholder involvement in order to create successful researcher–stakeholder collaborations.
¹ Jacobs, F. H. (1988). The five-tiered approach to evaluation: Context and implementation. In H. B. Weiss & F. H. Jacobs (Eds.), Evaluating family programs (pp. 37-68). Hawthorne, NY: Aldine de Gruyter; Jacobs, F. H., Kapuscik, J., Williams, P., & Kates, E. (2000). Making it count: Evaluating family preservation services. Medford, MA: Family Preservation Evaluation Project.
Ila Deshmukh Towery
Ph.D. candidate
Eliot-Pearson Department of Child Development
Tufts University
177 College Avenue, Room 101
Medford, MA 02155
Email: ila.deshmukh@tufts.edu
Rachel Oliveri, M.A.
Evaluation Co-Coordinator
Eliot-Pearson Department of Child Development
Tufts University
177 College Avenue, Room 101
Medford, MA 02155
Email: rachel.oliveri@tufts.edu