David Eddy Spicer, Roland Stark, and Martha Stone Wiske from WIDE World describe their process of measuring learning in online professional development.

An expanded, Web-only version of this article is also available.

How do you create an assessment system for professional development that is as useful to learners as it is to instructors and program managers, and that at the same time provides evidence of learning to those on the outside? Accountability standards assign assessment a prominent role as a gauge of educational effectiveness. Yet new views of teaching and learning make clear that if assessment is truly to support instruction, assessment strategies must be integral to students' learning activities. Online environments for professional learning offer both challenges and opportunities in striking this balance. One big challenge in online environments is getting a geographically dispersed instructional team to agree on what constitutes valid and fair assessment; one big opportunity is the chance to assess student work that is well documented and largely text based, with a trail of evidence from the beginning to the end of the learning process.

A team from the WIDE World online professional development program at the Harvard Graduate School of Education (wideworld.pz.harvard.edu) developed a three-step process that effectively negotiated this balancing act with educators enrolled in our online courses. First, we helped instructors sharpen course goals. Next, we developed performance assessments that included rubrics to guide progress toward those goals. Finally, we devised a testable system for applying the rubrics to score a key course assignment. Summarized in this way, this three-step process might appear to be a straightforward march. In practice, it is more of a tango, with bold moves forward toward greater definition of instructional intent, then back, as each successive step demands new, creative ways of maintaining balance and rhythm.

We chose three online courses with which to test this assessment process. All three drew on the Teaching for Understanding framework,¹ which emphasizes the use of well-defined goals and ongoing assessment in teacher professional development. The educators enrolled in these courses are supported by online teaching assistants or “coaches.” Participants work on creating lesson plan designs and sharing them via online tools; feedback from the coaches is supplemented by exercises to stimulate self- and peer reflection.

Our three-step assessment tango helped us move from an ad hoc approach in each course to a systematic one. This meant finding our collaborative rhythm in the first step, making bold moves forward to tighten important instructional links in the second step, and finally “going quantitative” to track outcomes in the third step.

First Step: Sharpening Course Goals and Key Performances
We structured initial meetings with instructors using a tuning protocol² designed by the Coalition of Essential Schools as a tool for sharing curricular work. This process succeeded in fostering open communication among all instructors. In these meetings, we also conducted an assessment inventory to examine how key assignments related to instructional goals and to understand existing coach, peer, and self-assessment activities.

Second Step: Tightening Links Among Goals, Performance, and Assessment
Effective assessment, whether online or face-to-face, gives learners clear criteria, frequent feedback, and opportunities for reflection. The three courses already had rubrics in place to accomplish these tasks for certain assignments. The assessment inventory from Step 1, along with subsequent discussions with coaches, helped to clarify the criteria in these rubrics. From there, we built these criteria into a more global rubric or reflection guide, which directed participants' work toward a cumulative course product.

Third Step: Going Quantitative
Our final step was to help instructors translate each reflection guide into a scoring guide for evaluating learners' final products on a numerical scale. Instructors initially hesitated, fearing that a reductionist emphasis on scoring would derail learning and raise concerns about fairness and efficiency. Other issues were particular to online environments, especially the difficulty of making quick changes to a course, such as tweaking the assessment process on the fly as one might during a face-to-face course. However, instructors soon saw that they could make the approach fit their own instructional intentions and preferences. Coaches' enthusiasm for finding constructive ways to make the rubrics work also boosted instructors' confidence in the process.
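To make this translation concrete, here is a minimal sketch in Python of how a reflection guide's criteria might become a numerical scoring guide. The criterion names, the 1-4 scale, and the simple averaging rule are illustrative assumptions, not the WIDE World team's actual instrument.

    # A hypothetical scoring guide: each criterion from a reflection guide
    # maps to a short descriptor and is rated on a 1-4 scale
    # (1 = beginning, 4 = mastery). Names and scale are illustrative only.
    SCORING_GUIDE = {
        "understanding_goals": "Design states clear, public goals",
        "performances_of_understanding": "Activities ask learners to apply ideas in new ways",
        "ongoing_assessment": "Design builds in criteria, feedback, and reflection",
    }

    def score_design(ratings):
        """Check a coach's ratings against the guide, then average them."""
        for criterion, rating in ratings.items():
            if criterion not in SCORING_GUIDE:
                raise ValueError("Unknown criterion: " + criterion)
            if not 1 <= rating <= 4:
                raise ValueError("Rating out of range for " + criterion)
        return sum(ratings.values()) / len(ratings)

    # One coach's ratings of a participant's final lesson design.
    print(score_design({
        "understanding_goals": 3,
        "performances_of_understanding": 4,
        "ongoing_assessment": 3,
    }))  # prints 3.3333...

A simple average is only one possible rule; weighting some criteria more heavily, or comparing two coaches' scores on the same design to check agreement, would fit the same structure.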

By the end of our first trial run, our team had developed a solid, systematic approach to participant assessment. Our three-step tango helped us sidestep the peril of dumbed-down measures that reduce learning to jumping through hoops. We are now well on the way to mastering the three steps that drive learning forward and provide evidence of the learning process.

¹ For an in-depth explanation of the Teaching for Understanding framework, see Wiske, S., & Spicer, D. E. (2004). WIDE: Using networked technologies to promote professional development. The Evaluation Exchange, 10(3), 10.
² A tuning protocol is a widely used series of steps for reflecting on teacher and/or student work. For additional details, please see Allen, D., & McDonald, J. (n.d.). The tuning protocol: A process for reflection on teacher and student work. Retrieved November 30, 2005, from http://www.essentialschools.org/cs/resources/view/ces_res/54

David Eddy Spicer
Research Manager
Tel: 617-384-9869
Email: eddyspda@gse.harvard.edu

Roland Stark
Researcher/Statistician
Tel: 617-384-7841
Email: roland_stark@gse.harvard.edu

Martha Stone Wiske
Co-Principal Investigator
Tel: 617-495-9268
Email: wiskema@gse.harvard.edu

WIDE World
Harvard Graduate School of Education
14 Story Street, 5th floor
Cambridge, MA 02138
Website: wideworld.pz.harvard.edu
