


Theodore Lamb, of the Center for Research and Evaluation at Biological Sciences Curriculum Study, discusses retrospective pretests and their strengths and weaknesses.

Evaluators often use pretests and posttests in their evaluation designs, taking measures before and after an intervention to see what difference the intervention made. At times, however, evaluators can find such designs difficult to implement, especially in interventions such as professional development (PD) workshops. In such workshops it can be burdensome to administer instruments twice, or worse, the traditional pretest may not be effective if participants do not sufficiently understand, prior to the workshop, terms or concepts needed to answer pretest questions. Campbell and Stanley¹ offer a solution in the form of the retrospective pretest (RPT).

The RPT is administered at the same time as the posttest; that is, respondents are asked to answer questions about their level of understanding or skill after an intervention, such as a PD workshop, occurs. They are then asked to think back to their understanding prior to the intervention; this retrospective rating is the retrospective pretest. Selected strengths and weaknesses of the RPT method are discussed below (see also box).

Retrospective Pretest Strengths
Single administration. Workshop participants sometimes dislike completing evaluation forms before and again after a workshop. One clear advantage of the RPT is that evaluation forms are administered only once, at the end of the workshop. This single administration saves time and may be more satisfying to participants.

Avoiding the response-shift effect. The RPT can also help to avoid a response-shift effect,² which occurs when a respondent's frame of reference or evaluation standard changes significantly during an intervention. Consider, for example, a workshop for teachers focused on the topic of inquiry-based teaching. If workshop participants misunderstand basic terms or concepts associated with inquiry-based teaching, then results from traditional pretest questions may be misleading.

The RPT method avoids the response-shift effect by clearing up misconceptions before participants are asked to make assessments. Once a workshop is over and all concepts have been sufficiently explained, participants first assess their new level of understanding or skill, and, second, reflectively assess the level of understanding or skill they had prior to the workshop.
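In practice this two-rating format yields, for each participant, a "now" score and a retrospective "before" score collected in a single administration. The following is a minimal sketch, not drawn from the article, of how such responses might be scored; the 1-to-5 rating scale and all data are hypothetical.

```python
# Sketch of scoring a retrospective pretest (RPT) item.
# Hypothetical data: at the end of the workshop each participant rates
# one skill twice -- "before" (retrospective pretest) and "now"
# (posttest) -- on an assumed 1-to-5 scale.

def rpt_gains(responses):
    """Return per-participant gain scores (posttest minus retrospective pretest)."""
    return [now - before for before, now in responses]

# (before, now) pairs for five hypothetical participants
responses = [(2, 4), (1, 3), (3, 4), (2, 5), (3, 3)]

gains = rpt_gains(responses)
print(gains)                        # per-participant change: [2, 2, 1, 3, 0]
print(sum(gains) / len(gains))      # mean gain: 1.6
```

Because both ratings come from the same sitting, the gain score reflects the participant's post-workshop frame of reference throughout, which is the mechanism by which the RPT sidesteps response shift.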

Retrospective Pretest Weaknesses
Desire by participants to show a learning effect. A “good subject effect”³ occurs when subjects try to figure out what a researcher wants and then give responses that support it. The RPT may introduce a permutation of the good subject effect: the learning effect, or the tendency of workshop participants to report that learning took place, whether or not it did, in order to make the workshop deliverers look good.

Threats to validity. Self-report data and information recalled through reflection may suffer from insufficient recall and also invite fabricated or biased responses.

Experimental designs have a long and distinguished history in many disciplines, and they should be used when they can help to answer research or evaluation questions. However, if for some reason a traditional pretest and posttest cannot be used, then the RPT is better than either not evaluating at all or taking only posttest measures.

To explore possible differences between traditional pretests/posttests and the RPT, we collected data in several PD workshops using both types of administrations. Results showed little difference between the two approaches. We concluded therefore that the RPT is a good method to use if it is difficult or impossible to use traditional pretests. However, we recommend that the RPT be supplemented by additional data gathered to illustrate the effectiveness of workshops and other interventions.
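The article does not report how the two administrations were compared. One common check, sketched below with invented scores, is a paired t statistic on the same participants' traditional and retrospective pretest ratings; a value near zero is consistent with the "little difference" finding. All names and data here are illustrative assumptions.

```python
# Illustrative comparison of traditional vs. retrospective pretest scores
# from the same (hypothetical) participants. A paired t statistic near
# zero indicates little systematic difference between the two methods.

from math import sqrt
from statistics import mean, stdev

def paired_t(trad_pre, retro_pre):
    """Paired t statistic for traditional minus retrospective pretest scores."""
    diffs = [t - r for t, r in zip(trad_pre, retro_pre)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

traditional   = [2, 3, 2, 4, 3, 2, 3, 4]   # traditional pretest ratings
retrospective = [2, 3, 3, 4, 3, 2, 2, 4]   # same participants' RPT ratings

print(round(paired_t(traditional, retrospective), 2))  # 0.0 -> no mean difference
```

A full analysis would also report a p-value (e.g., from a t distribution with n − 1 degrees of freedom) and inspect whether any difference runs in the direction a response shift would predict.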

Strengths and Weaknesses of Retrospective Pretests

Strengths
• Less likely to offend participants who do not like being put in the role of students or research subjects required to complete both pretests and posttests

• Can be used when traditional pretests are not possible

• Unlike traditional pretests, does not risk negatively impacting intervention effectiveness by possibly introducing terms and concepts before participants are ready for them

• Provides data that, with other supporting data, can be used to evaluate the effectiveness of a professional development intervention


Weaknesses

• May introduce a desire by participants to show a learning effect

• Challenges traditional methodological logic, since both predata and postdata are collected after the intervention

• May introduce threats to validity such as recall error, history, and regression to the mean

• Possibility of fabricated and biased responses

• Can be perceived as less rigorous, and therefore less convincing, than other approaches

¹ Campbell, D. T., & Stanley, J. C. (1963). Experimental and quasi-experimental designs for research. Chicago: Rand McNally.
² Howard, G. S. (1980). Response-shift bias: A problem in evaluating interventions with pre/post self reports. Evaluation Review, 4(1), 93–106.
³ Orne, M. T. (1962). On the social psychology of the psychological experiment: With particular reference to demand characteristics and their implications. American Psychologist, 17(11), 776–783.

Theodore Lamb
Center for Research and Evaluation
Biological Sciences Curriculum Study
5415 Mark Dabling Blvd.
Colorado Springs, CO 80918
Tel: 719-531-5550


© 2016 Presidents and Fellows of Harvard College
Published by Harvard Family Research Project