




Jennifer Greene, Associate Professor at Cornell University, discusses a framework for planning and implementing mixed-method evaluations.

Evaluation theory and practice today are characteristically pluralistic, embracing diverse perspectives, methods, data, and values within and across studies, toward the generation of more insightful and meaningful evaluative claims. Yet traditionally, mixed-method work has concentrated on the technical level, focusing on combining qualitative and quantitative methods within one evaluation study. This article strengthens the promise of pluralistic evaluation by advancing an enhanced conceptual framework, extending beyond the technical level, for planning and implementing mixed-method evaluation studies.

When planning an effective and defensible mixed-method evaluation design, one must consider three levels: the political level, which includes the purposes of evaluation; the paradigm level, which incorporates assumptions about the social world and our ability to know it; and the technical level, which represents discrete methods and procedures for gathering and analyzing data. It is at the paradigm level that the greatest controversy about mixing methods arises. Paradigms serve as the philosophical “anchor” of social inquiry, providing the framework within which are grounded assumptions about knowledge, our social world, and our role as evaluators in it. These include, for example, assumptions regarding the objectivity or subjectivity of knowledge claims and the realism or relativism of our social worlds.

The current mixed-method conversation consists primarily of three stances on the sensibility of mixing paradigms while mixing methods in evaluative inquiry: (1) the purist stance, whose adherents argue against the sensibility of mixing paradigms; (2) the pragmatic stance, in which paradigms are viewed as useful conceptual constructions but of little value in guiding practice, and in which methodological decisions should be made to maximize contextual responsiveness; and (3) the dialectic stance, in which paradigms are viewed as important frameworks for inquiry practice, and the inevitable tensions invoked by juxtaposing different paradigms are viewed as potentially generating more complete, more insightful, even more revisioned or transformed evaluative understandings.

Past discussions about mixing methods demonstrate clearly that focusing on the philosophical differences between evaluation approaches mires us in controversy. If we as evaluators wish to maximize the possibilities of intentionally using multiple methodologies, we must shift the mixed-method conversation away from a preoccupation with explicit assumptive differences among paradigms and toward other characteristics of social inquiry traditions. This is not to say that philosophical underpinnings should be eschewed; indeed, each paradigm offers a meaningful and legitimate way of knowing and understanding. Rather, the argument here is for a middle ground and a balance between philosophy and methodology, between paradigms and practice. We need to focus on characteristics that define the different inquiry traditions and therefore warrant our attention and respect, but that are also not logically irreconcilable when juxtaposed with contrasting characteristics.

One such characteristic is the nature of the knowledge claims generated by different traditions. For example, concepts such as closeness and distance, particularity and generality, and meaning and causality are characteristically advanced by the traditions of interpretivism and postpositivism, respectively, and are not necessarily logically incompatible. A mixed-method design combining these two traditions would strive for knowledge claims that are grounded in the lives of the participants studied and that also have some generality to other participants and other contexts; that enhance understanding of both the unusual and the typical case; that isolate factors of particular significance while also integrating the whole; and that are full of emic meaning at the same time that they offer causal connections of broader significance. Such a study produces results that are potentially more useful and relevant.

An alternative set of characteristics that may productively advance the mixed-method conversation is the different values and interests advanced by different methodological traditions. For example, postpositivism characteristically advances values of efficiency and utilitarianism; interpretivism characteristically promotes values of diversity and community. A mixed-method approach would attend to the value-based and action-oriented dimensions of each inquiry tradition, and these dimensions would become the grounds on which methods and analysis decisions are made. An evaluation focusing on these areas represents a greater plurality of interests, voices, and perspectives, and offers a potentially more constructive dialogue among different evaluation traditions.

Two design alternatives that may effectively combine the critical features of different traditions are component and integrated designs. In component designs, the different methods remain discrete throughout the inquiry, so that the combining of methods is conducted at the level of interpretation and inference. Three specific examples of component designs that build directly from earlier work on mixed-method purposes are triangulation, complementarity, and expansion. Triangulation has typically entailed the use of different methods, each with offsetting biases, to assess a given phenomenon. However, this same logic can be applied to inquirer bias, bias of substantive theory, and bias of inquiry context. Complementarity designs are those in which results from one method-type are enhanced or clarified by results from another method-type, both within a single inquiry framework and across different inquiry frameworks. Expansion designs rely on different inquiry frameworks and methods for different inquiry components and bring them together in a side-by-side fashion.

Integrated mixed-method designs attain greater integration of the different method types during the inquiry and analytic processes. Examples of such designs include iterative, embedded or nested, holistic, and transformative designs. Iterative designs are characterized by a dynamic and ongoing interplay over time among the different methodologies; methods are employed in multiple iterations such that findings and interpretations are developed at increasing levels of sophistication. Embedded designs join one methodology within a different methodology, for example, an ethnographic study within a quasi-experimental framework. In holistic integration, the methods are mixed simultaneously, offering a synthesis of perspectives, of understandings and insights reached, and of study results and conclusions. The transformative design takes into consideration the value-based and action-oriented dimensions of the different inquiry traditions and emphasizes the plurality of interests. By infusing values and political dimensions into the evaluative inquiry, these designs are intentionally pluralistic.

Discussions and debates about assumptive differences among evaluative paradigms will continue. Each paradigm offers a meaningful and legitimate way of knowing and understanding and orients us in the world of inquiry, yet each is also limited. The challenge for evaluators is to make methods decisions that are philosophically defensible and, at the same time, contextually practical and responsive. Mixed methods have the potential to enable us to understand more fully, to generate deeper and broader insights, and to develop important knowledge claims that respect a wider range of interests and perspectives. They offer significant potential for enabling us to better understand the complex social phenomena that we now face.

Further Reading

Bryman, A. (1988). Quantity and quality in social research. London: Unwin Hyman.

Caracelli, V. J., & Greene, J. C. (1993). Data analysis strategies for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 15, 81–105.

Firestone, W. A. (1990). Accommodation: Toward a paradigm-praxis dialectic. In E. G. Guba (Ed.), The paradigm dialog (pp. 105–124). Newbury Park, CA: Sage.

Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11, 255–274.

Krantz, D. L. (1995). Sustaining vs. resolving the quantitative-qualitative debate. Evaluation and Program Planning, 18, 89–96.

Ragin, C. C. (1989). The comparative method: Moving beyond the qualitative and quantitative strategies. Berkeley: University of California Press.

Ragin, C. C. (1996). Turning the tables: How case-oriented research challenges variable-oriented research. Comparative Social Research, 16, 31–46.

Reichardt, C. S., & Rallis, S. F. (Eds.). (1994). The qualitative-quantitative debate: New perspectives. New Directions for Program Evaluation, 61.

This article was adapted with permission from a special issue of New Directions for Program Evaluation. The issue focuses on mixed-method design in evaluation and includes articles on parameters for making methods decisions, analyses of mixed methods studies, discussions of paradigms, and examples of mixed-method designs. See Greene, J. C., & Caracelli, V. J. (Eds.). (1997). Advances in mixed-method evaluation: The challenges and benefits of integrating diverse paradigms. New Directions for Program Evaluation, 74.

Jennifer C. Greene
Associate Professor
Cornell University


© 2016 Presidents and Fellows of Harvard College
Published by Harvard Family Research Project