Evaluation Practice (2007), by DePoy, E., & Gilson, S.
Review by:
Heather Zeng
Assistant Professor of Psychology
Online Instructor Evaluator
Park University
Evaluation Practice is a critical read for individuals seeking to apply for grants, implement programs, or coordinate institutional interventions; yet implementation is only one aspect of this text. The authors guide readers with clarity toward understanding that the approach one takes can itself make a difference.
Certainly, as the authors note, the current era of increased accountability is a prelude to this writing's raison d'être. Moreover, in recent times, entities with limited budgets have been scrutinizing programs more carefully and asking, "Does this work?" and, if so, "Why?" (questions that, when answered, might streamline processes and costs).
The authors plunge readers into this challenging topic through examples and scenarios that deconstruct the evaluation process. Whereas other texts provide didactic, up-front construction of knowledge, here the authors defer to readers' acumen by assuming knowledge, sharing examples from their professional work, and then deconstructing terminology and techniques. It is a refreshing approach that assures readers who pick up this writing that they are in the right place: the process of discovery will be a mutual and collegial one.
Evaluation Practice offers readers insight into research methodology, affirming that mixed methodologies, or a triangulated approach, can further work in the field. The authors describe tools (e.g., problem mapping and force field analysis) through tangible examples that readers can easily replicate. In sharing Lewin's (1951) force field analysis, for instance, they impart a simple but highly portable skill that clarifies the driving forces behind a particular issue as well as any restraining forces; individuals can then logically target issues and rank-order them from 1 to 10 in terms of how much they help or hinder (a minimal sketch of this scoring appears below). In providing these tools, the authors give readers elegant resources for logical inquiry along with succinct overviews of research methodologies and approaches, granting parity to quantitative and qualitative research as well as combined methodologies.
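The book presents force field analysis as a pencil-and-paper exercise; purely as an illustrative aside, the following minimal Python sketch shows how that rank-and-tally scoring might look in code. The force names and ratings here are hypothetical examples, not drawn from the text.

```python
# A minimal, hypothetical sketch of Lewin-style force field analysis scoring.
# Each force is rated 1-10; driving forces push toward a change, restraining
# forces push against it. All names and ratings below are invented examples.

from dataclasses import dataclass

@dataclass
class Force:
    name: str
    strength: int  # rated 1 (weak) to 10 (strong)

driving = [
    Force("Funder accountability requirements", 8),
    Force("Staff enthusiasm for the program", 6),
]
restraining = [
    Force("Limited evaluation budget", 7),
    Force("Data-collection burden on staff", 5),
]

def total(forces):
    """Sum the rated strengths of a set of forces."""
    return sum(f.strength for f in forces)

# Rank-order each set so the strongest targets for action surface first.
for label, forces in (("Driving", driving), ("Restraining", restraining)):
    print(f"{label} forces (strongest first):")
    for f in sorted(forces, key=lambda x: x.strength, reverse=True):
        print(f"  {f.strength:2d}  {f.name}")

print("Net force toward change:", total(driving) - total(restraining))
```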
Several sections have utility for a broad range of readers, including students, faculty, grant writers, and program evaluation staff (to name a few). One area of note is the authors' focus on reflexive action, which they describe in terms of both systematic decision-making and receiving feedback from others in the inquiry process. The authors build on a static model for grant evaluation that has endured for many years: an individual applies for a grant, documents outcomes, and writes a summary report. Here the authors add a Web-based monitoring system to record all activities (p. 134), which allows for ongoing articulation of the questions that arise throughout projects. This novel approach allows for cross-referencing across years and across the individuals who work within a program. This was a Eureka moment; it was refreshing to see this emphasis on ongoing program improvement and development.
When we consider past policy, it often seems that empiricism, along with reflexive action, has been missing in many entities (we need only look at the many haphazard efforts that skipped this logical progression of decision making and reflection). It was affirming to see a resurgence of this balance between reason and intuition in the realm of program development and intervention. The implications are vast if we look to education, the social sciences (criminal justice, counseling, and psychology in particular), and human resources: fields in which intervention efficacy can be critical to individual learning or to quality of life and work.
Additionally, the authors provide questions for processing assessment within a project. This ready-made listing (p. 151) poses several questions for critical analysis using a health care example, yet it transfers readily to many other topic areas, domains, and projects.
Here DePoy and Gilson provide a resource that, if followed by practitioners and scholars alike, can bring about the much-needed alignment between theoretical knowledge and real-world solutions. This text is for individuals keenly interested in taking empirical knowledge, identifying logical gaps and opportunities, and guiding public policy and projects in purposeful ways. One area the authors may want to consider for further development is the presentation of outcome data, which can be displayed both visually and graphically. For example, Corcoran (2000), in an interview with Edward Tufte, notes that the point of presentations is "(t)o explain something." In another edition this would be a great inclusion: presenting evaluation outcomes, and doing it well. In the current economic turmoil, it might be even more critical to demonstrating program efficacy. Yet there is a more universal appeal to their writing: marrying empirical approaches with the pragmatism of real-world applications. As such, this is a rich offering.
References:
Corcoran, D. (2000, February 6). Campaigning for the charts that teach. The New York Times. Retrieved June 1, 2009, from http://www.nytimes.com/2000/02/06/business/talking-numbers-with-edward-r-tufte-campaigning-for-the-charts-that-teach.html?pagewanted=1
Lewin, K. (1951). Field theory in social science. New York: Harper & Row.
Evaluation Practice. (2007). Philadelphia, PA: Lawrence Erlbaum Associates. 256 pp. $29.95. ISBN 978-0-8058-6300-0