Prep for “Evaluation Options for Decision Makers” Workshop

I recently1 gave a presentation on Developmental Evaluation for a workshop at work called “Evaluation Options for Decision Makers”, which meant that I finally got around to reading several articles that I’d had on my desk for *ages* with the very good intention of reading them. Here are my summaries of those articles.

How Does Complexity Impact Evaluation?

  • Complex systems:
    • “uncertainty/unpredictability
    • interdependence among a large number of actors who themselves adapt and co-evolve
    • emergent outcomes created by the connections or relationships in the system
    • nonlinearity (outputs and inputs are not directly correlated)” (p. vi)
  • Complex interventions are characterized by:
    • structural complexity: more players, increased variety of relationships between them, more interdependence of their decisions
    • cognitive complexity: “increasingly more difficult to make valid or accurate predictions about the system” (p. vi) resulting from increased structural complexity
    • social complexity: “high level of social conflict or disagreement among the many players in the system”
  • e.g., healthcare – more types of providers, decisions of one group affect the others, different approaches of different disciplines can lead to disagreements – and even the clients/patients becoming more informed (or misinformed)/engaged/involved
  • “recognizing, acknowledging, surfacing, and addressing the paradoxes inherent in complex systems” (p. vii)
  • “It is what happens “in between” that matters: between people, organizations, communities, parts of systems – “in between” relationships”… “paying more attention to relationships as the unit of analysis rather than to parts of the system” (p. vii).
  • requires “willingness to be uncertain at times and to know that being uncertain is crucial to the process” (p. viii)
  • “embracing multiple perspectives and being aware that evaluation is about understanding networks within and between organizations” (p. viii)
Reference: Zimmerman, B., Dubois, N., Houle, J., Lloyd, S., Mercier, C., Brousselle, A., & Rey, L. (2011). How Does Complexity Impact Evaluation? The Canadian Journal of Program Evaluation, 26(3), v–xii.

Practice-Based Evaluation

  • a discussion paper summarizing the articles in this journal issue and a related conference
  • evaluators use “conceptual models to position themselves in relation to the intervention and to guide their evaluation […] methodology was secondary and followed naturally” (p. 107)
  • need to take “into account contextual variables […] evaluation of interventions should be contingent, contextualized, and embedded within temporal, administrative, social, economic, and political realities” (p. 107)
  • differences from “pure” research paradigm:
    • “knowledge produced and shared over the course of the evaluation will inevitably influence the intervention and its context” (p. 108) vs. research where you would think of this as “contamination” of the research
    • “acknowledging that evaluation is a subjective process is at odds with the desire to see it as a scientific, externally valid, reproducible process” (p. 110)
Reference: Dubois, N., Lloyd, S., Houle, J., Mercier, C., Brousselle, A., & Rey, L. (2011). Discussion: Practice-based evaluation as a response to address intervention complexity. The Canadian Journal of Program Evaluation, 26(3), 105–113.

The Art of the Nudge

This article describes the experiences of a group of Developmental Evaluators who spent three years evaluating a national initiative on youth engagement, focusing on some of the things they learned were particularly effective to “provide real-time feedback that subtly supports shifts in policies, practices, resource flow, and programming in a way that is sensitive to context and to the energy of the people involved” (p. 40) and to “create opportunities for groups to find their collective way, to recognize patterns within complex systems, to help take stock of how the team was doing, and to name design flaws or blockages in a supportive manner” (p. 46). These included:

  1. Servant Leadership
  • leading “must always be in service to the group achieving its goals and living its principles”
  • “opening pathways for new understanding and addressing program blockages”
  • “draw out data and observations that help actors realize what they collectively believe to be the best path forward at any given time” (p. 46)
  • use of an appreciative lens – “focus on strengths and promising patterns that could be leveraged to support the initiative” (p. 46) – in the “fast-paced decision making, messy collaborations, and steep learning curves […] nudges could be perceived as threatening” (p. 47)
  • listen deeply and actively – “find synergies, identify decisions inconsistent with the group’s stated intent, and detect when and how to intervene with an effective nudge” (p. 47); “use carefully crafted questions that encourage transformative group reflections” (p. 48)
  • integrate reflection into practice
  2. Sensing Program Energy
  • “a common management response to unrest, tension, and conflict is to suppress it” (p. 49)
  • but a Developmental Evaluator “who is perceived credibly, without a personal or organizational agenda apart from the project, can be well-positioned to identify an issue that is blocking program energy” (p. 49)
  • open channels of communication and bring interpersonal dynamics to the surface
  3. Supporting Common Spaces
  • common spaces = “physical places, moments in time, and virtual spaces where key actors interact” (p. 50)
  • as places to: (a) identify observations and (b) prioritize interventions
  4. Untying Knots Iteratively
  • knots = “a wide spectrum of problems, given the ambiguities, concerns, and interpersonal dynamics that got tied up” (p. 52)
  • iterative approach: “(a) identify what specifically requires more clarity, (b) consider how to collect information about the challenge and its potential solutions, (c) collect the information, (d) reflect on how to gracefully bring the information back into the system, (e) put the information back into the system, and, finally, (f) follow up on the results of the intervention” (p. 53)
  5. Paying Attention to Structure
  • “both formal decision-making structure as well as the culture of decision making, that includes an organization’s written and unspoken norms, rules, routines, and procedures” (p. 53)

Some other interesting points from the article:

  • traditional evaluation “can fail to return timely data about how an unpredictable system is responding to new inputs, leaving innovators in the dark about how to adjust” (p. 40) and “offer too little information too late to be useful for innovation” (p. 56)
  • DE “supports innovation by providing timely and actionable data about how a complex system is responding to an initiative” and is about “asking evaluative questions, applying evaluation logic, and gathering real-time data to inform ongoing decision making and adaptations (Patton, 2011, p. 1)” (p. 41)
  • purpose of evaluation:
    • DE: “the exploratory development of a social change approach”
    • formative: “fine-tuning of a program”
    • summative: “definitive judgement about a program’s impact” (p. 42)
  • Developmental Evaluator:
    • is part of the development team – can be an uncomfortable role for evaluators who are usually trained to be removed from the intervention/program
    • requires skills/experience in “organizational development, whole-systems change, pattern recognition, interpersonal dynamics, conflict management, and facilitation – all skills that are crucial for helping innovators know when and how to use data and feedback to adapt strategies as they go” (p. 55)
Reference: Langlois, M., Blanchet-Cohen, N., & Beer, T. (2012). The Art of the Nudge: Five Practices for Developmental Evaluators. The Canadian Journal of Program Evaluation, 27(2), 39–59.

Footnotes

1 OK, it was quite a while ago now, but this blog posting has been sitting in my draft folder for a looooong time!