Webinar Notes: “Emergence: Detection and Interpretation in a Leadership Program” by Michael Quinn Patton

As part of EvalYear 2015, the School of Public Affairs at the University of Colorado Denver is holding a webinar series called “Practical Applications of Systems to Conduct Evaluation: Cases and Examples”. They have a great list of 30-minute webinars running from January to June related to the use of systems thinking in evaluation. Systems thinking is something I’ve been reading up on lately, so these webinars are very well timed from my perspective!

Here are the notes I took during the webinar:

“Emergence: Detection and Interpretation in a Leadership Program” by Michael Quinn Patton

  • complexity in the evaluation world
    • traditional logic models are linear
    • but in the real world, there are forks in the road, sidetracks, unexpected things, challenges, and opportunities
  • Tracking Strategies by Henry Mintzberg
    • intended strategy – deliberate strategy – implemented as intended
    • but some of what was intended isn’t implemented (unrealized strategy) – and that’s OK
    • emergent strategy – opportunities arise that weren’t part of original strategy, so some strategy emerges on the fly
    • high-performing organizations have a “realized strategy” of deliberate + emergent (an unrealized strategy is that which was planned but not done)
    • and then look at what the outcomes of that realized strategy were
  • documenting all of this is part of the evaluator’s role as this occurs
    • why did they decide to leave some things unrealized?
    • why did they add the new stuff?
    • people making the decisions typically do not document why they make their decisions
  • traditional accountability treats any unrealized strategy as a failure (you didn’t do what you planned) and emergent strategy as “mission drift” (you did stuff that you didn’t plan on) – but the research on high-performing organizations doesn’t support this way of thinking
  • emergence – what to watch for
    • subgroups – how do people self-organize within a program? there can be results not just for individuals but for (sub)groups
    • critical incidents
    • issues
    • staff-participant relationships
    • processes
    • outcomes
    • impacts
    • non-linear effects (ripple)
  • in highly dynamic environments, it doesn’t make sense to make detailed plans for long time periods (because you can’t predict what’s going to happen over long time frames); makes sense to have a strategic vision and principles that you operate under, and details will emerge
  • you are operating under conditions of uncertainty, so you aren’t going to be getting “proof” – you are thinking in terms of probabilities
  • having some data is better than no data, and waiting for complete data isn’t feasible (things are dynamic, so you can’t get “complete” data anyway) – real time pressures to make decisions, use the best information you have and recognize the uncertainty
  • developmental evaluators need to walk alongside the program people
  • developmental evaluation is scary for both the program people and the evaluator – you don’t have a roadmap to follow – you develop the evaluation as you go along
  • when people are talking about innovation and the need for new ideas – that’s where you will (should?) find people willing to try out developmental evaluation
  • companies spend lots on R&D – they get the need for creation/experimenting/testing, but governments and NPOs tend not to put their resources there
