Live blog from the conference: Michael Quinn Patton's keynote

March 16, 2016, by Liz Skelton

Complexity & evaluation in an uncertain world

Participants kicked off the conference by outlining the key issues they are grappling with:

  • How do we get buy-in to work in partnership?
  • Measuring intangibles
  • Letting go of personal gain and moving into shared space – grappling with fear and the unknown
  • Shift in mindsets to work long term
  • Collaboration
  • Starting off with predetermined outcomes – shift to adaptive mindset and practice
  • Gathering data that links explicitly to strategy

The opening session beamed in Michael Quinn Patton, author of Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use, in conversation with Janet Clinton. Knowing that context is everything: Michael was videoconferenced in from a shed in Minnesota in the middle of a blizzard, powering the shed and his internet connection with a generator. Commitment indeed!

Michael framed the purpose of the session as understanding the link between complexity and evaluation, and the power of evaluation to bring about change.

If you want to spend your time making the most of your contribution, then there is a way to measure whether you're having any impact. Michael posed the challenging human example of bad breath. The only way we can test if we have bad breath ourselves is to breathe into our hand, and of course we can't smell anything because it's our own scent. The only way we know we have bad breath is for others to tell us. Evaluation takes on the question of whether we are really prepared to look at what we're doing and have someone neutral help us with that.

Evaluating requires learning how to learn and using findings to make things better. Many resist feedback because they want to continue doing what they're doing, or because they fall on the side of politics rather than evaluation.

How does evaluation fit in the world of complexity? If we can't understand evaluation, how do we understand complexity?

Many social innovators believe that evaluation is too slow, compliance driven and not written for their context. Traditional evaluation assumes a certain amount of control, using linear, highly predictable models. In reality, change is more organic and messy, with progress ebbing and flowing and many forks in the road. Henry Mintzberg, author of Tracking Strategies, found that no high-performing organisation implements all of what it intends to do. High performers engage in ongoing assessment, find emergent opportunities and leave some things behind, so that over time their strategy is a combination of what they intended to do and what they changed as they went. Approaching evaluation via complexity helps us navigate emergent systems where we are dealing with emergent and complex challenges.

How do we balance managing complexity in a world which relies on plans, logic and short-term outcomes?

Logic models and technical frameworks are needed when we're running programs where we know what to do. The problem is that we approach complex problems in the same way, and they don't work. We need processes that can cope with innovation, like software development, where we can experiment, try things out, learn, test and apply. The issues we're grappling with, like climate change, terrorism, human rights and public health, where we're not in control and where we don't agree, require complex adaptive evaluation responses for complex adaptive situations.

Developmental evaluation came about when Michael came up against the limitations of formative and summative evaluations. He was two years into evaluating a leadership development program with small communities in Minnesota, and those approaches didn't address the complexity of the context the program operated in. Leadership needs to be adaptive, equipping leaders to make progress in a more connected, complex world where models are no longer sufficient.

People are searching for models, but in complex, dynamic situations we're never going to find THE model. The model is the ability to deal with change and to understand how the world operates. Developmental evaluation focuses on generating and helping innovations, particularly amid uncertainty and change. This is a fruitful area for complexity and developmental evaluation. All we can work with are principles for best practice.

We need to diagnose what kind of approach is required depending on the challenge. Our default is to use technical approaches for what are complex, adaptive challenges. The political instinct is to simplify, and we end up doing harm because we don't understand the complexities. Collectively, we need to become more astute about the real-world complexities that exist.

In complexity, expertise is not a matter of having an answer to something; expertise is astute situation recognition. From there, the diagnosis guides the response. Adaptive leadership calls this the skill of diagnosing the environment we're operating in and looking for the systemic patterns and dynamics. Heifetz & Linsky see this as one of the most important but undervalued skills of leadership.

Could developmental evaluation be the key to how we support innovation and change? Does it lack the credibility of evidence?

The challenge in a political world is to be able to acknowledge failure. Innovation requires risk, and there will be a lot of failures. Fail often, fail fast, learn and move on. We mouth this in the public sector, but very few do it well. Different people value different kinds of evidence. Democratic dialogic evaluation educates policy makers on how to interpret developmental evaluation. The developmental mindset is not to try to find the one-size-fits-all answer but to interpret and think about the evidence and bring out the nuances. This is more a thinking/mindset problem than an evidence problem.

Policy makers and governments often don't want to engage with the "fluffy variables". Will collaboration and adaptation ever be seen as outcomes?

Variables are interdependent, with process and outcomes inextricably linked: collaboration, for example, leads to greater social capital. But this way of approaching evaluation presents the dichotomy we struggle with. Outcome mapping and outcome harvesting can measure short-term outcomes which can lead to greater impacts.

Michael ended by sharing his current focus, Blue Marble Evaluation, which uses the picture of the Earth from space, the "blue marble", as a metaphor for working towards global systems change. This requires building a global community of evaluators for global systems change evaluation. If we want to build a more sustainable world for the next generation, then we need to develop blue marble evaluative thinking. This is what the world needs.
