Actually this is one of those interesting little tools that defies categorisation. It's not just a data analysis tool. I've used it a lot when presenting results of research to stakeholders, and to help people debrief unstructured meetings.
One of the big challenges in evaluation, or any applied social science or consultancy task, is how to help people engage with the results. How do you get them to acknowledge often uncomfortable conclusions? Would it be better to get them to analyse the results themselves? If people participated in the actual data analysis, then maybe they would accept the conclusions a bit more.
But there were several problems with that idea:
• People are busy; they have hired you to do the work, so why should they spend their valuable time doing something they have paid you to do?
• You spent a lot of time collecting the data and have masses of it. Just how much of it is enough to get good-quality analysis?
• How do you discourage people from seeing only those patterns in the data that reflect primarily their own view of the world?
• If you can get them to step outside their own mental space, how do you prevent them from becoming defensive about their existing ideas, and instead engage with the possibility of actually learning from seeing things from other perspectives?
• And finally, how do you do all this in a group setting, where the pressure tends to be for discussion rather than dialectic?
I puzzled about this for a few years, and then came across some colleagues using ideas drawn from Vygotskian psychology and Activity Theory. Essentially, Vygotsky postulated that we learn through two different practices: patterning (fitting current events into past events) and puzzling (seeking explanations for why the current event doesn't fit into past events, or even into other current events).
I realised at that point that much of our analysis, both qualitative and quantitative, was based essentially on patterning. With relatively few exceptions, outlying data was removed from view and thus from the analysis. With that went much of the "puzzling" and much of the potential learning. However, if you approach the outlying data on the assumption that it might be there for a reason rather than by chance, then discussing how the bulk of the data relates to the outliers can give a deeper understanding of what is going on. In other words, encourage people to puzzle over the data rather than pattern the data.
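Purely as an illustration of the point (and not part of the original tool), here is a minimal sketch in Python of what "puzzling rather than patterning" might look like with numeric data: instead of trimming outliers away, flag them and put them in front of the group for discussion. The two-standard-deviation threshold, the example scores and the function name are my own assumptions.

```python
import numpy as np

def split_for_puzzling(values, threshold=2.0):
    """Split data into the consistent bulk (for patterning) and the
    outliers (for puzzling over in the group), using a simple z-score."""
    values = np.asarray(values, dtype=float)
    z = (values - values.mean()) / values.std()
    bulk = values[np.abs(z) <= threshold]
    outliers = values[np.abs(z) > threshold]
    return bulk, outliers

if __name__ == "__main__":
    ratings = [3, 4, 4, 5, 3, 4, 9, 4, 3, 1]   # e.g. a set of workshop feedback scores
    bulk, outliers = split_for_puzzling(ratings)
    print("Pattern over these:", bulk)         # the broadly consistent data
    print("Puzzle over these:", outliers)      # candidates for group discussion, not deletion
```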
From there it was a relatively short step to thinking about applying the same idea to data that were progressively less extreme, until in the end you were forcing puzzles onto apparently consistent data. I can't claim credit for this insight; my colleagues at WEB Research had got there long before me. All I've done is expand on their ideas, initially with colleagues at the New Zealand Department of Labour who helped me refine it. I have used variations of it on many occasions - always successfully. On one or two occasions people have told me it was the best way they had ever come across of analysing data quickly in a group setting.
We take decisions all the time. Indeed, we are so used to doing it that we often forget to check with those who might be affected by a decision how they would like to be involved in it. Sometimes we don't think that through ourselves either, and get surprised when a friend or colleague starts shouting at us down the telephone line.
There are four basic ways to take decisions, and good decision-making is about having clear agreements about which one should be used.
I haven't a clue where this tool came from, but I'm deeply indebted to the person who originally thought it up.
In a group discussion it is common for ideas and discussion to flow rapidly. Group members hear things and respond. Not much listening is going on. There are many times when this is absolutely appropriate, especially when the group is in a very creative mood.
However, a group can get stuck spinning its wheels, or skim along the surface without actually getting to the core of the issue. At this point it helps to slow things down and start listening. I'll be honest: the first time I used this I didn't have a clue what I was doing. I was faced with a dysfunctional group and made it up as I went along. Somehow everything came out OK, and I've used it many times since.
Bob Dick's brilliant little tool helps people resolve apparently irreconcilable differences. It is from his equally brilliant book "Helping Groups to be Effective" - quite the best facilitation and group work book on the market. Check out his other excellent publications via Interchange.
DISCUSSING UNDISCUSSABLES
This is another wonderful Bob Dick tool, this time written with Tim Dalmau. It resolves a really tricky and common situation in a way so simple that you kick yourself for not thinking it up yourself.
One of the big problems in any group process or discussion is what is not talked about. It erects a hidden boundary around the discussion and is the source of much frustration. Bob and Tim developed this tool to surface the conditions that create these boundaries, and to allow people to decide what they want to do about them. The really clever bit about the tool is that you can do all this without actually talking about the undiscussables themselves.
How do we stop people assuming that all you need to do is tell someone to do something and their behaviour will change? That assumption is linked with more "failed" campaigns than I care to mention. It's made advertising agencies a mint, but that is often all it does. As you can probably guess, I'm no great fan of social marketing.
On the other hand ...
The Ottawa Charter is a WHO health promotion framework used to overcome this problem. It is credited with the success of strategies for promoting smoke-free workplaces, reducing the incidence of HIV, improving road safety, and preventing melanoma. I have used it in all sorts of other areas, such as dairying, energy efficiency and land management. It's one of those frameworks that people go "oh yes" to almost immediately.
What it seeks to achieve is voluntary behaviour consistent with a "cause".
It is also based on the concept of "leverage": a strategically selected jigsaw of people and organisations, each doing what they are most effective at, rather than a single agency trying to change the world on its own.
For the framework to work properly you need (or need to develop):-
• A clear cause (or vision)
• A clear set of initial principles or values
• Initial agreement to the above by all key stakeholders
The strategy framework has five components, which aim to develop and maintain:-
1. The knowledge and skills required to adopt these behaviours.
2. Relevant services which promote and model the cause.
3. A sense of involvement in and ability to contribute to the cause.
4. Policies and rules which promote the cause.
5. Support from the wider environment for the cause.
To gain leverage, you need a range of strategies across the five areas that reinforce each other. You also need to identify the organisations or individuals most able to develop each part of the strategy.
The nice thing about this framework is that it works at any level you wish to apply it. I have used it at a national level (where, for instance, rules = legislation) and in single organisations (where services could be the canteen serving up decent food). I find the framework helpful even when I am working on something that is itself only one part of a strategy framework. For instance, if I am trying to develop a "relevant service", the framework helps me to strategise and plan how to develop that service.
The document download has a longer discussion of the framework, plus a couple of non-health examples.
CREATIVITY TECHNIQUES AND TOOLS FOR PROBLEM SOLVING
This is a real gem.
Every so often you come across a website that takes your breath away in terms of its usefulness and the generosity of those involved. If the UK consultancy Mycoted is anywhere near as good as its resources, then it is a pretty sharp outfit. This website has over 200 different tools and methods relating to action research, large group processes, strategy development, evaluation - just about everything really.
Each method has a short description of what it is and just enough information for you to use it relatively safely.
If you, like me, think most planning frameworks look better on paper than in reality, then give this a go.
I'm no fan of strategic planning. I largely agree with Henry Mintzberg that most strategic planning ends up as bad strategy and bad planning. I believe that strategising and planning are conceptually, emotionally and cognitively very different activities, and any attempt at directly combining them is fraught with difficulty at all kinds of levels.
On the other hand ... several years ago, I came across a remarkable little publication by the RAND Corporation called "Assumption-Based Planning". Although it explicitly rejects the "strategic planning" tag, I think it is damned close to resolving the tension between strategy and planning.
However, I found that people had great difficulty working with assumptions, so I took the idea and blended it with the "force field" technique. This seems to work better - at least in New Zealand. I use this tool primarily when I want people to explore the relationship between what they are doing, and the environment in which they are doing it. The workshops on Evaluation and Organisational Learning and Upside Down Strategy (see Workshops) use the framework extensively.
The download is a graphic representation of my process. Very bare. I suggest you use your imagination if you want to use it. I've also added a rough description of the original concept and suggested a way of applying it.
There are dozens of good evaluation tools. I've included this one for several reasons. Firstly, it combines several methods of inquiry and analysis, including action research, performance measurement, strategic planning and program logic. Secondly, it is one of those methods that looks deceptively easy but is actually very powerful. Thirdly, it works well in many settings, especially workshops. Finally, despite its apparent closeness to many standard evaluation approaches, it has never really been widely adopted by the evaluation community. I think that is a pity.
The tool was originally developed by Wes Snyder in Africa, and further modified by Bob Dick. The version I use is even more modified, since I don't think Bob's version covers environmental factors in as much detail as I tend to. However, I've included Bob Dick's version, since it is the most clearly written.
I developed this tool from an original idea by Shankar Sankaran, now at the University of Technology Sydney. It is based on the Plan, Act, Observe, Reflect, Plan cycle familiar in action research. I've always been convinced that powerful reflection depends to some extent on the questions asked, and this tool uses a dozen or so questions that force us to think below the surface. It forms part of a "Learning Log" developed by Bill Harris and myself. Learning logs are featured in the chapter Bill and I wrote in Effective Change Management Using Action Research and Action Learning.
David McDonald, Gabriele Bammer and Peter Deane produced an excellent e-book on the use of dialogue methods in a variety of settings. Many of these methods are drawn from the systems and organisational development fields. Some are highly reflective. Worth a good look. You can download it here.
This is a really neat process when you have a room full of people who need to cluster lots of ideas quickly. It is also very good at exposing unspoken assumptions.
It's based on a process called "Fastbreak" which I came across when working with Pegasus Communications Inc and WEB Research. The big snag is that people don't believe the process works until they do it. The tool has never, ever, failed me.
Some tips.
If the point of clustering is just to simplify people's task, then you don't need much debate about the clusters themselves and what ideas go where. If the clusters form a fundamental step in the task, or you want to uncover assumptions, then take time and allow people to discuss what goes where.
Some people get very frustrated by the clustering process. Invariably they form a little knot at the back of the room and create lots of noise. You can overcome this by acknowledging that some people don't want to be involved, allowing them to opt out, but suggesting they temporarily leave the room!
This works really well if you use hexagonal "Post-it" notes. You can get them online from Vis-It, although someone should tell them that the glue needs improving. When the room heats up they fall like autumn leaves.
To a large extent the excitement over 'sustainability' has been replaced by more recent notions of 'resilience' and 'adaptive management', but I think the idea of sustainability is still worth exploring. Some years ago Patricia Rogers and I looked at the sustainability literature and tried to make sense of it. Somewhat later, I pulled together this little two-page framework for work I was doing for the Lumina Foundation with SPEC Associates. One side explores how to frame sustainability, and the other side describes what the literature says are important components of sustainability - however you frame it.