New York in July: the International Forum of Visual Practitioners (IFVP) Conference was a fantastic experience.
I ran a workshop called ‘Did we hit our target?’. The session was aimed at those who design and facilitate workshops and group events – a role many IFVP members hold.
I believe this is an important topic. The need to measure our impact as facilitators and recorders grows increasingly critical in an environment where project dollars remain tight. We must be able to quantify and qualify our worth. And some of the best data we have can come from our own experiences and those reported by our clients.
We explored two areas of interest: 1) how we measure the success of our efforts and 2) how we visualise that evaluation information for harvesting and communication. The pivotal question for our session was:
How do we know if our meeting or workshop
has been a success?
I had the opportunity to discuss what evaluation methods, if any, were used by other professionals in the visualisation business.
At the start I ran a quick poll on the kinds of evaluation visual practitioners do after an event:
Scenario A: exit stage left with materials under your arm, waving to the client.
Scenario B: an informal harvest – ‘How do you think that went?’
Scenario C: a formal, structured harvest against pre-agreed outcomes.
The rough consensus in the conference group was about 10%:80%:10%. In short, an informal question to the client – ‘How did you think that went?’ – was the most common approach.
The group agreed that thinking more about the options in Scenario C: Formal Structured Harvest would be useful for their practice.
I presented my take on the logic model, which I’ve dubbed ‘logic model lite’. In its simplest form, it covers the INPUTS (i.e. what resources we invest in the meeting), our ACTIVITIES (i.e. what we do in the meeting) and the OUTCOMES – short, medium and long term (i.e. what results we see).
Using the workshop we were in, we ran an example of what a logic model ‘lite’ would look like. That way, participants got a feel for what information was needed and what level it was aimed at. We built it systematically, identifying the inputs, then the activities involved, and then the short-, medium- and long-term outcomes. Finally, we identified the questions we could use to evaluate the results of the workshop.
We discussed how common practice is to check whether the activities and short-term outcomes were achieved. However, returning to the ‘workshop as an intervention’ paradigm, further investigation could probe the results and the longer-term outcomes that flow from the event.
I shared some of the visual methods I employ for checking end-of-workshop outcomes with my participants.
These include my tried-and-true target board. I like the idea of linking people’s responses to the concept of ‘hitting the mark’.
Brian Tarallo of Lizard Brain Solutions offered his use of faces and emotions to do a visual Likert scale for feedback.
At the close of the workshop, we checked the short-term outcomes for a measure of success. Participants reported leaving with more structure and concepts for evaluating their own workshops in the future.
During this great discussion, Tracey Ezard of Jessup Ezard Consulting recorded our thoughts. Thank you, Tracey, for capturing our points, and thanks to all the participants – Lynn K., Nora H., Brian T., MJ and Lisa.
Do you want to:
Expand your professional toolkit with visual thinking skills?
Boost your effectiveness in meetings?
Add impact to your presentations?
Gain confidence in drawing and applying graphics to your work?
Be seen as a creative thinker?
If you answered yes to any of these, find out more about my premium program: Essentials of Visual Thinking & Graphics Practice here.
Preparation for New York has begun in earnest! As the days tick by, I have been drawing together my presentation material for the International Forum of Visual Practitioners later this month.
The topic is ‘Did we hit our target?’ and is aimed at those who design and facilitate workshops and events. The session will be an opportunity to discuss evaluation methods in this context. This is an important topic: I think the ability to measure our impact as facilitators grows increasingly critical in an environment where project dollars remain tight.
Much thought has gone into evaluation methods for programs and projects generally, to help people track outcomes and report to funders and sponsors on their successes.
In my experience, however, the types of ‘evaluation’ we do as facilitators are often more superficial. In closing a workshop, we often ask the stalwart question: ‘How well do you think we did in terms of meeting our workshop’s objectives?’ Participant responses generally provide a summary account of outcomes. I understand this is appropriate in many cases, as the resources and energy invested in gauging the outcomes of a workshop should be in line with the overall investment in the event. In comparison, medium to large-scale programs that span months or years may have a total investment tenfold that of a workshop, and so require more structured and probing evaluations.
That fact aside, I think we have room to improve the standards of our workshop evaluation. Before I launch into my ideas, I want to acknowledge Dr Jess Dart of Clear Horizon. Through her training courses, and through my work alongside her team as a co-facilitator on evaluation projects, Jess taught me much of the basics of program evaluation theory and practice. She is an evaluation guru in this country, and a talented facilitator and business leader.
Here’s the first key point. We need to see workshops for what they are: interventions designed to bring about change.
If we do that, then it makes sense to spend time being clear about the short-, medium- and longer-term outcomes we expect to flow from the workshop. If we take the analogy of the pebble in the pond (the workshop), then we need to identify the ripples (outcomes): What are they? How big do we expect them to be? And where do they go?
How does this thinking affect our practice?
I see three phases where facilitators translate the pebble in a pond analogy into a clear framework for evaluating outcomes. They are:
develop a clear STATEMENT of OUTCOMES at the commissioning and design stages of the workshop;
with the client, develop a shared understanding of HOW the workshop will DELIVER THE OUTCOMES expected; and
design a process to compare the EXPECTED with the ACTUAL OUTCOMES.
A critical tool in doing the phases above is the logic model – a depiction of how the client / participants see the change occurring as a result of a program or project. As it applies to workshop evaluation frameworks, I call it ‘Logic Model lite’ as it is a simpler beast than one developed for a large scale program.
In my next post, I’ll provide an example of a logic model ‘lite’ for a workshop and show how to develop the evaluation questions that you will need to measure your event’s success.
Need help to get your CREATIVE on?
Curious Minds Co. is a consultancy firm passionate about helping people and organisations connect with their natural CREATIVITY and achieve their business and life goals.
You can contact me through firstname.lastname@example.org.
I read a post this week by Agnese Aljena on the seven things drawing and business have in common, in which Agnese makes the point that drawing and business are both learned sets of skills; both need love, dedication and passion. That resonated strongly with me, especially as my business is about creative thinking, and a (cool) foundational element of that is drawing out ideas, sketching down concepts and working them through with my clients using both word and image.
So back to the need to LEARN.
How did I learn these skills? For my ‘drawing’ in the context of business, communicating ideas and providing clarity for myself and others – I went to a number of very talented people in this field… Christina Merkley, Peter Durand and Diane Durand, Sunni Brown and Dan Roam. For my business skills, I had the joy of working with and learning from colleagues Mary Maher and Trevor Lloyd. My partner, Ian also brings lots of business smarts to our conversations, so I bounce ideas around when I need input.
These are the generous folk that got me started.
Ultimately, though, LEARNING occurs for me from the DOING.
Experience births new ideas and understanding.
That brings me to another topic that interests me greatly – evaluating the success of an endeavour. At its core, evaluation is purposeful enquiry – to learn, improve and get better in how we bring our ideas into the world. In so many areas, our communities need new and better ideas right now (that whole discussion is probably for another time/post).
So, how do we ‘measure the success’ of a project, a business initiative, or – dear to my heart – a workshop? How do we know if we are ‘hitting the mark’? This is a focus for me as I prepare for a workshop I am running in New York, USA on this topic later this month.
So to prepare, I have set myself the task of writing a series on this topic – how-tos, tips and techniques for Measuring Your Success. STAY TUNED!
Need help to get your CREATIVE switched on?
Curious Minds Co. is a small consultancy firm helping people and organisations get their CREATIVE on and achieve their goals. You can contact us through email@example.com or see more at www.curiousmindsco.com.au/courses