How to REALLY know that your volunteers deliver on your mission

It’s trickier than you think to describe impact – and that’s good

Every time I offer a volunteer impact measures training, I’m struck by some new takeaway that helps me better explain how to master this complex-but-valuable process.

Today, my takeaway is this:

When you create impact measures on a matrix, one column stands out as the trickiest to complete – and probably the most important.

That column has to do with (drum roll, please)… Indicators.

Let me back up for a sec. Impact measures are created using a logic model. A logic model is a matrix that maps out all of the components needed to evaluate the effectiveness of a program.

The various columns within the matrix – the inputs, the activities, the outputs, the indicators, etc. – all lead towards creating the outcomes you actually want. It’s the outcomes that demonstrate exactly what your volunteers accomplished in serving your clients or delivering on your organization’s mission.
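As a rough illustration (an invented example, not drawn from any particular organization), one row of a logic model for an after-school tutoring program might look like this:

```python
# One row of a hypothetical logic model, sketched as a Python dict.
# All entries are invented for illustration.
tutoring_logic_model = {
    "inputs":     ["volunteer tutors", "donated books", "classroom space"],
    "activities": ["weekly one-on-one tutoring sessions"],
    "outputs":    ["number of sessions held", "hours tutored"],
    "indicators": ["an increase in reading comprehension scores"],
    "outcomes":   ["students reading at grade level by year's end"],
}
```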

You can’t create relevant outcomes without giving careful consideration to your Indicators.

The Indicator describes what improvement looks like, so that you can set a quantifiable goal to work towards.

Here are some general examples of Indicators for volunteer program impact:

  • An increase in reading comprehension (for an after-school program with volunteer tutors)
  • A decrease in park litter (for a conservation organization with volunteer cleanup groups)
  • An increased number of job interviews per client (for a homeless shelter with volunteer resume counselors)
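To make the first of these concrete: suppose the after-school program tracks reading-comprehension scores at the start and end of a semester of tutoring. Here’s a minimal sketch in Python (the scores and the 10% goal are invented, purely for illustration) of how an Indicator pairs with a quantifiable goal:

```python
# Hypothetical reading-comprehension scores for five tutored students
# (invented numbers, for illustration only).
baseline_scores = [62, 58, 71, 65, 60]  # start of semester
current_scores = [71, 66, 76, 70, 69]   # end of semester

baseline_avg = sum(baseline_scores) / len(baseline_scores)
current_avg = sum(current_scores) / len(current_scores)

# The Indicator: "an increase in reading comprehension."
# A quantifiable goal behind it might be a 10% rise in the average score.
percent_change = (current_avg - baseline_avg) / baseline_avg * 100
goal_met = percent_change >= 10

print(f"Average score: {baseline_avg:.1f} -> {current_avg:.1f} "
      f"({percent_change:+.1f}%); goal met: {goal_met}")
```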

As simple as these examples may look, indicators are tricky to develop. Generally, we know intuitively that a volunteer role has a positive impact, but describing that impact in quantifiable terms requires some brain power.

Creating indicators forces us to answer this basic question:

“How will we know that this volunteer role supports our goal?”

Take this real-life example, which cropped up during an impact measures training for a museum.

Like many museums, this one has a goal of deepening visitor engagement with the collections. Also like many museums, it works with volunteer docents. We might conclude that docents naturally deepen visitor engagement, given their knowledge of the collections and the tours they offer.

But how do we prove that with data? What’s the indication that docent tours are actually deepening visitor engagement?

Developing indicators led to an extended conversation. We had to break apart the workings of a typical tour and figure out exactly which results demonstrate engagement in a quantifiable way.

It turned out that there were multiple possible Indicators for visitor engagement. The group had to choose the one with the fewest complications.

Here were some of the options:

Indicator: an increase in average tour time.

  • You might argue that a longer tour means visitors are more engaged because they’re asking questions. Then again, one long-winded visitor could extend the Q&A but alienate other tour members. Plus, it might get complicated to measure the number of people asking questions and the number of questions asked.

Indicator: an increase in tours that stick with a 30-minute time limit.

  • Then again, a tour that sticks with a 30-minute format might deepen engagement by piquing visitor interest without monopolizing their time. For this choice, though, you might need some advance data on the connection between length of tour and number of tours that get filled to capacity.

Indicator: a decrease in the average number of tour drop-outs.

  • Perhaps the best signal of engagement is that visitors remain with the tour until the very end, the presumption being that unengaged visitors would wander away. Then again, visitors may drop off a tour because the volunteer was an uninspiring speaker, or wasn’t well prepared to answer questions.
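Whichever option the group picks, the arithmetic behind it is simple once the data is being collected. Here’s a rough sketch in Python (the tour records and field names are my assumptions, not the museum’s actual data) computing all three candidate Indicators side by side:

```python
# Invented tour records; field names are assumptions for illustration.
tours = [
    {"duration_minutes": 34, "visitors_started": 12, "visitors_finished": 11},
    {"duration_minutes": 28, "visitors_started": 15, "visitors_finished": 15},
    {"duration_minutes": 45, "visitors_started": 10, "visitors_finished": 7},
    {"duration_minutes": 30, "visitors_started": 14, "visitors_finished": 13},
]

# Option 1: average tour time.
avg_time = sum(t["duration_minutes"] for t in tours) / len(tours)

# Option 2: share of tours that stay within the 30-minute limit.
within_limit = sum(t["duration_minutes"] <= 30 for t in tours) / len(tours)

# Option 3: average drop-out rate (visitors who left before the end).
drop_rate = sum(
    (t["visitors_started"] - t["visitors_finished"]) / t["visitors_started"]
    for t in tours
) / len(tours)

print(f"Average tour time: {avg_time:.1f} min")
print(f"Tours within 30 minutes: {within_limit:.0%}")
print(f"Average drop-out rate: {drop_rate:.1%}")
```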

You can see how each indicator has its pros and cons. It’s our job to choose the indicator that comes closest to illustrating our goal, and to reassess it over time.

Creating Indicators, it turns out, is part art and part science.

But here’s what I love most about Indicators – as we create them, we create a different kind of conversation.

Normally, we’re in action mode, attending to the recruiting, training, and supervising of our volunteers. If we confer with colleagues – say, about the way we train volunteers – it’s generally to solve an immediate problem.

When we work together on an Indicator, we are still solving problems. This time, though, our solutions are tied to the big picture. As we hash out how best to illustrate our objective, we end up discussing everything: training, scheduling, data collection, volunteer satisfaction. We must revisit our way of doing business and document how it contributes to our larger goals.

Measuring strategic impact is not a standard practice – at least not yet. But consider the potential: if we all committed to this process and these conversations, imagine the increase in credibility for volunteer management. Perhaps our measure of progress would be this:

An increase in Leaders of Volunteers who master the art of creating Indicators.

Volunteer Managers: arm yourself with influence, too! My September 12 webinar shows you how to bring others on board with your important ideas.

Check out ‘Achieving Buy-In for Your Volunteer Program’ at AchievingBuy-inforVolMgrs.eventbrite.com

4 Comments

  • Another really interesting blog and a great challenge to those of us trying to come up with those indicators of volunteer impact.

    I would add that we may also want to develop key indicators of the benefits of volunteering on the volunteer. Does listing volunteer experience on a resume lead to more interviews? Does volunteering help ward off feelings of isolation among retirees?

    I always describe the field of volunteer engagement as two halves of a circle. We have to ensure that volunteers are helping to fulfill the mission-related needs of the organization, but we also have to ensure the organization is meeting the needs of the volunteers. I would suggest that documenting the impact of a volunteer program should also address both halves of this circle.

    • Hi Laura, you raise some really good points. There are all kinds of things that are valuable to measure about the impact of volunteering on the volunteer. Taking indicators in that direction depends on the priorities of the strategic plan. A forward-thinking organization might include a priority on enhancing the volunteer experience.

  • What a great blog, and perfectly timed for today’s (9th August) Twitter Chat about Measuring Volunteer Impact. We’re on Twitter at 8 pm UK time with the hashtag #TTVolMgrs. It would be great if you would join us.
