How to find a VMS that actually meets your needs

Hint: it all comes down to one key element

If you are a CVA, you may have seen (or even answered) a question posed in the CVA Facebook forum by our intrepid colleague, Liza Dyer.

Liza took a census. She wanted to know how her peers organize their volunteer data and asked them to post an emoji to express how they feel about their current system.

This simple question generated tons of responses – 51 comments in total – and a string of emojis that ran the gamut from hearts and smileys to frowns and weepy faces.

I’m not surprised. Finding the right volunteer management system (VMS) matters when you’re tracking volunteers, coordinating with development, and (ideally) measuring your program’s impact. It’s the difference between spending minutes entering data and running reports from a single source, and spending hours manipulating a spreadsheet to pull out the relevant information.

There are a lot of VMS choices out there, probably something for every program, depending on your size and budget.

The question is: how do you select the right one?

Abbey Earich may have the answer for you.

Abbey Earich, Volunteer Recruitment and Retention Coordinator for the Smithsonian Office of Visitor Services, led the launch of a VMS shared by over 300 staff members in 19 museums.

Abbey is the Volunteer Recruitment and Retention Coordinator for the Smithsonian Office of Visitor Services (OVS). She has been instrumental in the rollout of a new Smithsonian-wide VMS – one used by over 300 staff who oversee volunteers in 19 museums, managing a total of 6,000 volunteers annually.

Imagine the consequences of choosing the wrong system.

Abbey served on the search committee and now acts as project lead for the new system, which launched in August 2018. Since then, she has overseen upgrades to the system – a process that is expected to be ongoing.

Chances are your volunteer program is smaller than the Smithsonian’s. Even so, the process that Abbey and her colleagues used to find, select, customize, and implement a new VMS is universal – and it’s one that organizations of any size may want to adopt.

Here’s Abbey’s advice:

1) Get diverse input from the start

The search for a new VMS began in 2016. One of the first steps was to assemble a committee to advise on the search. The committee was composed of users of the old system and staff who had expressed a need for a volunteer database.

The committee was intentionally pan-institutional. Abbey shared: “We wanted the most diverse range of users possible: public-facing staff, behind-the-scenes staff, people who used volunteers only for short-term projects, and those who managed volunteers long term.”

Having such a diverse committee made a “big difference in addressing everyone’s needs,” says Abbey. “In meetings, the group discussed what didn’t work in the old system and what they ideally wanted in the new database.”

 

2) Let experts do the research

With so many options available, the Smithsonian hired a consulting firm to research customizable products that fit their needs and budget. The consultants met frequently with the search committee to understand the range of needs and identify the best options. Because the Smithsonian is a federally funded organization, at least three bids were required before the final selection.

This entire process took about a year, as options for a robust platform were limited. It was a challenge to find a vendor who could meet all of the high-priority requirements in one system.

Each provider on the short list gave a demonstration to the committee, which then recommended a preferred vendor: Bespoke Software, with its VSys database. The final selection of Bespoke was made by the Director of OVS.

 

3) Get diverse input as you customize

Abbey’s map of the volunteer journey

After the selection was made, Abbey headed up an Implementation Team tasked with customizing the product. Once again, the team was intentionally pan-institutional to allow for a wide range of input.

The team met frequently over the next year, examining every single aspect of the volunteer journey to identify what was needed. In fact, Abbey still keeps the giant flow chart that she made to map out the volunteer journey: from how a volunteer learns about an opportunity, to what the application form needs to include, to how to track volunteer hours.

 

4) Come to consensus

“On every decision, we had to agree. We had to figure out what would work for everybody.”

For example, there was a lot of debate over the application form: should there be one universal form, or variations for the different units? The final decision was to create two universal forms, one for short-term volunteers and one for long-term volunteers.

“These conversations took a while. It was a really big lesson in compromising,” Abbey observed, “but also in advocating for your program, and understanding each other’s programs as well.”

 

5) Take plenty of time to test. And then test some more.

Once Bespoke customized VSys, the next step was extensive testing. The vendor transferred real data from the old system into a beta environment so that testing simulated the actual user experience.

“Our software consultant was also instrumental in the testing. They gave us standardized forms to test with. We had to go through the forms line-by-line and identify any errors.”

The Implementation Team conducted the testing, along with other staff and Smithsonian volunteers.

 

6) Treat your system as a living database

VSys may have launched last year, but the system continues to evolve. Abbey explained, “We’re still finding things that need to be changed. A user will say ‘Wait – I didn’t even think about this process, or this scenario, or this one element that I don’t have on the screen and we actually need.’” Changes to the system are expected to be ongoing.

There is a resource account where users and volunteers can share feedback and suggest upgrades. Abbey meets with the vendor weekly to review the feedback and discuss any immediate fixes. She also now leads a Change Control Team (CCT) with members from across the Smithsonian; this team meets quarterly to discuss requests and recommend changes to the system.

For example, volunteers have asked to see a list of the other volunteers signed up for their shifts. The CCT reviewed and approved this request: each volunteer’s first name and last initial will appear on the shift calendar for others to view. The vendor is now building this enhancement, and it should roll out to volunteers soon.

As you review these suggestions, you’ll see a recurring theme. Each stage of the process was incredibly inclusive, bringing together a diverse group of system users and volunteers. It’s the wide range of viewpoints – just as much as the planning, testing, and careful execution – that has led to a high-performing system.

“One of the reasons I was successful in this role is that I already had relationships with a lot of these people. They knew me, they trusted me, and I could act as a mediator in the conversation.”

Once again, it comes back to buy-in. Every part of your volunteer program – even the software you use – works best when everyone owns the outcome.

 


Ready to form your own database search committee? My Principles of Buy-In will keep you focused on success. Email me to receive a handout about the principles and a next steps worksheet – and I’ll add you to the Twenty Hats mailing list.
