Whenever museum staff get together, one of the first topics of conversation is visitor counts. Even free attractions like the National Museum of the Marine Corps live by the numbers, or so it would seem. Grant applications ask the question of us; so do donors, marketers, our boards, and our bosses. For the “how many” question, there is an easy answer. We count electronically at Quantico; many museums use the clicker method or count sales if they charge admission. After last year’s rough economic times, the Marine Corps Museum’s numbers have picked up. May through August 2010 were our best months since we opened in late 2006, and we welcomed our two millionth visitor this summer. But the “who” question is much harder to answer!
Visitor research and analysis is a specialized art in which I, for one, have had no formal training; remedying that is now high on my professional-development to-do list. But we didn’t let that stop us from trying to find out who was visiting the National Museum of the Marine Corps, where they were coming from, why, and anything else we could glean from survey data. And we’ve probably made mistakes in everything we’ve tried. The good news is that even with the learning curve, the data we’ve collected since summer 2007 have all meshed and have been validated from one survey to the next. Here’s what we’ve done, what we learned, and what we could have done better.
Part 1. Using summer college interns during our first season, we ran a quick exit survey and then did some visitor “stalking,” because we wanted to know what was going on in our galleries. The exit survey was pretty straightforward: where visitors were from, their age, military background or not, first visit or not, length of stay, and level of overall satisfaction. We also noted the sex of the respondents. Our interns interviewed every tenth person leaving the Museum, over a combination of mornings, afternoons, weekdays, and weekends. Then, with clipboards in hand and with signage in place advising visitors that we would be observing them in the WWII gallery, we tracked how long our guests lingered in various locations, whether they seemed to be reading labels, whether they looked up at the aircraft and other visuals overhead, how long they watched videos, and how much conversation took place among visitors who were with others. That was all a lot harder than it sounded!
In part two of this series, we’ll compare the exit survey data with a more formal collection of visitor information. The visitor observation work we did in 2007 is the only exercise of its kind we’ve tried, again spread over mornings and afternoons, weekdays and weekends. Over 40% of our visitors spend three or more hours in the Museum; that year we had three historical galleries in place, including WWII, our largest. The visitors we tracked spent about 25 minutes in that gallery. Videos, some of which run as long as 8 minutes, kept visitors engaged for only about 90 seconds. Very few folks picked up the headsets to listen to oral histories; fewer still seemed to look up and notice the artifacts and signage above them. Our guests did seem to read labels and share their reactions with those around them. They “socialized” the experience. Apparent families and assumed veteran groups discussed the exhibitions with the most animation.
Because we were in the midst of tweaking the design of three new galleries at the time, we put this information to use right away. We shortened new videos; angled the aircraft hanging from the ceiling as much as possible, lit them dramatically, and added engine audio; “dressed up” the oral history stations and shortened their run times; and made the new galleries more navigable for groups where we could. We left text length alone.
We need to do more visitor tracking, but our “stalkers” need more training to develop their powers of observation. The form on which we record observations needs to be streamlined and made more inclusive (we made observations at only a limited number of “stations”). We need to engage subjects as they exit the area to capture actual data about them, rather than make assumptions. And we need to plan on observing for longer periods; it is a labor-intensive exercise, and we seriously underestimated the time factor. Our young college students clearly stood out, so we should consider using a mix of older and younger observers. We did not see any significant variation in the observations based on time or day.
Who has experience with, and advice on, visitor tracking? What training aids have you used that were especially helpful? What did you do as a result of the data collected? Next time, we’ll look at visitor demographics.
—Lin Ezell, Director, NMMC, and VAM Council Member