Thursday, September 30, 2010

VOTE NOW FOR OUR 2012 CONFERENCE SITE

Every year as fall approaches, the VAM staff begins the process of determining where our next conference will be. Making this decision is a balancing act: we need to travel around the state, make sure different museums and communities get the chance to serve as local hosts, and keep to a very tight budget while meeting very specific space needs. In the past couple of years, many members have asked us how we choose a community or a hotel, so we decided to take you "behind the scenes" of our decision process.

There are several things we need to keep in mind as we look at hotels:
  • Do they have enough meeting space to provide our participants with ample classrooms and our exhibitors with a large enough exhibit hall?
  • Is the hotel able to provide us with a room rate for our participants that fits into their tight budgets, and conference food prices that fit within VAM's tight budget?
  • What does the area around the hotel offer in terms of museums--how far away are they, and is there a good variety for our participants to see?

When VAM was "younger," we were small enough to spread our meetings across many of the communities in Virginia--Fredericksburg, Lynchburg, Staunton, Charlottesville, even Alexandria. As we have grown into the largest state association in the country, our choices for hotel and location have shrunk. There are now only eight cities in the state with hotels that have appropriate meeting space for our conference--Richmond, Roanoke, Hot Springs, Reston, Crystal City, Williamsburg, Portsmouth and, with a brand new hotel, Newport News.

With Roanoke as our goal for 2013, 2012 gave us the opportunity to look into Northern Virginia for a conference site--we haven't been to that part of the state since 2004. At the same time, the new hotel in Newport News, as well as the museums there, offered us a wonderful chance to try that site.

Let us know what you think. Because the 2011 conference is in Portsmouth, we do not want members in other parts of the state to feel slighted by a return to the Tidewater two years in a row; at the same time, we want to do what is best for VAM and for our members.

We'd love to have your opinion on 2012. VOTE NOW!

Tuesday, September 28, 2010

Welcome to the 2011 VAM Conference

Welcome to historic Olde Towne Portsmouth, Virginia, site of the 2011 Virginia Association of Museums’ annual conference. Known for its charm and central location in Hampton Roads, Olde Towne Portsmouth is visited by travelers and residents alike for its abundance of museums and attractions, restaurants, specialty shops, and arts and antiques galleries. Its pedestrian-friendly atmosphere offers visitors a chance to stroll its streets and take in scenic waterfronts, historic homes, beautiful churches, lush gardens and more. Conference planning will take VAM participants throughout this cultural locality during the four-day conference.



Evening Socials
“Ain’t we got fun?” Three evening events, beginning on Saturday, will let you explore beyond your stay at The Renaissance.
• Evening 1: This year’s scholarship reception will take place at the Hill House, headquarters of the Portsmouth Historical Association and the only historic residence in Olde Towne that is open to the public. This 1825 four-story English Basement-style home contains all the belongings and furnishings of the Hill family, who lived there for over 150 years.
• Evening 2: Enjoy food, fun and frolic along the historic Seawall where events will take place at the Lightship, Naval Museum, Skipjack Nautical Wares and Riverview Gallery. Pirates, music, naval and maritime history, art and “The Colonel” will be featured. For sports lovers, Selection Sunday will take place at the Sports Hall of Fame.
• Evening 3: Relax and have fun strolling from the Courthouse Galleries to the Visual Arts Center, Children’s Museum of Virginia and Virginia Sports Hall of Fame. This progressive event includes hors d’oeuvres, dinner, dessert and musical entertainment. And, if you need further enticement, bring your loot—all the venue gift shops will be open.

Daytime Activities
Whether you enjoy art or history, two concurrent afternoon events on Saturday will keep you engaged.
• Event 1: On the roof of Tidewater Community College’s Visual Arts Center, Ed Francis, Chair of the Glassblowing Department, will demonstrate the glassblowing process and work with attendees in the creation of a paperweight.
• Event 2: Author and historian Dean Burgess will take a group on a Steeple-to-Steeple Church and Synagogue Tour that will feature the unique architecture and interiors of St. Paul’s Catholic Church, Trinity Episcopal Church and the Jewish Museum and Cultural Center.

Remember VAM's last conference in Portsmouth?



Shelley Brooks, Co-chair, Local Arrangements Committee, VAM Conference 2011

Wednesday, September 22, 2010

Visitor survey work: What did our visitors think? (3 of 3)

In my first two blogs, I wrote about our recent work with surveys and visitor tracking. In this column, we’ll take a look at what the data had to say about customer satisfaction.

In our first quick exit survey, we simply asked our visitors for an overall judgment of their experience that day. We gave them a scale running from outstanding to unsatisfactory and asked them to select one of the five gradations. Pretty shallow, but it provided a snapshot. The AASLH survey, however, asks many questions from a variety of angles to get at visitor satisfaction, the value of the visit, the quality of education and entertainment, and the difference the visit made. We received valuable feedback, and much of it is also useful for marketing purposes and for donor/grant appeals. In 2009-10, we re-asked some of the AASLH questions on an on-line survey. Here’s some of what we learned from visitors to the National Museum of the Marine Corps:

When it came to overall satisfaction with their visit, 76% reported that the Museum was outstanding; 22% excellent; and less than 1% each reported satisfactory, marginal, or unsatisfactory. This rating is supported by the AASLH data; of a possible high score of 10, those surveyed gave the Museum a 9.5 as an overall rating.

When it came to visitor satisfaction with staff and volunteers, 71% reported it to be outstanding; 21% excellent; 5% satisfactory; less than 1% each marked marginal or unsatisfactory; and 2% had no opinion. This rating is supported by the AASLH data; of a possible high score of 10, those surveyed gave the Museum’s staff and volunteers a 9.3 rating.

When asked if they would return for another visit, 81% indicated that it was very likely that they would do so; 15% said it was somewhat likely; while 4% said no. This rating is supported by the AASLH data; of a possible high score of 10, those surveyed gave the Museum a 9.3 rating for a return visit.

When asked if they would recommend the Museum to others, a very high 98% said that it was very likely that they would do just that! This rating is supported by the AASLH data; of a possible high score of 10, those surveyed gave the Museum a 9.6 rating when it came to making a recommendation to others.
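For readers who like to tinker: the 0-to-10 composites above come from AASLH’s statistical partner, but purely as an illustration of how a categorical breakdown like our exit-survey percentages can be condensed into a single score, here is a quick Python sketch that maps the five labels onto a 0-10 scale and takes a weighted average. The label-to-value mapping and the sub-1% splits are assumptions for the example, not AASLH’s actual formula.

```python
# Illustrative only: condense a five-point breakdown into a single 0-10 score.
# The label-to-value mapping is an assumption for this sketch, not the
# formula AASLH's statistical partner actually uses.

scale = {"outstanding": 10, "excellent": 7.5, "satisfactory": 5,
         "marginal": 2.5, "unsatisfactory": 0}

# Overall-satisfaction breakdown from our quick exit survey (percent);
# the sub-1% categories are filled in with made-up splits for the example.
responses = {"outstanding": 76, "excellent": 22, "satisfactory": 0.7,
             "marginal": 0.7, "unsatisfactory": 0.6}

total = sum(responses.values())
score = sum(scale[label] * pct for label, pct in responses.items()) / total
print(f"Weighted score: {score:.1f} out of 10")  # roughly 9.3 with these numbers
```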

Other AASLH data points of interest include a rating of 9.4 out of 10 for the Museum’s unique learning environment and a 9.3 for the positive impact it had on visitors. We also learned that our “brand name” wasn’t very familiar yet; that once inside the Museum, people had difficulty navigating from one gallery to the next and within the galleries; and that most people heard about us by word of mouth (so are the expensive advertising, rack card distribution, and other marketing efforts worth it?).

The on-line survey was by invitation, as explained in an earlier blog, and we were not consistent in how we passed out our keepsake postcard invitations to Museum visitors. Our first game plan was for docents to hand out the card to every 5th visitor every day during mid-day hours until 250 cards had been passed out. Because of variations in docent schedules and assignments, there is a revolving door of docents, students, and Marines at the information desk, and it proved impossible to stick to our plan. For two early months, we under-executed our invitation give-aways. Then we ran out of cards because we weren’t paying attention to our stock, resulting in another few weeks in which we had little data. We spent most of the year-long cycle passing out our daily allotment along with the Museum brochure as folks entered. And sometimes over-zealous information desk staff passed cards out all day long, not just to the first 250. Thus, the exercise was not as scientific as we had planned.
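For anyone scripting a similar give-away plan, here is a minimal Python sketch of the systematic sampling we were aiming for: every 5th visitor receives an invitation card until the card supply runs out. The function name and the visitor counts are illustrative assumptions, not part of our actual procedure.

```python
# Illustrative sketch: systematic sampling of visitor invitations.
# The interval and card budget are assumptions for the example,
# not NMMC's actual figures.

def invitation_plan(visitor_count, interval=5, cards_available=250):
    """Return which visitors (by 1-based arrival order) should receive
    an invitation card: every `interval`-th visitor, stopping when the
    card supply is exhausted."""
    selected = []
    for arrival in range(interval, visitor_count + 1, interval):
        if len(selected) >= cards_available:
            break  # out of cards -- the very problem we hit in practice
        selected.append(arrival)
    return selected

# Example: a 1,200-visitor day yields 240 invitations, within the budget.
print(len(invitation_plan(1200)))  # -> 240
```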

And I was disappointed in the small number who responded on their computers once they got home, although our statistically minded colleagues tell us that capturing feedback from even 2-3% of our visitors is “statistically relevant.” We did much better than that, but I was hoping for a 10% response rate, not the 5 to 8 percent that we got.
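As a rough illustration of why a small response fraction can still yield usable numbers, here is a back-of-the-envelope margin-of-error calculation in Python. The response count in the example is invented; the point is that the margin of error depends mostly on the absolute number of completed surveys, not on the percentage of visitors who reply.

```python
import math

def margin_of_error(n_responses, p=0.5, z=1.96):
    """Approximate 95% margin of error for a survey proportion,
    using the normal approximation; p=0.5 is the worst case."""
    return z * math.sqrt(p * (1 - p) / n_responses)

# Invented example: 300 completed surveys out of roughly 6,000 invitations.
print(f"{margin_of_error(300):.1%}")  # about +/-5.7%
```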

The survey is still active on our site; we’re letting folks who find it on our home page take it, so we continue to get a trickle of feedback around the clock. Survey Monkey is not expensive; the reported data is usefully presented; and it leaves the heavy lifting to others (but we did have staffing support from the institutional assessment team at Marine Corps University, and it would have been much more labor intensive without them). You also have to know how to write the questions and frame the answer choices so you get the data you’re looking for. I think this automated tool has much promise. Next time, I want to use terminals in the Museum itself for the survey, along with trained staff and volunteers who will direct traffic to the stations. But I think we’re asking the right questions. Take a look for yourself at www.usmcmuseum.org and tell me what you think.

And our visitors are not shy. I bet yours are just as vocal. On both the AASLH survey and the on-line instrument, we left opportunities for folks to give us their comments. And comment they did! While the stats are valuable, the staff made greater use of these comments. We have gone over them periodically, spotting trends and repeat suggestions. Senior staff members are taking the lead on being responsive to these comments. We’ve tried new approaches to signage and way-finding based on the surveys; we’ve added benches; we have altered some of our information desk procedures to be more helpful to guests with special needs; and we have a growing list of suggestions for new exhibition topics (in spades!!). We definitely want to keep those comments coming (in addition to the comment cards available at the desk).

What have you done in the way of on-line or in-house computer-based surveys? What was the most useful suggestion you ever received? What was your worst mistake, one that we might learn from and avoid ourselves? I’ll let you know more as we continue at the Marine Corps Museum to get to know our visitors better. We are in the education business. But we are also in the business of providing customer service, and we’ll keep on trying to get better and better at both.

—Lin Ezell, Director, NMMC, and VAM Council Member.

Three Scholarships Available

The Virginia Foundation for the Humanities is offering three free tickets (normally $575 each) to edUi 2010 for cultural heritage organizations serving Virginia. edUi is a conference on web design and development aimed specifically at web professionals serving institutions of learning, taking place November 8-9 in Charlottesville, VA.

Preference will be given to small museums, cultural heritage organizations, historical societies and similar organizations serving Virginia with budgets that make attending edUi impractical. Additional consideration will be given to individuals or organizations poised to put what they learn to use on specific projects that will forward VFH's efforts to discover and share untold stories, encourage lifelong learning, and promote civil discourse through:
• Commonwealth Outreach: public events, seminars, exhibitions, educational resources, and long-term initiatives statewide.
• Scholarship Serving the Public: opportunities for scholars to reach broad audiences in Virginia, the nation, and the world.
• Digital and Media Initiatives: culturally relevant content produced for public radio and Internet distribution.

The application deadline is Friday, October 8. Apply online at http://eduiconf.org

Monday, September 13, 2010

Visitor survey work: Who are our visitors? (2 of 3)

In our first discussion, we looked at the National Museum of the Marine Corps’ first experiment with visitor tracking within one of its exhibit galleries. Now we look at data collected during a quick exit survey and through the Museum’s participation in a more formal program. In 2007-08, the Marine Corps Museum signed up for what is now called “Visitors Count!” with the American Association for State and Local History. The survey has been used by a wide range of museums across the country in recent years; the instrument is tried and true, statistically proven, and allows museums to compare their results with other organizations in their field or region. Participants have the option to include a few questions of their own, specific to their needs, but the instrument is provided by AASLH, the results are analyzed by AASLH’s statistical partner, and AASLH provides very specific guidelines on its execution. Institutional members get a discount on the survey program’s fees.

Our third major initiative was to use Survey Monkey, inviting our visitors to go to our web site upon their return home and answer some questions about their recent visit. We used an attractive keepsake-quality 5x7 postcard as our invitation, which we handed out to visitors either as they entered or as they exited. Most of the questions were similar to our earlier survey work; others were tailored to help our foundation partner gather data about visitor satisfaction in the museum store and food venues. We added a question about family income for the first time at the suggestion of our development team, who kept seeing that question come up on grant applications. And we ran the survey for a full year (June 2009-May 2010) to see if there were seasonal differences (there were not many). And we looked at zip code information gathered by the store. We learned a lot!

The average male-to-female visitor ratio was 64:36 (67:33 for the on-line survey).

Of visitors surveyed on-line, 22% were in the 26-45 age group, 47% were 46-65, and 17% were 66-75. While the age categories were slightly different, this data coincided with that collected during the AASLH survey: 34% were age 19-34, 25% were 35-54, and 25% were 55+. Surveys generally did not include the large numbers of active duty Marines, who are generally younger than 26, or organized school and youth groups. These percentages reflected the general adult visitor population.

An overwhelming percentage of our visitors were white (89%). Again, surveys generally did not include the large numbers of active duty Marines and school groups, whose ethnic backgrounds are more diverse than those of our general adult visitor population.

Many of our visitors hailed from VA (21%), followed in decreasing order by PA, MD, FL, NC, NY, CA, NJ, and TX. Most of our visitors traveled from nearby Mid-Atlantic states and from those states with the largest overall populations and largest populations of retirees. These numbers coincided with zip code data collected by the Museum Store.

Of the visitors surveyed, 36% reported being Marines, past or present. Again, surveys generally did not include the large numbers of active duty Marines who visit the Museum in organized groups. Slightly fewer visitors reported being Marines during the first half of the year.

In 2009-10, 41% of visitors reported staying in area hotels and campgrounds; in 2007-08, the number was 22%, a marked change.

Our visitors were generally equally distributed across all income levels, with a small increase in visitors reporting an annual income of $46-75K in January-June and a slight increase in those reporting $100K+ in September-December.

Our visitors spend several hours at the Museum: 1% were here for less than an hour; 21% spent 1-2 hours; 34% spent 2-3 hours; 25% spent 3-4 hours; and 19% were at the Museum for 4+ hours!

In 2009-10, 64% of our visitors were first-timers. (The reported number in 2007-08 was 79%; we would expect this number to be highest shortly after opening and decrease thereafter if the Museum is attracting repeat visitors.) In 2009, 16% of those surveyed were on their second visit, 11% on their third, and 10% reported having been to the Museum four times or more!

61% of visitors surveyed indicated that they had visited NMMC’s web site.

37% of visitors surveyed indicated that they had eaten in one of the restaurants on their visit, and 70% reported buying something from the Museum Store. Both of these numbers are significantly higher than the capture rates reported by the two revenue-generating venues; capture rates are based on total visitation, which includes Marines, students, and other populations not surveyed.

You’ll recall from my first blog that we first used a quick 6-question exit interview. Of our 6 questions, we wasted one. Obviously, since we had just opened, most visitors would be there for the first time; no need to ask that one. Age ranges differed a bit from survey to survey. We recommend using the AASLH breakdown so that all data meshes; they came up with this breakdown based on years of research, so there was no need for us to try something different. Our interns should have been trained more thoroughly, and a mix of older docents working with the interns might have also netted us more validity with our older visitors, some of whom seemed not to take them seriously. And we believe that we could have gone up to 10 questions and still remained within most subjects’ comfort zone for a stand-up interview.

For the AASLH survey, we needed a relatively quiet space where our visitors could take this longer, more complex survey. Randomly selected guests fill out the multi-page questionnaire themselves; it takes several minutes of careful reading; many of the questions require a sliding-scale answer; and some questions are asked more than once from a different perspective. We chose to use one of our restaurants as the survey site; about 25% of our visitors visit one of the two restaurants on the second deck. Visitors, especially during the first 18 months or so, perceived the Mess Hall to be over-priced. By conducting our survey upstairs, we biased the results toward visitors who could afford the prices and the time to eat at the Museum. It was also hard to be completely random, and we may have gravitated to visitors who seemed more receptive and friendly. It’s hard not to, because you’re asking for a valuable commodity: their time. If they appear cool or unhappy, you believe your chances of success are not good. We will repeat this survey next year, and we’ll find another physical place for the interviews and work on our objectivity. And our little appreciation gift—a laminated bookmark—is definitely not appropriate for all age groups. This survey was also executed on weekends and weekdays, primarily mid-day, over two seasons.

Who has exit survey experiences they would like to share? Check out AASLH’s site for more information on “Visitors Count!” The next blog will look at how we measured visitor satisfaction and at the National Museum of the Marine Corps’ experiences with an on-line survey.

—Lin Ezell, Director, NMMC, and VAM Council Member.

Wednesday, September 8, 2010

Visitor survey work: How many and who are they? (1 of 3)

Whenever museum staff get together, one of the first topics of conversation that comes up is visitor counts. Even if we’re a free attraction, like the National Museum of the Marine Corps, we live by the numbers, or so it would seem. Grant applications ask the question of us; so do donors and marketers and our boards and bosses. For that question, there is an easy answer. We count electronically at Quantico; many museums use the clicker method or count sales if they have paid admissions. After last year’s rough economic times, the Marine Corps Museum’s numbers have picked up. May through August 2010 were our best months ever since we opened in late 2006, and we welcomed our two millionth visitor this summer. But the “who” question is much harder to answer!

Visitor research and analysis is a specialized art in which I, for one, have had no formal training, and it is now high on my professional improvement to-do list. But we didn’t let that stop us from trying to find out who was visiting the National Museum of the Marine Corps, where they were coming from, why, and anything else we could glean from survey data. And we’ve probably made mistakes at everything we’ve tried. The good news is that even with the learning curve, the data we’ve collected since summer 2007 have all meshed and have been validated from one survey to the next. Here’s what we’ve done and what we learned—and what we could have done better.

Part 1. Using summer college interns during our first season, we did a quick exit survey and then some visitor stalking, because we wanted to know what was going on in our galleries. The exit survey work was pretty straightforward: where from, age, military background or not, first visit or not, length of stay, level of overall satisfaction. And we observed the sex of the respondents. Our interns interviewed every 10th person as they left the Museum, over a combination of mornings, afternoons, weekdays, and weekends. Then, with clipboards in hand and with signage in place advising visitors that we would be observing them in the WWII gallery, we tracked how long our guests lingered in various locations, whether they seemed to be reading labels, whether they looked up at the aircraft and other visuals overhead, how long they watched videos, and how much conversation was taking place among visitors who were mingling with others. That was all a lot harder than it sounded!

In part two of this series, we’ll look at the exit survey data as it compares to a more formal collection of visitor information. The visitor observation work we did in 2007 is the only exercise of this kind we’ve tried. We followed visitors over a combination of mornings and afternoons, weekdays and weekends. Over 40% of our visitors spend 3+ hours in the Museum; that year we had three historical galleries in place, including WWII. The visitors we tracked spent about 25 minutes in this gallery, our largest. Videos, some of which run as long as 8 minutes, kept visitors engaged for 90 seconds. Very few folks picked up the headsets to listen to oral histories; fewer still seemed to look up and notice the artifacts and signage above them. Our guests did seem to read labels and share their reactions with others around them. They “socialized” the experience. Apparent families and assumed veteran groups discussed the exhibitions with the most animation.
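If you try this kind of tracking yourself, the bookkeeping is simple enough to script. The Python sketch below, with invented observation records, shows one way to log a row per visitor per stop and average the dwell time at each station; the station names and times are placeholders, not our actual tracking form.

```python
from collections import defaultdict

# Invented observation records: (visitor_id, station, seconds_spent).
# In practice these rows would come from the trackers' clipboard forms.
observations = [
    (1, "intro video", 95),
    (1, "oral history headset", 30),
    (2, "intro video", 80),
    (2, "overhead aircraft", 15),
    (3, "intro video", 120),
]

def average_dwell_times(records):
    """Average the seconds spent at each station across tracked visitors."""
    by_station = defaultdict(list)
    for _visitor, station, seconds in records:
        by_station[station].append(seconds)
    return {station: sum(times) / len(times) for station, times in by_station.items()}

for station, avg in sorted(average_dwell_times(observations).items()):
    print(f"{station}: {avg:.0f} seconds on average")
```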

Because we were in the midst of tweaking the design of three new galleries at the time, we used this information right away. We shortened new videos; angled aircraft hanging from the ceiling as much as possible, dramatically lit them, and included engine audio; “dressed up” oral history stations and shortened run times; and made the new galleries more navigable for groups where we could. We left text length alone.

We need to do more visitor-tracking, but our “stalkers” need more training to develop their powers of observation. The form on which we record our observations needs to be more streamlined and more inclusive (we made observations at a limited number of “stations”). We need to engage the subjects as they exit the area to capture actual data about them, rather than make assumptions. And we need to plan on observing for longer periods of time; it is a labor intensive exercise. We seriously underestimated the time factor. Young college students clearly stood out; we should consider using a combination of older and younger observers. We did not see any significant variations in the observations based on time and day.

Who has experience and advice on visitor tracking? What training aids have you used that were especially helpful? What did you do as a result of the data collected? Next time, we’ll look at visitor demographics.

—Lin Ezell, Director, NMMC, and VAM Council Member