June 26, 2020 / News & Blog
The Children’s Museum of New Hampshire was one of six New England museums (and the only children’s museum) that took part in the Assessing Museum Impact study. Museum president Jane Bard reflects on what she and her team learned in the process and how they are applying that knowledge to build a more sustainable operation.
Why did you decide to become involved with the project? How much time/effort was involved?
Collecting and analyzing data is important, but in the grand scheme of all we do, evaluation consistently moves to the bottom of our to-do lists. I say “our” because, in order to be effective, the commitment to engage in evaluation involves all museum departments. The project took eighteen months and included participation from several museum departments for an average of ten hours per month.
What data did you collect, which departments did most of the collecting, and what collection methods had you been using?
Prior to our involvement with the project, we collected financial and zip code data and used both paper and online surveys to gather feedback from visiting families, teachers, and program participants. The visitor services and marketing departments conducted most of the data collection, with input from the education and development departments for specific projects or grants. This data gave us basic details and sometimes valuable information that helped inform decisions. However, there were gaps, especially in data that could measure our success in reaching the goals set forth in our strategic plan. The AMI project challenged us to engage our entire staff in determining what data we could and should collect, and how it could help guide our decision making across departments.
What did you learn about using data effectively?
We tested some of our routinely made claims and assumptions about the museum, carefully reviewing the language used in those communications. We also reviewed which claims we should test, such as: Are we reaching a statewide audience? Are we fulfilling our mission and vision? Are our programs achieving their stated impacts in the community, with children, and with low-income families? Is the museum the economic engine we claim it to be?
What were some of the data collection challenges, and how did you overcome them?
Primary challenges included collecting data from several departments, using multiple tools, and then trying to determine how to synthesize, analyze, and share it all in useful ways. We overcame this by dedicating two all-staff meetings to delving into the specifics: what data we wanted to collect, how we would collect it, and who would lead each collection effort. We then assigned one staff member (who loves data!) to compile and share all the data. Another challenge was getting parents and teachers to return surveys. For parents, we incentivized participation by periodically raffling off a museum membership to those who completed surveys. For visiting teachers, we eventually gave up trying to collect paper surveys and are now finding greater success by sending them a link to an online post-visit survey with results automatically tallied in a Google doc.
What did you learn about data collection? How did you come up with indicators that would determine success (or not), and then what data collection methods did you use to collect that information?
We used our current strategic plan pillars to develop success indicators. For example, one of our strategic goals was to expand and deepen our impact. How do we measure that? To show audience expansion, we gathered the following quantitative data:
Much of this data was collected using our Altru CRM system and is now being used to target marketing efforts to specific towns and audiences where we know, thanks to census data, there is room for growth.
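The room-for-growth comparison described above — visitor counts from the CRM set against census population by town — can be sketched as a simple per-town penetration rate. This is an illustrative sketch only; all town names and figures below are hypothetical, not the museum's Altru data.

```python
# Sketch of town-level "room for growth" analysis.
# Town names and all numbers are hypothetical illustrations.

visitors_by_town = {"Dover": 4200, "Durham": 900, "Rochester": 650}
census_population = {"Dover": 32000, "Durham": 15000, "Rochester": 33000}

# Penetration rate: share of each town's population that visited.
penetration = {
    town: visitors_by_town[town] / census_population[town]
    for town in visitors_by_town
}

# Towns with the lowest penetration have the most room for growth,
# so marketing effort can be targeted there first.
growth_targets = sorted(penetration, key=penetration.get)
print(growth_targets)
```

Sorting by penetration (rather than raw visitor counts) keeps a large town with many visitors but low per-capita reach near the top of the target list.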
To gauge how well we were meeting the second part of that goal—to deepen our impact—we gathered staff observations as well as qualitative data through surveys of parents, teachers, and children. Survey questions probed changes in understanding of subject matter, changes in observed behavior and skills, and changes in learning approaches following a museum experience.
We also used our AMI project data to answer the big question: are we fulfilling our mission and vision? Our mission is to actively engage families in hands-on discovery, and our vision is to inspire all to become the next generation of innovators and creative thinkers. Much of the qualitative data we gathered served a dual purpose in helping us determine 1) success in achieving our mission and vision, and 2) course adjustments that might be needed.
There are some very sophisticated data collection methods available (e.g. geomapping), but primarily for large organizations with well-developed skills and capacity. What simple but effective methods did you devise to collect useful data?
We experienced an “aha” moment during this project when our advisors validated the idea that “snapshots” of data can be as valuable as year-long views, and that collection methods do not need to be sophisticated. We tested this idea during a Free Family Day hosted to celebrate the museum’s thirty-fifth anniversary. We wanted to know how many first-time visitors this event drew and whether it attracted local families or expanded our geographic reach. We set up large pieces of poster board and gave families stickers to post indicating whether this was their first time to the museum or whether they had visited before and where they lived. This quick and simple method of collecting data did not overburden our staff (on a very busy day with more than 2,000 visitors!). At a quick glance, it showed us that about 50 percent were first-time visitors and represented towns throughout the state of New Hampshire, as well as southern Maine and northern Massachusetts. We also gave these stickers to everyone, using a different color for members, and then counted how many were left. Since we knew how many we printed, we could easily count the number of overall visitors and how many were museum members.
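The sticker arithmetic described above — stickers printed minus stickers left over, tallied per color — can be sketched like this. The counts below are hypothetical placeholders, not the event's actual figures.

```python
# Sketch of the sticker-count method: one color per visitor category.
# All counts are hypothetical, not the museum's actual event data.

printed = {"member": 500, "nonmember": 2500}    # stickers printed, by color
left_over = {"member": 150, "nonmember": 700}   # stickers remaining at close

# Stickers handed out = printed minus leftovers, per color.
handed_out = {color: printed[color] - left_over[color] for color in printed}

total_visitors = sum(handed_out.values())
member_visitors = handed_out["member"]
print(total_visitors, member_visitors)
```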
You have said that data is used to confirm what you think is happening and inform decisions. Did any new data reveal any surprises?
Data from our point of sale (POS) system showed hourly visitation patterns: the largest percentage of families started their museum visit between 10:00 a.m. and 11:00 a.m., our first hour of operations. Through a Constant Contact survey of our members, as well as a Facebook survey of the general public, we found overwhelming interest in earlier opening hours. Many respondents said this would allow them to visit more often, work better for their family’s schedule, and increase their interest in renewing their membership. The museum changed to an earlier opening in September of 2019, and now 13 percent of families start their visit between 9:00 a.m. and 10:00 a.m. Although we can’t prove causation, we have also experienced a 10 percent growth in family memberships during the same period.
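The hourly visitation pattern described above amounts to bucketing POS check-in timestamps by arrival hour. A minimal sketch, assuming hypothetical timestamps rather than an actual POS export:

```python
# Sketch of bucketing check-in timestamps into hourly arrival counts.
# Timestamps are hypothetical examples, not real POS records.
from collections import Counter
from datetime import datetime

checkins = [
    "2019-10-05 09:12", "2019-10-05 09:47", "2019-10-05 10:03",
    "2019-10-05 10:30", "2019-10-05 10:55", "2019-10-05 11:20",
]

# Count family visits by the hour they arrived.
arrivals_by_hour = Counter(
    datetime.strptime(t, "%Y-%m-%d %H:%M").hour for t in checkins
)

total = sum(arrivals_by_hour.values())
share_9am = arrivals_by_hour[9] / total  # fraction arriving 9:00-10:00
print(dict(arrivals_by_hour), round(share_9am, 2))
```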
What aspect of museum operations, or which museum staff, is underutilized in data collection? What are some unturned stones of useful information?
A “secret source” of data that surprised me, but may not surprise our marketing colleagues, is the aggregate of comments, reviews, likes, shares, and interest generated by the museum’s social media content. Paying attention to social media metrics helps us gauge what resonates with our audience, as well as what doesn’t. We also use Google Analytics to track engagement with our website pages. With a little time, effort, and training, the information we have gathered has helped us understand how our visitors use information on our site, and that pattern is constantly evolving. A nonprofit Google grant helped us set up a Google Ads account, which gives us access to $10,000 of monthly advertising on Google Search. With the help of an outside consulting company, we’ve been able to see how people search for information about us, how we can reach new audiences, and how to better get them in the door for a visit.
It’s been said that “numbers without stories have no humanity; stories without numbers have no accuracy.” Has any of your new data affected your story?
New data hasn’t changed our story; it has reinforced the story we were already telling. We recently conducted a survey in an effort to measure our impact on the lives of children and families. We asked members and program participants to rate the accuracy of several statements on a scale from strongly agree to strongly disagree. Between 80 and 100 percent of parents and caregivers strongly agreed with the following statements:
Being able to back up our claims with data has been powerful, both for our staff who feel their efforts are validated, and for supporters, who are making funding decisions based on data we are able to provide.
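The survey tally behind percentages like those above can be sketched as a count of “strongly agree” responses per statement. The statements and responses below are hypothetical placeholders, not the museum's actual survey items or results.

```python
# Sketch of tallying Likert-scale responses into a per-statement
# "strongly agree" percentage. All data here is hypothetical.

responses = {
    "My child tries new things at the museum": [
        "strongly agree", "strongly agree", "agree", "strongly agree",
    ],
    "Museum visits spark conversations at home": [
        "strongly agree", "agree", "strongly agree", "strongly agree",
    ],
}

pct_strongly_agree = {
    statement: 100 * answers.count("strongly agree") / len(answers)
    for statement, answers in responses.items()
}
print(pct_strongly_agree)
```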
This article was originally written before the COVID-19 crisis swept the world. Did your participation in the Assessing Museum Impact project provide any additional resources to help you plan for our uncertain future, which now includes preserving your institution while navigating a path to reopening?
The article still stands, although it now seems like it was written a lifetime ago. Beginning in March, we have been using surveys to gather data to help us make decisions about the direction of our summer programming (traveling library programs and summer camps), and we will be surveying our audience about their intent to visit the museum once we have a possible opening scenario. We are also closely monitoring our social media engagement to see what is resonating with families and educators so we can hone our virtual offerings to both support our mission and best serve our audience.