
The case study below highlights two summative evaluations RK&A conducted at the California Academy of Sciences (CAS). Both exhibitions debuted in the new CAS building that opened in San Francisco’s Golden Gate Park in September 2008. Although the exhibitions use different interpretive methods and target different learning outcomes, together the two projects highlight the importance of exhibition introductions.

Water is Life and Altered State: Climate Change in California [2010]

Summative evaluations of two exhibitions for a natural history museum

The California Academy of Sciences (CAS) contracted Randi Korn & Associates, Inc. (RK&A) to evaluate two exhibitions debuting in CAS’s new facility. One exhibition, Altered State: Climate Change in California, uses fossils, interactive technology, and live animals “to explore the science of climate change, the effects we may expect to see in our own backyard, and the steps that can be taken to mitigate these dramatic changes,” while the other, Water is Life, explores the importance and diversity of water using the Steinhart Aquarium’s Living Collection.

How did we approach this study?

We believe that each evaluation study is unique and should be grounded in the goals and objectives of the exhibition, program, or other endeavor. As such, RK&A worked closely with CAS to identify its goals and objectives for each exhibition. For Water is Life, CAS’s goals centered on visitor learning. In response, RK&A conducted a remedial evaluation to identify operational and conceptual shortcomings that could still be corrected, followed by a summative evaluation that employed a rigorous, modified pre-test/post-test design to measure visitor learning. For Altered State, CAS sought to understand what visitors did in the exhibition as well as what they took away from their experiences; thus, RK&A conducted timing-and-tracking observations and in-depth exit interviews.

What did we learn?

While many findings were exhibition-specific, two larger trends emerged. First, both evaluations show that visitors had strong affective experiences in the exhibitions, although learning objectives were challenging to meet. For instance, in the Water is Life evaluation, visitors who saw the exhibition demonstrated much greater interest in and concern for the natural world than visitors who did not; however, there were few differences in knowledge between the two groups.

Second, visitors need a strong physical and conceptual introduction to exhibitions. In the Water is Life remedial evaluation, some interviewees described wayfinding issues, and a few specifically asked for a better introduction. Further, in Altered State, the open exhibition design, with its multiple entry and exit points, likely contributed to low dwell times.

What are the implications of the findings?

The evaluations are a keen reminder that exhibition introductions are imperative. A well-conceived and well-executed introduction sets the conceptual stage for visitors: it can convey overarching concepts, connect subthemes, and present the intent of the exhibition. On a similar note, while an open exhibition design offers visitors a free-choice learning environment, introducing some structure and direction might help those seeking to understand the exhibition’s “big idea” without compromising the free-choice quality. Additionally, the studies reaffirmed that the unique value of exhibitions lies in the strong affective experiences they prompt.


RK&A’s work with the Pérez Art Museum Miami (PAMM) is today’s featured project on the Museum Education Monitor’s (MEM) social media sites! RK&A has been working with PAMM since 2013 to evaluate its Knight School Program, a single-visit program designed to serve all third-grade students in Miami-Dade County Public Schools. We began our work together by helping staff articulate and clarify student outcomes and indicators; the program intends to enhance students’ critical thinking skills related to observing and interpreting works of art. We are now conducting a formative evaluation that will identify the program’s strengths and areas in need of improvement before we conduct a summative evaluation in 2015.

Check out the MEM posting for some additional information by visiting these social media sites today!

Web – http://www.mccastle.com/Public/Default.aspx

Facebook – http://www.facebook.com/Museum.Education.Monitor

Twitter – http://twitter.com/mchriscastle

Pinterest – http://pinterest.com/mchriscastle/

YouTube – http://www.youtube.com/user/MChrisC54

FORUM Blog – http://forum.mccastle.com/


The case study below is from a project RK&A did with the Museum of Science and Industry (MSI) in Chicago, IL, and highlights the importance of iterative testing.

Future Energy Chicago Simulation [2013]

An evaluation of a multimedia simulation for a science museum

Between 2012 and 2013, RK&A conducted four rounds of formative evaluation of the Future Energy Chicago simulation for the Museum of Science and Industry, working in collaboration with the design firm Potion. In the simulation, up to five teams compete against each other in five games: Future House, Future Car, Future Neighborhood, Future Power, and Future Transportation. Players make decisions that challenge them to think about energy production and usage, and they receive badges as rewards for selecting energy-efficient choices.

How did we approach this study?

RK&A included primarily middle school youth (including home-school groups) in testing, as they are the target audience for Future Energy Chicago. Each round of evaluation explored issues relevant to a particular design phase. In the first round, RK&A tested three-dimensional paper prototypes of each game to explore middle school youth’s understanding of the concepts presented. In the next two rounds (alpha and alpha prime), RK&A tested the games on touch-screen monitors to explore each game’s functionality as well as youth’s motivations and learning; these rounds also tested a badge system aimed at rewarding energy-efficient choices. In the last round, RK&A tested the games using a combination of multi-touch and projection technology that closely mirrored the final simulation environment. For each round, RK&A staff conducted observations and interviews with middle school youth who played the games.

What did we learn?

Each round of evaluation revealed successes and challenges of the Future Energy Chicago games, which MSI staff and Potion designers used to improve the games’ functionality and messaging. Throughout testing, findings revealed three key characteristics of the games that were compelling to middle school youth: a variety of energy choices, opportunities to design aspects of their energy environment, and challenging energy problems to solve. Findings also revealed that youth’s prior knowledge of and experiences with energy choices strongly influenced the choices they made and the messages they took away from each game. A consistent challenge throughout testing was helping youth understand the idea of trade-offs in energy choices (comfort or cost versus saving energy); the badge system was implemented to address this issue, as well as to incentivize youth to select energy-efficient choices.

What are the implications?

This study underscores the importance of iterative testing when evaluating a complex digital learning environment. Not only did MSI staff and Potion designers need to understand barriers to effectively using the games, including the intuitiveness of the technology; the Museum also needed to understand what about the simulation motivated youth’s game play and effectively empowered them to make smart energy choices as the future residents of Chicago. Further, RK&A facilitated reflective discussions between rounds of testing that enabled MSI staff and designers to apply the findings and recommendations to the next round, ultimately improving the overall functionality and effectiveness of Future Energy Chicago.


The most challenging evaluation report I’ve written consisted of 17 PowerPoint slides. The slides didn’t pull the most salient findings from a larger report; the slides were the report! I remember how difficult it was to start with the idea of writing less from qualitative data. While I had to present major trends, I feared the format might rob the data of their nuance (17 PowerPoint slides obviously require brevity). The process was challenging and at times frustrating, but in the end I felt extremely gratified. Not only was the report thorough, it was exactly what the client wanted, and it was usable.

As evaluators, we walk the line between social science research, application, and usability. As researchers, we must analyze and present the data as they appear. Sometimes, especially in the case of qualitative reports, this can lead to an overwhelming amount of dense narrative. This reporting style, accepted in evaluation practice, is our default. Given the number of reports we write each year, having guidelines is efficient and freeing. We can focus on the analysis, giving us plenty of time to get to know and understand the data and to tease out the wonderful complexity that comes from open-ended interviews. For researchers, presentation takes a backseat to analysis and digging into the data.

However, most of the time we are not writing a report that will be shared with other researchers; it is a document that will be read by users: museum staff who may share the findings with other staff or the board. Overwhelming amounts of dense narrative may not be useful, not because our audience can’t understand it, but because often the meaning is packed and needs to be untangled. I would guess what clients want and need is something they can refer to repeatedly, something they can look at to remind themselves, “Visitors aren’t interested in reading long labels,” or “Visitors enjoy interactive exhibits.” For researchers, presentation may be secondary, but for evaluators, presentation must be a primary consideration.

As my experience with the PowerPoint report (and many other reports since then) taught me, it can be tough to stray from a well-intentioned template. A shorter report or a more visual report doesn’t take less time to analyze or less time to write. In fact, writing a short report takes more time, because I have to eliminate the dense narrative that might stand in a long report and find the essence. I also have to change the way I think about presentation. I have to think about presentation!

At RK&A, we like to look at our report template to see what we can do to improve it – new ways to highlight key findings or call out visitor quotations. Not all of our ideas work out in the long run, but it is good to think about different ways to present information. At the end of the day, though, what our report looks like for any given project comes from a client’s needs—and not from professional standards. And I learned that when I wrote those 17 PowerPoint slides!


Emily’s last blog post talked about when evaluation capacity building is the right choice. When we think about building capacity for evaluation, we think about intentional practice. This does not necessarily mean teaching people to conduct evaluation themselves; rather, it means helping people ask the right questions and talk with the right people as they approach their work. RK&A has found this to be particularly important in the planning phases of projects.

The case study below is from a project RK&A did with the Museum of Nature and Science in Dallas, TX (now the Perot Museum of Nature and Science), in which an interdisciplinary group of museum staff thought intentionally about the impact the Museum hoped to have on its community. With a new building scheduled to open a year after the project took place, it was an opportune time for that kind of thinking.

Building Capacity to Evaluate [2012]

An evaluation planning project with a nature and science museum

The Museum of Nature and Science (MNS) hired RK&A to develop an evaluation plan and build staff capacity to conduct evaluation in anticipation of the Museum’s new building, scheduled to open in 2013.

How did we approach the project?

The evaluation planning project comprised a series of sequential steps, from strategic to tactical, carried out with an interdisciplinary group of staff from across the Museum. The process began by clarifying the Museum’s intended impact, a statement that articulates the intended result of the Museum’s work and provides a guidepost for evaluation: “Our community will personally connect science to their daily lives.” Focusing on the Museum’s four primary audiences (adults, families, students, and educators), staff then developed intended outcomes that serve as building blocks to impact and as gauges for measurement. Finally, RK&A worked with staff to develop an evaluation plan that identifies the Museum’s evaluation priorities over the next four years and supports the purposes of evaluation at MNS: measuring impact, understanding audiences’ needs, gauging progress on the strategic plan, and informing decision making.

The final project step focused on building staff capacity to conduct evaluation. Based on in-depth discussions with staff, RK&A developed three data collection instruments (an adult program questionnaire, a family observation guide, and a family short-answer interview guide) to empower staff to begin evaluating the Museum’s programs. Several staff members were then trained to collect data systematically using the customized tools.

What did we learn?

The process of building a museum’s capacity to conduct evaluation highlights an important consideration: evaluating a museum’s work has become more pressing given accountability demands in the external environment. Stakeholders increasingly ask, How is the museum’s work affecting its audiences? What difference is the museum making in the quality of people’s lives?

Conducting systematic evaluation and implementing a learning approach to evaluation, however, require additional staff time, which is a challenge for most museums. MNS staff recognized the need to create a realistic evaluation plan given competing demands on staff time. For example, the plan balances conducting evaluation internally, partnering with other organizations, and outsourcing to other service providers, and it implements the Museum’s evaluation initiatives incrementally over time. The Museum will begin with small steps in its efforts to effect great change.


As evaluators, we are often asked to help our clients build evaluation capacity among staff in their organizations. The motivation for these requests varies. Sometimes the primary motivator is professional development; other times it is perceived cost savings (since conducting professional evaluations requires resources that not all organizations have at their disposal). We welcome it when an organization values evaluation enough to inquire about integrating it more fully into staff’s daily work. If an organization has a true interest in using evaluation as a tool to learn about the relationship between its work and the public, building its evaluation capacity may be quite successful. On the other hand, if the primary motivator is saving the costs associated with evaluation, the outcome is often much less successful, mostly because evaluation takes considerable time and invariably there is a trade-off: when evaluation is being done, something else is being ignored.

Evaluation capacity building can take a variety of forms. It can range from building staff’s capacity to think like evaluators, perhaps by helping staff learn how to articulate a project’s desired outcomes (I think this is the most valuable evaluation planning skill one can learn), to training staff to conduct an evaluation from beginning to end (identifying outcomes, creating an evaluation plan, designing instruments, collecting data, conducting analyses, and reporting findings). Even among the most interested parties, it is rare to find museum practitioners who are genuinely interested in all aspects of evaluation. As an evaluator, even I find certain aspects of evaluation more enjoyable than others. I’ve noticed that while practitioners may be initially intimidated by the data collection process, they often find talking with visitors rewarding and informative. On the other hand, they have much less enthusiasm for data analysis and reporting; I’ve encountered only a handful of museum practitioners who enjoy poring over pages and pages of interview transcripts. We lovingly refer to these individuals as “data nerds” and proudly count ourselves among them.

There is yet another challenge, and it has to do with the fact that most museum practitioners are required to wear many hats. Conducting evaluations is my one and only job; it is what I am trained to do and have intentionally chosen as my vocation. While a practitioner may be intrigued by what evaluation can offer, often it is not the job he or she was trained or hired to do, which means that evaluation can become a burden—just one more hat for a practitioner to wear. Some organizations have addressed their evaluation needs by creating a position for an in-house evaluator, usually filled by someone schooled in evaluation and research methodologies, much like all of us here at RK&A. I would caution organizations to be very realistic when considering building their evaluation capacity. Does your staff have the time and skills to conduct a thoughtful study and follow through with analysis and reporting? What responsibilities is your organization willing to put aside to make time for evaluation? And do you want your staff to think like evaluators or become evaluators?—an important distinction, indeed. Otherwise, even those with the best of intentions may find themselves buried in mountains of data. Worse yet, what was once an exciting proposition may come to be perceived as an annoyance.


As we move further into the digital age, museums hold something that is becoming a rare commodity—real objects and artifacts. It may be hard to believe, but one day many tangible objects may be obsolete, the way printed photographs and airplane tickets are becoming scarce. Instead of going on “digs,” future archaeologists may primarily use computer-driven devices to search for clues about our ancestors. In the distant future, I can imagine museums as magical places where people can see “the real thing.” …But wait, maybe they already are?


This is my third reflection—informed by what I have learned about museum visitors in all my years studying them. I have found that there are many reasons people visit museums, but I believe the primary reason, one that we may take for granted, is that they want to see “the real thing.” A common question heard in museums, especially in regard to bones and historical objects, is, “Is it real?” I have heard it in our research, but you have probably heard it too, or said it yourself while walking through a museum. Why do people ask this question? What underlies the need to know whether something is “real”? As museum-goers, can’t observing a replica of a dinosaur skeleton or a 17th-century Dutch ice skate tell us just as much as the real thing? Maybe so. But there is just something thrilling about being in the presence of authentic artifacts and objects; maybe it has to do with feeling connected to other people, to the past, or to other parts of the world. Professors David Thelen and Roy Rosenzweig say it best in their landmark 1998 national study, The Presence of the Past: “approaching artifacts and sites on their own terms, visitors could cut through all the intervening stories, step around all the agendas that have been advanced in the meantime, and feel that they were experiencing a moment from the past almost as it had originally been experienced—and with none of the overwhelming distortions that they associated with movies and television, the other purveyors of immediacy.”


Whenever I study museum visitors, I come face to face with their sense of wonder about, and desire to get close to (even touch), real objects. Whether evaluating text panels, interactive exhibits, touch tables, or ideas and concepts, visitors will usually keep coming back to the objects. As museum professionals, we sometimes find it too easy to forget the centrality of the object when we are knee-deep in trying to interpret and contextualize something. We can get lost in these various mediums of interpretation, but visitors will usually remind us what they are really there for. For example, last year I was doing a study for a museum and historic site in which we were testing ideas for high-tech touch tables intended to convey information about the historic building the museum is housed in. I had gotten so wrapped up in testing all the information that I had mentally pushed aside the museum’s biggest asset: the building itself. But visitors brought me back when they practically skipped over my questions about the touch tables and instead kept circling back to the building—its authentic and tangible sense of history. Of course that is what they wanted to talk about and why they were there. This isn’t to say that interpretation of any kind is futile. But I believe it is important to keep reminding ourselves, as museum professionals, that interpretation should be used primarily to help visitors make sense of the objects and artifacts they are there to see—it’s really that simple.


The very reason I work with museums is my own sense of wonder and astonishment when it comes to objects and artifacts. Yes, I love studying people and how they learn and make sense of experiences, but I could do that in many different settings. I chose to do it in museums because of my belief that we can learn so much from studying “the real thing.”

