At RK&A, we think a lot about intentional practice, and we encourage our clients to do the same. In planning meetings and reflection workshops, we ask clients to think about which elements of their work align with their institutional mission and vision (check out Randi’s blog post for more about the challenges of alignment). We push them to consider who might be the right audience for their program or exhibition, and we ask them to talk about the intended outcomes for their projects. Posing these kinds of questions is much easier for an “outsider” to do because we don’t have institutional baggage or a personal connection to a problem project. As consultants, we aren’t beholden to the way things have always been done. I get it – it can be hard to let go. But seeing clients seek information to make informed decisions is a powerful, exciting process. These clients want more information. They are willing to try new things, to change old (and sometimes new) programs to see if they can improve upon the results. These are museum professionals who want the very best experiences for their visitors.

We recently completed a project with a history museum and the results were, well, not as rosy as one might hope.

Change is Hard

After explaining the challenges of changing students’ perspectives in a short, one-time museum visit, we started talking about what could be done to increase the effectiveness of the program. One of our suggestions was to increase the time allotted for the program and, rather than spending that extra time in the exhibition, use it to facilitate a discussion with students so they could process and reflect on what they had seen. Changing a program’s format and duration is a difficult task for the museum to undertake – it may require extra staff and certainly a different schedule – but it could make a difference. A few days later, our client asked us whether there are any studies showing that longer programs are more effective. After we failed to come up with any examples (if you know of any such studies, please leave a comment), the client asked for another study to see whether a longer program leads to a different outcome.

As an evaluator, I want to support museums as they change the way they do their work. Evaluation can provide the necessary information to see if new ideas work. It can give clients the data-based push they need to let go of the way things have always been done and to try something new. If nothing else, the evaluation process can be a forum to remind people that even when you are changing course, there is a place for you on the Cycle of Intentional Practice: Plan, Align, Evaluate, Reflect.

The case study below is from a summative evaluation RK&A conducted with the Wildlife Conservation Society. The Madagascar! exhibit at the Bronx Zoo is an indoor exhibition that allows visitors to come face-to-face with wildlife from this island habitat. The exhibit also features a film, Small Wonders, Big Threats, that addresses the environmental challenges the island faces.

Madagascar! [2009]

A summative evaluation with a zoo

The Wildlife Conservation Society (WCS) contracted Randi Korn & Associates, Inc. (RK&A) to evaluate its new exhibition, Madagascar!, located at the Bronx Zoo. Madagascar! showcases the wildlife and landscapes of the world’s fourth largest island. Built in the historic Lion House, the exhibition transformed the interior while preserving the building’s Beaux-Arts beauty. The exhibition offers opportunities to see the island through the eyes of a conservationist at various interactive stations.

How did we approach this study?

RK&A worked with WCS to clarify its goals and objectives for Madagascar! and to identify criteria to measure visitor outcomes. We conducted a summative evaluation that employed a rigorous, modified pre-test/post-test design to measure visitor learning and attitudinal changes. Through in-depth, open-ended interviews, we explored visitors’ attitudes toward and understanding of threats to Madagascar and its animals, as well as their knowledge of WCS’s conservation efforts on the island. We then scored the interview data using rubrics and compared achievement of eight objectives between visitors who had not seen the exhibition and visitors who had.
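For readers curious about what the scoring-and-comparison step can look like, here is a minimal sketch, not RK&A’s actual analysis: it assumes interviews have already been scored on a 0–3 rubric for a single objective, with independent samples of entering (pre) and exiting (post) visitors, and compares the two groups with a nonparametric test because rubric scores are ordinal. The scores, scale, and test choice are illustrative assumptions.

```python
# Illustrative sketch only -- not RK&A's analysis code.
# Assumes interviews were scored 0-3 on a rubric for one objective,
# with independent samples of entering (pre) and exiting (post) visitors.
from scipy.stats import mannwhitneyu

pre_scores = [0, 1, 1, 2, 0, 1, 2, 1, 0, 1]   # hypothetical entering visitors
post_scores = [1, 2, 3, 2, 2, 3, 1, 2, 3, 2]  # hypothetical exiting visitors

# Mann-Whitney U suits ordinal rubric scores from two independent groups;
# alternative="less" tests whether pre scores sit below post scores.
stat, p_value = mannwhitneyu(pre_scores, post_scores, alternative="less")

print(f"U = {stat:.1f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("Exiting visitors scored significantly higher on this objective.")
else:
    print("No significant difference detected for this objective.")
```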

What did we learn?

Findings demonstrate that the exhibition was extremely successful at achieving its goals. Statistically significant findings showed that visitors who experienced the exhibition gained new knowledge, ideas, and beliefs, including: 1) enhanced interest in the animals of Madagascar based on knowledge of their habits, environment, and endangered status (versus interest based solely on novelty); 2) knowledge that Madagascar’s environment and animals are threatened, especially by the loss of trees; and 3) an understanding of why conservation scientists (including those from WCS) are in Madagascar: to study the animals and environment so that they can implement appropriate conservation strategies to protect them.

What are the implications of the findings?

Even though public discourse on global warming has grown substantially in recent years, the general public’s understanding of environmental issues still tends to be vague or even ill-conceived. Yet findings demonstrate that Madagascar! shifted visitors’ knowledge of conservation science toward a more accurate, specific, and concrete understanding. These positive findings are remarkable when one considers how difficult it is to change people’s knowledge and attitudes, particularly in one relatively short visit to a single exhibition. Through experiences in exhibitions like Madagascar!, visitors assimilate new ideas and perceptions into their pre-existing ones and create new meaning. The exhibition effectively used simple low-tech interactive exhibits, large-scale video walls, live interpretation, and intimate, close-up looks at animals to connect visitors to the environments and wildlife of Madagascar. Evaluation results show that zoos can be appropriate environments for moving visitors beyond the novelty of seeing wild animals to developing an understanding of where the animals come from, why they are important, and how conservation efforts can protect them.

As I have shared in other posts, I value the concept and four actions associated with Intentional Practice. Of the four quadrants that comprise Intentional Practice—Plan, Align, Evaluate, and Reflect—Align is the most complex, and it comes with baggage; tons and tons of it.

At its essence, alignment requires that staff examine all of their work and actions in the context of the Impact the museum would like to achieve (as depicted in the center of the Cycle). This examination includes considering what they could continue doing because it helps them achieve their intended impact, what they could change because a project falls short of achieving intended results, and what they might stop doing because results do not support the museum’s intended impact. I have witnessed museums struggle with Alignment because they invariably need to make some very difficult decisions, and change is an inevitable result of decision making. And most humans (me included) have trouble with change. Just when things seem to be going well, BAM—something happens and I need to respond by changing something.

Alignment can also become complex and difficult because people’s emotions are involved; and when emotions are involved, decision making becomes a struggle and is met with resistance. Among the three possible actions mentioned above, stopping something is the most challenging and truly heart-wrenching. Before people accept that they may need to stop doing something, their first reaction is to dismiss the evidence and exclaim, “That can’t be true; the evaluation must be wrong.” The next response is a very lucid, logical, rational explanation of how great the program really is—it is the public that needs retooling. Then there is panic, and all kinds of thoughts begin to run wild: “How can I stop doing this program (that I love)? How can I stop doing this program that is part of the museum’s tradition? What will I tell the funder? I know this program takes significant resources, but I love doing this program (and so does the funder). If I stop doing this program, what will I do with the void that is created? What will my colleagues say about the fact that I am doing one less program? What will I do instead?” Complicating matters is the feeling of embarrassment that begins to emerge—a very strong emotion.

One of the reasons people become embarrassed is that they think others may perceive that the program failed, and failure is still embarrassing in the museum community even though so many have written about the value of failure as a way to learn. As an example of how complicated these situations can be, one day an educator called to lament that one of the very important programs the museum had been running for years was not going as well as it once had. She knew, in her heart, that she needed to reinvent it or drop it altogether. Her greatest fear was her director, who loved the program. The educator was aware that the program attracted only a tiny slice of the public the museum intended to serve; but her annual review was a few months away, and she feared that dropping the program, or even changing it, would reflect poorly on her, even though it was the right thing to do.

Even though many lament with frustration, “We continue to do things the way we have always done them because it is the way we have always done them,” when there is an opportunity to try something different to reach a better outcome, analyze a situation to stimulate progress, or accept reality and put a program to rest, there is an internal struggle—not an organizational one, but a personal one. In the Cycle of Intentional Practice, the Align quadrant is the one that goes deep and becomes personal. Everyone wants to strengthen their museum by aligning what they do with the impact they want to achieve, yet doing so requires a tough-as-nails approach, a relentless focus on the desired result rather than on personal feelings about a program, and a recognition that change is an inevitable and complex fact of life.

RK&A’s work with the Perez Art Museum Miami (PAMM) is today’s featured project on the Museum Education Monitor’s (MEM) social media sites! RK&A has been working with PAMM since 2013 to evaluate its Knight School Program, a single-visit program designed to serve all third grade students in Miami-Dade County Public Schools. We began our work together by helping staff articulate and clarify student outcomes and indicators. The program intends to enhance students’ critical thinking skills related to observing and interpreting works of art. We are now conducting a formative evaluation that will identify the program’s strengths and areas in need of improvement, before we conduct a summative evaluation in 2015.

Check out the MEM posting for some additional information by visiting these social media sites today!

Web– http://www.mccastle.com/Public/Default.aspx

Facebook– http://www.facebook.com/Museum.Education.Monitor

Twitter– http://twitter.com/mchriscastle

Pinterest – http://pinterest.com/mchriscastle/

YouTube – http://www.youtube.com/user/MChrisC54

FORUM Blog– http://forum.mccastle.com/


Working in research and evaluation, you become very skeptical of words like “data-driven” and “research-based.” To evaluators, it is quite flattering that these words are so buzzworthy—yes, we want our research and evaluation work to be important, used, and even desired! However, even though these buzzwords grab attention, they can be misleading. For instance, when we talk about data and research at RK&A, we mean original, first-hand data and research, such as interviews, questionnaires, and surveys with museum visitors.

This was on my mind as I recently had the opportunity to help my CARE (Committee on Audience Research and Evaluation) colleague Liz Kunz Kollmann review session proposals for the 2015 AAM Annual Meeting. Liz, as CARE’s representative for the National Program Committee, was charged with reviewing sessions in the Education, Curation, & Evaluation track (all 141 sessions!) along with fellow National Program Committee members in education and curation. Given that audience research and evaluation can be part of many AAM tracks (marketing, development, exhibit design, etc.), Liz recruited some CARE members to help her review sessions in other tracks to see if there were any sessions outside of our designated track that CARE should advocate for.

I volunteered to review sessions in the Development & Membership and Finance & Administration tracks. I had expected to encounter a lot of buzzwords, since AAM session proposals include a description that must be appropriate for display on the AAM website, mobile app, and other published meeting materials. So, while I wasn’t surprised, I was struck by the heavy use of terms like “data-driven” and “research-based” (e.g., data-driven strategies for membership recruitment and research-based programming), and I was stymied in trying to determine whether these sessions were relevant to CARE—what data is driving the decisions, and is it really of interest to CARE members?

Certainly I am not dismissive of research or data that isn’t “original.” There are many definitions of research and data that are applicable to certain scenarios and within certain fields. For instance, arts-based research, when conducted well, is a completely valid field of research within art education. However, I am biased toward collecting original data from visitors first-hand, which is why terminology like “data-driven” and “research-based” makes my ears perk up—these words prompt many questions for me about the type of data and research and its appropriateness to inform said decisions and practices. Through our work at RK&A, we truly want practitioners to make decisions that are data-driven; that is the greatest outcome of our work! However, we also want our clients to be skilled users and consumers of data and evaluation, so much so that their ears perk up at the very mention of “data”—for hopefully they, too, have become savvy digesters of the language as well as the meaning behind the data when talking about research and evaluation.

Check out our Buzzword Bingo below, inspired by Dilbert: http://dilbert.com/strips/comic/2010-10-25/ Warning: this Bingo card is informed by RK&A’s professional experience and is not based on original data. Maybe with the help of our museum colleagues, we can make it “research-based.” Please share your buzzwords!

[Buzzword Bingo card]

The case study below is from a summative evaluation RK&A completed for the Conservation Trust of Puerto Rico.  Based in Manati, Puerto Rico, the Conservation Trust runs a Citizen Science program for local residents.

Citizen Science [2010]

A summative program evaluation with a nature conservancy

The Conservation Trust of Puerto Rico collaborated with RK&A to study the impact of its Citizen Science program, an NSF-funded project designed to involve local Spanish-speaking citizens in scientific research that contributes to growing knowledge about the Trust’s biodiversity and land management efforts. The Citizen Science program underwent formative evaluation in 2009 and summative evaluation in 2010; the summative evaluation is discussed here.

How did we approach this study?

The summative evaluation was guided by four impacts developed using NSF’s Framework for Evaluating Impacts of Informal Science Education Projects: participants will use and understand the scientific method; experience and understand the purpose of scientific rigor; develop a sense of ownership for the Reserve; and realize that the research in which they participate has wide application to decisions made about conserving the Reserve’s flora and fauna. To explore these impacts, RK&A collected 343 standardized questionnaires, conducted 39 in-depth interviews, and conducted three case studies with participants who had a high level of program involvement.
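As a rough illustration of how questionnaire responses can be rolled up against intended impacts like these, here is a minimal sketch; the item names, the 1–5 self-report scale, and the item-to-impact mapping are hypothetical assumptions, not the Trust’s actual instrument.

```python
# Illustrative sketch only -- hypothetical items and scale, not the Trust's instrument.
# Maps questionnaire items to the four intended impacts and reports mean self-rated gain.
from statistics import mean

ITEM_TO_IMPACT = {
    "used_scientific_method": "Use and understand the scientific method",
    "understood_rigor": "Understand the purpose of scientific rigor",
    "ownership_of_reserve": "Develop a sense of ownership for the Reserve",
    "research_applies": "See the research's application to conservation decisions",
}

# Each record is one participant's self-rated gain per item (1 = none, 5 = a great deal).
responses = [
    {"used_scientific_method": 4, "understood_rigor": 3,
     "ownership_of_reserve": 5, "research_applies": 4},
    {"used_scientific_method": 3, "understood_rigor": 3,
     "ownership_of_reserve": 4, "research_applies": 2},
    {"used_scientific_method": 5, "understood_rigor": 4,
     "ownership_of_reserve": 5, "research_applies": 3},
]

for item, impact in ITEM_TO_IMPACT.items():
    scores = [r[item] for r in responses]
    print(f"{impact}: mean self-rated gain = {mean(scores):.1f} (n = {len(scores)})")
```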

What did we learn?

In all areas where the Trust hoped to achieve impact with participants, gains were made. Findings show that participants self-reported moderate gains in their knowledge and awareness of flora, fauna, and scientific processes; interestingly, those who participated in programs with live animal interaction self-reported greater gains. Some also acknowledged attitude and behavior changes as a result of program participation. Findings further demonstrate that a majority of participants felt the Reserve is relevant and valuable to them and to Puerto Rico, honing and developing their sense of pride and ownership. Finally, some participants also recognized the application and value of the research in which they participated. Findings also raised some potential barriers to achieving impact, such as the average participant’s brief, often isolated exposure to a specific research project and the fact that many participants entered the program with prior knowledge and interests that might limit the program’s potential to facilitate significant learning gains.

What are the implications of the findings?

A review of Citizen Science projects found that very few have formally assessed the impact of participants’ experiences.[1] This study sought to contribute to knowledge in this area by exploring participants’ experiences through the lens of the four program impacts mentioned above. Some findings are consistent with those of other Citizen Science studies, such as the fact that participants exhibited more gains in content knowledge than in process skills and that many participants entered with prior interest in and knowledge of science and conservation. Other findings suggest that animal interaction and small group size positively influenced participants’ experiences and perceptions of learning. Collectively, findings suggest implications for program design, including the importance of bridging participants’ experiences so they envision their contribution as part of a greater goal.

[1] Bonney, R., Ballard, H., Jordan, R., McCallie, E., Phillips, T., Shirk, J., & Wilderman, C. C. (2009a). Public participation in scientific research: Defining the field and assessing its potential for informal science education. A CAISE inquiry group report. Washington, D.C.: Center for Advancement of Informal Science Education (CAISE).

A recent Telegraph article announced that the chairman of Arts Council England thinks there should be a one-hour photo ban (on selfies in particular) in art galleries. My first reaction was: “This is an interesting and absolutely horrible idea.” I see how a photo ban could be conceived as a strategy to enhance the visitor experience—I have certainly muttered to myself in annoyance when there are so many people taking photos of an artwork that I can’t get close enough to see it (or if I feel brazen enough to make my way to the front so I can see it, I feel bad for ruining everyone’s photo). However, if this one-hour photo ban were to go through, I see it creating a lot more negative visitor experiences than positive ones when you put yourself in the shoes of the security guard—the person who has to enforce the rule.

Let me step back a moment and say that I owe my current professional career to my work as a security guard. In addition to many other roles as an intern at the Peggy Guggenheim Collection, I guarded galleries and certainly learned a lot about visitor experiences. As someone who wanted to work in a museum, I found that I, as a security guard, had the power to either make or break the quality of a visitor’s experience. Tell visitors about Peggy’s many artist lovers while standing in her bedroom—make their visit and even their day. Ask a visitor to leave her bag in a locker or coatroom—incite anger to the point of someone asking for a refund and never setting foot in the museum. It was a humbling experience to say the least and completely transformed my thinking about the work I wanted to do for museums.

Now jumping back to the policy at hand…when reading the article, I first imagined how this would work on the ground. I immediately empathized with the poor security guards who would have to enforce this policy (as did a Hyperallergic author who commented on it). Yes, perhaps signage would alert visitors to the ban, but from evaluation we know that signage goes largely unnoticed. Therefore, my prediction is that a visitor’s first awareness of the policy would come when a security guard tells him or her not to take a photo. No matter how friendly a security guard may be, being told not to do something can create an embarrassing situation. How the visitor then reacts to feeling embarrassed is another story. Does he argue with the guard? Just continue to take pictures anyway? Does he internalize it and feel awful for the rest of the day? Any way it plays out will generally result in a negative experience for the visitor as well as those around him.

The chairman’s comparison of this no-photo policy to the “quiet car” is a perfect analogy in my opinion. As a frequent train rider, I love the concept of the quiet car, and I choose to sit in it more often than not. It works well when everyone knows they are in the quiet car. The trouble is that, typically, there is one person who doesn’t know, which casts a pall over everyone else’s experience in the quiet car. Most quiet cars have a sign, sometimes the lights are dimmer than in other cars, and sometimes the conductor will announce which car is the quiet car. Perfect—except that non-regular riders do not notice the signs or the subtle lighting cues, or simply do not know which car they are in (am I in the first car?). Therefore, when a fellow quiet-car rider or the conductor confronts the unknowing person about a rule-breaking cell phone conversation, the ensuing interaction often doesn’t go well. I have seen and heard about everything—from a New Jersey Transit rider being escorted off the train by police after starting a fight with the rider who confronted him, to an Amtrak rider construing the conductor’s request as him telling her she “has a big mouth.” For these reasons, I find myself avoiding the quiet car lately because I end up being more frustrated than relieved and feeling more negative than positive. From what I have seen as a security guard, evaluator, and expert train rider, more negative than positive visitor experiences might result from this potential photo-ban policy.

Photo from the Fortune article, “The Cult of the Amtrak Quiet Car,” an interesting read for both quiet car devotees and those completely unaware: http://fortune.com/2014/09/17/amtrak-quiet-car/

