As Randi has shared in some of her posts, we at RK&A value the concept and four actions associated with Intentional Practice—Plan, Align, Evaluate, and Reflect. A few weeks ago, Randi wrote about Align, which she noted is the most complex. Today I write about Reflect, which is probably the most alluring of the four actions. At the same time, it is also the most easily dismissed—the one swept aside as a luxury. In workshops we facilitate, we usually show museum staff the Cycle of Intentional Practice and ask them what percentage of time they spend on each of the four actions. Inevitably, reflection falls short, usually garnering between 5 and 10 percent. Staff tell us this isn’t for a lack of desire or need; they wish they had more time for reflection. But as is so common in our modern world, we tend to get stuck in the continual act of “doing.” Take, for example, how difficult it has been for me to sit down and write this blog post… Certainly I am not immune.

I love the literal manifestation of reflection, which is when light strikes a surface and bounces around in unusual ways, making us see something we didn’t see before. For me, the allure of reflection in its literal form is not that different from what happens when we talk about reflection in evaluation. When it comes to evaluation, reflection leads to insights, ah-ha moments, and new ways of seeing, thinking, and knowing. This happens when I invariably ask one of my favorite questions, “What does it mean?” And even when reflection is very difficult, for instance when I ask, “What does failure mean?”, the pay-off is usually worth the pain.


Billboards at Night (Detroit), Knud Lonberg-Holm, 1942

Despite its allure, we can’t avoid the fact that reflection is easily dismissed, postponed, and overlooked. Why is this? The answer may lie partly in how hard and sometimes downright painful it is to reflect. At times it is too difficult to consider those important questions; it feels easier to ignore them and continue “doing.” In the same way, reflection in its literal form can sometimes be painful, such as the way light reflecting off the hood of a car is blinding or when light bouncing around becomes disorienting. But I don’t think discomfort is the barrier—from my experience, it seems the demands (both internal and external) to produce thwart our intentions of taking the time to reflect.

In our work with evaluation, reflection is critical. Without taking the time to reflect on the meaning of data, evaluation results fall flat and hollow. As evaluators it is our duty and privilege to ask and try to answer hard questions about what data means, what it tells us. And we do that. But even though we possess a valuable outsider perspective and can offer significant insights about evaluation findings, the insider perspective is equally important. And, our work is at its best when reflection happens collaboratively between the client and us.

So, we try as often as we can to facilitate a Reflection Workshop at the end of a project. In a Reflection Workshop we meet with staff from across the museum to collaboratively explore the question, “What have we learned from the evaluation?” We don’t simply present findings; rather, we pose questions and facilitate discussion to help staff explore the meaning of the evaluation findings. And we don’t shy away from negative findings; rather, we use those as opportunities for understanding and growth. The purpose of the workshops is to come to some conclusions about ways to improve the effectiveness of a program. But more than anything, the purpose is to simply take the time to ask challenging questions and think deeply.

At RK&A, we think a lot about intentional practice and we encourage our clients to do the same. In planning meetings and reflection workshops, we ask clients to think about which elements of their work align with their institutional mission and vision (check out Randi’s blog post for more about the challenges of alignment). We push them to consider who might be the right audience for their program or exhibition, and we ask them to talk about the intended outcomes for their projects. Posing these kinds of questions is much easier for an “outsider” to do because we don’t have institutional baggage or a personal connection to a problem project. As consultants, we aren’t beholden to the way things have always been done. I get it – it can be hard to let go; but seeing clients seek information to make informed decisions is a powerful, exciting process. These clients want more information. They are willing to try new things, to change old (and sometimes new) programs to see if they can improve upon the results. These are museum professionals who want the very best experiences for their visitors.

We recently completed a project with a history museum, and the results were, well, not as rosy as one might hope. After explaining the challenges of changing students’ perspectives in a short, one-time museum visit, we started talking about what could be done to increase the effectiveness of the program. One of our suggestions was to increase the time allotted for the program and, rather than spending that extra time in the exhibition, use it to facilitate a discussion with students so they can process and reflect on what they had seen. Changing a program’s format and duration is a difficult task for the museum to undertake – it may require extra staff and certainly a different schedule – but it could make a difference. A few days later, our client asked us if there are any studies showing that longer programs are more effective. When we failed to come up with any examples (if you know of any such studies, please leave a comment), the client asked for another study to see if a longer program leads to a different outcome.

As an evaluator, I want to support museums as they change the way they do their work. Evaluation can provide the necessary information to see if new ideas work. It can give clients the data-based push they need to let go of the way things have always been done and to try something new. If nothing else, the evaluation process can be a forum to remind people that even when you are changing course, there is a place for you on the Cycle of Intentional Practice: Plan, Align, Evaluate, Reflect.

The case study below is from a summative evaluation RK&A did with the Wildlife Conservation Society. The Madagascar! exhibit at the Bronx Zoo is an indoor exhibit that allows visitors to come face-to-face with wildlife from this island habitat. The exhibit also features a film, Small Wonders, Big Threats, that addresses environmental challenges the island is facing.

Madagascar! [2009]

A summative evaluation with a zoo

The Wildlife Conservation Society (WCS) contracted Randi Korn & Associates, Inc. (RK&A) to evaluate its new exhibition, Madagascar!, located at the Bronx Zoo. Madagascar! showcases the wildlife and landscapes of the world’s fourth largest island. Built in the historic Lion House, the exhibition transformed the interior, while preserving the historic building’s Beaux-Arts beauty. The exhibition offers opportunities to see the island through the eyes of a conservationist at various interactive stations.

How did we approach this study?

RK&A worked with WCS to clarify its goals and objectives for Madagascar! and to identify criteria to measure visitor outcomes. We conducted a summative evaluation that employed a rigorous, modified pre-test/post-test design to measure visitor learning and attitudinal changes. Through in-depth open-ended interviews, we explored visitors’ attitudes toward and understandings of threats to Madagascar and its animals as well as knowledge of WCS’s conservation efforts on the island. We then scored the interview data using rubrics and compared the achievement of eight objectives by visitors who had not seen the exhibition to visitors who had seen the exhibition.
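The comparison described above—rubric-scoring interview responses and contrasting visitors who had not yet seen the exhibition with those who had—can be sketched in a few lines of code. This is a minimal illustration only: the rubric scores, group sizes, and the permutation test shown here are hypothetical assumptions for demonstration, not RK&A’s actual instrument or analysis.

```python
import random
from statistics import mean

def permutation_test(a, b, n_iter=10_000, seed=0):
    """Two-sided permutation test on the difference in mean rubric scores.

    Returns (observed mean difference b - a, approximate p-value)."""
    rng = random.Random(seed)
    observed = mean(b) - mean(a)
    pooled = a + b  # combine both groups, then repeatedly reshuffle the labels
    count = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        diff = mean(pooled[len(a):]) - mean(pooled[:len(a)])
        if abs(diff) >= abs(observed):
            count += 1
    return observed, count / n_iter

# Hypothetical rubric scores (1 = naive, 4 = sophisticated) for one
# objective, e.g. understanding of threats to Madagascar's environment:
entering = [1, 2, 2, 1, 3, 2, 1, 2, 2, 1, 3, 2]  # had not seen the exhibition
exiting  = [3, 2, 4, 3, 3, 2, 4, 3, 2, 3, 4, 3]  # had seen the exhibition

diff, p = permutation_test(entering, exiting)
print(f"mean gain: {diff:.2f}, p ~= {p:.3f}")
```

A permutation test is used here because rubric scores are ordinal and the samples are small, so a distribution-free test is a reasonable default; the actual study may well have used a different statistical procedure.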

What did we learn?

Findings demonstrate that the exhibition was extremely successful at achieving its goals. Statistically significant findings showed that visitors who experienced the exhibition gained new knowledge, ideas, and beliefs, including: 1) enhanced interest in the animals of Madagascar based on knowledge of their habits, environment, and endangered status (versus interest based solely on novelty); 2) knowledge that Madagascar’s environment and animals are threatened, especially by the loss of trees; and 3) an understanding of why conservation scientists (including those from WCS) are in Madagascar: to study the animals and environment so that they can implement appropriate conservation strategies toward its protection.

What are the implications of the findings?

Even though public discourse on global warming has grown substantially in recent years, the general public’s familiarity with environmental issues still tends to be vague or even ill-conceived. Yet findings demonstrate that Madagascar! shifted visitors’ knowledge of conservation science toward a more accurate, specific, and concrete understanding. These positive findings are remarkable when one considers how difficult it is to change people’s knowledge and attitudes, particularly in one relatively short visit to a single exhibition. Through experiences in exhibitions like Madagascar!, visitors assimilate new ideas and perceptions with their pre-existing ones and create new meaning. The exhibition effectively utilized simple low-tech interactive exhibits, large-scale video walls, live interpretation, and intimate, close-up looks at animals to connect visitors to the environments and wildlife of Madagascar. Evaluation results have shown that zoos can be appropriate environments for moving visitors beyond the novelty of seeing wild animals to developing an understanding of where the animals come from, why they are important, and how conservation efforts can protect them.

As I have shared in other posts, I value the concept and four actions associated with Intentional Practice. Of the four quadrants that comprise Intentional Practice—Plan, Align, Evaluate, and Reflect—Align is the most complex, and it comes with baggage: tons and tons of it.

At its essence, alignment requires that staff examine all of their work and actions in the context of the impact the museum would like to achieve (as depicted in the center of the Cycle). This examination includes considering what they could continue doing because it helps them achieve their intended impact, what they could change because a project falls short of achieving intended results, or what they might stop doing because results do not support the museum’s intended impact. I have witnessed museums struggling with Alignment because invariably they may need to make some very difficult decisions, and change is inevitable as a result of decision making. And most humans (me included) have trouble with change. Just when things seem to be going well, BAM—something happens and I need to respond by changing something.

Alignment can also become complex and difficult because people’s emotions are involved; and when emotions are involved, decision making becomes a struggle and is met with resistance. Among the three possible actions mentioned above, to stop doing something is the most challenging and truly heart-wrenching. Before people accept that they may need to stop doing something, their first reaction is to dismiss the evidence and exclaim, “That can’t be true; the evaluation must be wrong.” The next response is a very lucid, logical, rational explanation of how great the program really is—it is the public that needs retooling. Then there is panic, and all kinds of thoughts begin to run wild—“How can I stop doing this program (that I love)? How can I stop doing this program that is part of the museum’s tradition? What will I tell the funder? I know this program takes significant resources, but I love doing this program (and so does the funder). If I stop doing this program, what will I do with the void that is created? What will my colleagues say about the fact that I am doing one less program? What will I do instead?” Complicating matters is the feeling of embarrassment that begins to emerge—a very strong emotion.

One of the reasons people become embarrassed is that they think others may perceive that the program failed, and failure is still embarrassing in the museum community, even though so many have written about the value of failure as a way to learn. As an example of how complicated these situations can be: one day an educator called to lament that one of the very important programs the museum had been doing for years was not going as well as it once had. She knew, in her heart, that she needed to reinvent it or drop it altogether. Her greatest fear was her director, who loved the program. The educator was aware that the program attracted a tiny slice of the public the museum intended to serve. Her annual review was in a few months, and she feared that dropping the program, or even changing it, would reflect poorly on her, even though it was the right thing to do.

Many lament with frustration, “We continue to do things the way we have always done them because it is the way we have always done them.” Yet when there is an opportunity to try something different to reach a better outcome, analyze a situation to stimulate progress, or accept reality and put a program to rest, there is a struggle—not an organizational struggle, but a personal one. In the Cycle of Intentional Practice, the Align quadrant is the one quadrant that goes deep—that becomes personal. Everyone wants to strengthen their museum by aligning what they do with the impact they want to achieve, yet doing so requires a tough-as-nails approach, a relentless focus on the desired result rather than personal feelings about a program, and a recognition that change is an inevitable and complex fact of life.

RK&A’s work with the Perez Art Museum Miami (PAMM) is today’s featured project on the Museum Education Monitor’s (MEM) social media sites! RK&A has been working with the Perez Art Museum Miami since 2013 to evaluate its Knight School Program, a single-visit program designed to serve all third grade students in Miami-Dade County Public Schools. We began our work together by helping staff articulate and clarify student outcomes and indicators. The program intends to enhance students’ critical thinking skills related to observing and interpreting works of art. We are now in the process of conducting a formative evaluation that will identify the program’s strengths and areas in need of improvement, before finally conducting a summative evaluation in 2015.

Check out the MEM posting for some additional information by visiting these social media sites today!

Web – http://www.mccastle.com/Public/Default.aspx

Facebook – http://www.facebook.com/Museum.Education.Monitor

Twitter – http://twitter.com/mchriscastle

Pinterest – http://pinterest.com/mchriscastle/

YouTube – http://www.youtube.com/user/MChrisC54

FORUM Blog – http://forum.mccastle.com/


Working in research and evaluation, you become very skeptical of words like “data-driven” and “research-based.” To evaluators, it is quite flattering that these words are so buzzworthy—yes, we want our research and evaluation work to be important, used, and even desired! However, even though these buzzwords grab attention, they can be misleading. For instance, when we talk about data and research at RK&A, we mean original, first-hand data and research, such as interviews, questionnaires, and surveys with museum visitors.

This was on my mind as I recently had the opportunity to help my CARE (Committee on Audience Research and Evaluation) colleague Liz Kunz Kollmann review session proposals for the 2015 AAM Annual Meeting. Liz, as CARE’s representative for the National Program Committee, was charged with reviewing sessions in the Education, Curation, & Evaluation track (all 141 sessions!) along with fellow National Program Committee members in education and curation. Given that audience research and evaluation can be part of many AAM tracks (marketing, development, exhibit design, etc.), Liz recruited some CARE members to help her review sessions in other tracks to see if there were any sessions outside of our designated track that CARE should advocate for.

I volunteered to review sessions in the Development & Membership and Finance & Administration tracks. I had expected to encounter a lot of buzzwords, since AAM session proposals include a description that must be appropriate for display on the AAM website, mobile app, and other published meeting materials. So I wasn’t surprised, but I was struck, by the heavy use of terms like “data-driven” and “research-based” (e.g., data-driven strategies for membership recruitment and research-based programming), and I was stymied in trying to determine whether these sessions were relevant to CARE—what data is driving the decisions, and is it really of interest to CARE members?

Certainly I am not dismissive of research or data that isn’t “original.” There are many definitions of research and data that are applicable to certain scenarios and within certain fields. For instance, arts-based research is a completely valid field of research within art education when conducted well. However, I am biased toward collecting original data from visitors first-hand, which is why terminology like “data-driven” and “research-based” makes my ears perk up—these words prompt many questions for me about the type of data and research and its appropriateness to inform said decisions and practices. Through our work at RK&A, we truly want practitioners to make decisions that are data-driven; that is the greatest outcome of our work! However, we also want our clients to be such skilled users and consumers of data and evaluation that their ears perk up at the very mention of “data”—for hopefully they, too, have become savvy digesters of both the language and the meaning behind the data when talking about research and evaluation.

Check out our Buzzword Bingo below, inspired by Dilbert (http://dilbert.com/strips/comic/2010-10-25/). Warning: this Bingo is informed by RK&A’s professional experience and is not based on original data. Maybe with the help of our museum colleagues, we can make it “research-based.” Please share your buzzwords!


The case study below is from a summative evaluation RK&A completed for the Conservation Trust of Puerto Rico. Based in Manati, Puerto Rico, the Conservation Trust runs a Citizen Science program for local residents.

Citizen Science [2010]

A summative program evaluation with a nature conservancy

The Conservation Trust of Puerto Rico collaborated with RK&A to study the impact of its Citizen Science program, an NSF-funded project designed to involve local Spanish-speaking citizens in scientific research that contributes to growing knowledge about the Trust’s biodiversity and land management efforts. The Citizen Science program underwent formative evaluation in 2009 and summative evaluation in 2010. The summative evaluation is discussed here.

How did we approach this study?

The summative evaluation was guided by four impacts developed using NSF’s Framework for Evaluating Impacts of Informal Science Education Projects. These stated that participants will: use and understand the scientific method; experience and understand the purpose of scientific rigor; develop a sense of ownership for the Reserve; and realize that the research in which they participate has wide application to decisions made about conserving the Reserve’s flora and fauna. To explore these impacts, RK&A collected 343 standardized questionnaires, conducted 39 in-depth interviews, and conducted three case studies with participants who had a high level of program involvement.

What did we learn?

In all areas where the Trust hoped to achieve impact with participants, gains were made. Findings show that participants self-reported moderate gains in their knowledge and awareness of flora and fauna and scientific processes; interestingly, those who participated in programs with live animal interaction self-reported greater gains. Some also acknowledged attitude and behavior changes as a result of program participation. Findings further demonstrate that a majority of participants felt the Reserve is relevant and valuable to them and Puerto Rico, strengthening their sense of pride and ownership. Finally, some participants also recognized the application and value of the research in which they participated. Findings also raised some potential barriers to achieving impact, such as the average participant’s brief, often isolated exposure to a specific research project, as well as the fact that many participants entered the program with prior knowledge and interests that might limit the program’s potential to facilitate significant learning gains.

 What are the implications of the findings?

A review of Citizen Science projects found that very few have formally assessed the impact of participants’ experiences.[1] This study sought to contribute to knowledge in this area by exploring participants’ experiences through the lens of the four program impacts mentioned above. Some findings are consistent with those of other Citizen Science studies, such as the fact that participants exhibited more gains in content knowledge than process skills, and many participants enter with prior interest in and knowledge of science and conservation. Other findings suggest that animal interaction and small group size positively influenced participants’ experiences and perceptions of learning. Collectively, findings suggest implications for program design, including the importance of bridging participants’ experiences so they envision their contribution as part of a greater goal.

[1] Bonney, R., Ballard, H., Jordan, R., McCallie, E., Phillips, T., Shirk, J., & Wilderman, C. C. (2009a). Public participation in scientific research: Defining the field and assessing its potential for informal science education. A CAISE inquiry group report. Washington, D.C.: Center for Advancement of Informal Science Education (CAISE).
