25th Anniversary Butterfly

For the last few months, I’ve been on maternity leave with my second child. My first little girl is three years old and I had forgotten many of the nuances of caring for an infant. Through the process of getting to know my new little one, I began to reflect on a drawing that we often show to our clients in our planning or reflection workshops.

 

The drawing consists of three concentric circles. In the center circle, we write “comfort zone.” This, we tell practitioners, is where most of us operate on a daily basis during our work day. We feel safe in this zone because it consists of routines and interactions with colleagues we know well.

In the middle circle, we write “learning zone.” This is the zone where we have “ah-ha” moments and take in new ideas from our own work or interactions with colleagues. Perhaps we go to a meeting where different points of view are shared, and we have a breakthrough moment about a project we are working on or see something familiar in a new light. We tell clients that this is, ideally, where we want them to be during our workshops—open to sharing and receiving new ideas.

Then, of course, there is the outside circle, which we label the “panic zone.” This is the zone where we shut down because we are too uncomfortable to take in new learning or ideas. When this happens, we long for the “comfort zone” and, until we find our way back there, we are unlikely to be receptive to our colleagues’ ideas. Instead, we often put up walls or use defense mechanisms to deflect what we find uncomfortable.

I’ve operated in all of these zones the past few months. Before going on maternity leave, I was in my “comfort zone” with my first daughter. Even though she changes every day, we have a daily routine that works pretty well. With her, I happily and regularly enter the “learning zone” as well. She is constantly learning new things and, now that she is in school, she is learning at a rapid rate. As I’m sure many of you who are parents know, this is challenging and surprising in a pleasant sort of way. Then, my second daughter was born, and I entered the “panic zone.” She is amazing but, although I remembered the newborn phase in the abstract, I’d forgotten all the little challenges of caring for a very small baby. Once you think you’ve mastered one thing, it changes, and you have to adapt all over again within a relatively short period of time. As someone who loves organization, schedules, and routines, I find this an uncomfortable state of being. She is slowly shifting into more predictable patterns as she grows but I’ve decided that the newborn phase is my “panic zone.” I’m much more comfortable with older babies and toddlers.

When we facilitate meetings or workshops, we encourage our clients to invite a diverse group of relevant stakeholders to sit around the table. Many times, those who attend have interacted with one another on a limited basis. We present the learning zone graphic because we know from experience that facilitating a conversation among colleagues who rarely come together to discuss and reflect on ideas can create a challenging environment for some. And, as we tell everyone, we all have different thresholds for these three zones. What is comfortable for some might cause others to panic. For example, I am more likely to hit the “panic zone” when dealing with a newborn, while another mother might hit this zone more readily with a toddler. Since we want everyone to operate in the “learning zone,” we remind practitioners to pay attention to how they and their colleagues are receiving the ideas being discussed so no one enters the “panic zone,” where learning ceases to happen. So, at a time when we are all reflecting on the past year and forming New Year’s resolutions, I find myself thinking how important it is for us all to be honest with ourselves and one another about our thresholds for these different zones so we can spend more time learning and less time panicking.

This year, I was lucky to receive a full scholarship to attend the 42nd annual Museum Computer Network (MCN) conference in Dallas, TX. For those who don’t know, MCN is a fantastic organization that focuses on digital engagement in the cultural sector. Here’s a video of some of the highlights from the conference:

 

 

I’d wanted to attend MCN for a long time after hearing many friends and colleagues rave about the amazing energy and talents of MCN-ers.  Before I left, I set two (very broad) goals for myself for MCN2014:

  1. Deepen my own understanding of how digital is transforming museums
  2. Think about new ways to apply this understanding to my work as an evaluator

Luckily, this year’s conference theme—“Think Big, Start Small” —aligned perfectly with these goals. I figured I would “start small” by going to the conference, all the while remembering to “think big” about the relationship between evaluation and digital transformation in the cultural sector.  And with that in mind, I dove headfirst into the MCN2014 madness.

 

Most of the MCN scholarship recipients (I’m sitting on the bench on the left).

I quickly realized that I was one of the only full-time evaluators there. Despite the high energy of Wednesday’s activities (including a workshop on dabbling with microcontrollers and a series of inspiring Ignite talks), I couldn’t shake the feeling that I was going to be out of my element among all of these “tech” people as I headed to the first official sessions on Thursday. Would anyone understand why an evaluator was at a conference that’s all about digital? Worries aside, I was curious to learn how other attendees were thinking about and creating digital experiences, and how, if at all, they were working to evaluate their impacts (even if we might use different vocabulary to talk about the evaluation process).

My worries were quickly alleviated.  While there may not have been many full-time evaluators at MCN2014, I was blown away by the amount of evaluative thinking I observed in nearly every session I attended (and, frankly, in all of the side conversations I had throughout the conference). Not only are those who work at the intersection of the cultural and digital sectors a highly energetic and creative group of people, but they are also working hard to determine realistic goals for their projects and are thinking seriously about how to measure them. I was both inspired and amazed by the extent to which evaluation was part of the other attendees’ thinking and planning processes. This evaluative thinking showed throughout the conference tweets (#MCN2014 was actually trending on Twitter during the conference), so I figured I’d use a few of my favorites to talk about some of the many ideas I’ve been thinking about ever since I got back from Dallas:

 

[Embedded tweets from Bollwerk and Allen-Greil]

These two perfectly sum up a few ideas that we are constantly thinking about at RK&A. To echo Simon Tanner, there’s no point in gathering data just for the sake of having information. It’s essential to think about why you want to gather data and to outline how you plan to use the data you collect. Without articulating a plan for using the data in the long run, it becomes difficult to ensure that you’re gathering the types of data that will be most useful to you. And having a plan for how you will use the data will ensure that, when it’s time for analysis, you can align your analysis with your long-term goals. At RK&A, we want to make sure our clients clearly understand that data are there to be used, so that when it comes time to make changes based on the data, they are already prepared to do so. However, I think that data are primarily used to test assumptions rather than to confirm them. The word “confirm” is misleading because it presupposes positive assumptions, i.e., that a project is working well. If that’s the case, then it’s understandable that people want to see those positive assumptions confirmed (which in turn would mean having to make few changes). Learning to accept what the data tell you, even when the results are negative, is no simple task. It’s very easy to become so attached to a project that you ignore the problems and only see what’s working well. But remaining “open to surprise” and letting the data shine new light on a project is the best way to develop a true understanding of what’s happening so you can adapt and make changes to help achieve your goals.

 

[Embedded tweet from Birch]

I have mixed feelings about drawing distinctions between testing for user experience and testing for content. In my opinion, the two are separated by a fine line—at what point in any museum interactive, mobile app, game, or other digital experience does the user experience become entirely separate from the content? All content matters in terms of the user experience because the content itself, no matter the particular subject, dictates the experience that visitors/users expect to have. In other words, visitors’/users’ prior expectations of a (digital) experience, and their opinions of the experience after use, inherently depend on the subject matter presented to them; their preconceptions about the particular content at hand are very much a part of their experience. While there are always smaller usability issues that can be addressed without giving much regard to content (the size of a button, for example), I ultimately think that the entire user experience can never be truly separated from the content that supports it. If you change the content, you can’t help but change the experience.

 

Those are just a few of the ideas discussed at MCN2014 that I am still thinking about weeks later. The conference evoked so many interesting issues and questions that I couldn’t possibly go into all of them in one post. Suffice it to say that I left MCN2014 feeling silly for ever being nervous about whether others would perceive the overlaps between the worlds of evaluation and digital. MCN turned out to be a fantastic experience that greatly expanded my own thinking on these issues, and I’m excited to put these new ideas to use in my work and to (hopefully) explore them further at MCN2015 in Minneapolis!

Didn’t make it to MCN2014 but want to view the sessions? Check out MCN’s YouTube channel. And don’t forget to check out the amazing (and short—9 minutes!) Ignite talks.  You can also find all of the conference tweets using the hashtag #MCN2014.

As Randi has shared in some of her posts, we at RK&A value the concept and four actions associated with Intentional Practice—Plan, Align, Evaluate, and Reflect. A few weeks ago, Randi wrote about Align, which she noted is the most complex. Today I write about Reflect, which is probably the most alluring of the four actions. At the same time, it is also the most easily dismissed—the one swept aside as a luxury. In workshops we facilitate, we usually show museum staff the Cycle of Intentional Practice and ask them what percentage of time they spend on each of the four actions. Inevitably, reflection falls short, usually garnering between 5 and 10 percent. Staff tell us this isn’t for a lack of desire or need; they wish they had more time for reflection. But, as is so common in our modern world, we tend to get stuck in the continual act of “doing.” Take, for example, how difficult it has been for me to sit down and write this blog post… Certainly I am not immune.

I love the literal manifestation of reflection, which is when light strikes a surface and bounces around in unusual ways, making us see something we didn’t see before. For me, the allure of reflection in its literal form is not that different from what happens when we talk about reflection in evaluation. When it comes to evaluation, reflection leads to insights, ah-ha moments, and new ways of seeing, thinking, and knowing. This happens when I ask one of my favorite questions, “What does it mean?” And even when reflection is very difficult—for instance, when I ask, “What does failure mean?”—the pay-off is usually worth the pain.

Billboards at Night (Detroit), Knud Lonberg-Holm, 1942

Despite its allure, we can’t avoid the fact that reflection is easily dismissed, postponed, and overlooked. Why is this? The answer may lie partly in how hard and sometimes downright painful it is to reflect. At times it is too difficult to consider those important questions; it feels easier to ignore them and continue “doing.” In the same way, reflection in its literal form can sometimes be painful, such as the way light reflecting off the hood of a car is blinding or when light bouncing around becomes disorienting. But I don’t think discomfort is the barrier—from my experience, it seems the demands (both internal and external) to produce thwart our intentions of taking the time to reflect.

In our work with evaluation, reflection is critical. Without taking the time to reflect on the meaning of data, evaluation results fall flat and hollow. As evaluators it is our duty and privilege to ask and try to answer hard questions about what data means, what it tells us. And we do that. But even though we possess a valuable outsider perspective and can offer significant insights about evaluation findings, the insider perspective is equally important. And, our work is at its best when reflection happens collaboratively between the client and us.

So, we try as often as we can to facilitate a Reflection Workshop at the end of a project. In a Reflection Workshop, we meet with staff from across the museum to collaboratively explore the question, “What have we learned from the evaluation?” We don’t simply present findings; rather, we pose questions and facilitate discussion to help staff explore the meaning of the evaluation findings. And we don’t shy away from negative findings; rather, we use those as opportunities for understanding and growth. The purpose of the workshops is to come to some conclusions about ways to improve the effectiveness of a program. But more than anything, the purpose is simply to take the time to ask challenging questions and think deeply.

At RK&A, we think a lot about intentional practice and we encourage our clients to do the same. In planning meetings and reflection workshops, we ask clients to think about which elements of their work align with their institutional mission and vision (check out Randi’s blog post for more about the challenges of alignment). We push them to consider who might be the right audience for their program or exhibition, and we ask them to talk about the intended outcomes for their projects. Posing these kinds of questions is much easier for an “outsider” to do because we don’t have institutional baggage or a personal connection to a problematic project. As consultants, we aren’t beholden to the way things have always been done. I get it – it can be hard to let go – but seeing clients seek information to make informed decisions is a powerful, exciting process. These clients want more information. They are willing to try new things, to change old (and sometimes new) programs to see if they can improve upon the results. These are museum professionals who want the very best experiences for their visitors.

We recently completed a project with a history museum and the results were, well, not as rosy as one might hope. After explaining the challenges of changing students’ perspectives in a short, one-time museum visit, we started talking about what could be done to increase the effectiveness of the program. One of our suggestions was to increase the time allotted for the program and, rather than spending that extra time in the exhibition, use it to facilitate a discussion with students so they can process and reflect on what they had seen. Changing a program’s format and duration is a difficult task for the museum to undertake – it may require extra staff and certainly a different schedule – but it could make a difference. A few days later, our client asked us if there are any studies showing that longer programs are more effective. After we failed to come up with any examples (if you know of any such studies, please leave a comment), the client asked for another study to see if a longer program leads to a different outcome.

As an evaluator, I want to support museums as they change the way they do their work. Evaluation can provide the necessary information to see if new ideas work. It can give clients the data-based push they need to let go of the way things have always been done and to try something new. If nothing else, the evaluation process can be a forum to remind people that even when you are changing course, there is a place for you on the Cycle of Intentional Practice: Plan, Align, Evaluate, Reflect.

The case study below is from a summative evaluation RK&A did with the Wildlife Conservation Society.  The Madagascar! exhibit at the Bronx Zoo is an indoor exhibit that allows visitors to come face-to-face with wildlife from this island habitat.  The exhibit also features a film, Small Wonders, Big Threats, that addresses environmental challenges the island is facing.

Madagascar! [2009]

A summative evaluation with a zoo

The Wildlife Conservation Society (WCS) contracted Randi Korn & Associates, Inc. (RK&A) to evaluate its new exhibition, Madagascar!, located at the Bronx Zoo. Madagascar! showcases the wildlife and landscapes of the world’s fourth largest island. Built in the historic Lion House, the exhibition transformed the interior, while preserving the historic building’s Beaux-Arts beauty. The exhibition offers opportunities to see the island through the eyes of a conservationist at various interactive stations.

How did we approach this study?

RK&A worked with WCS to clarify its goals and objectives for Madagascar! and to identify criteria to measure visitor outcomes. We conducted a summative evaluation that employed a rigorous, modified pre-test/post-test design to measure visitor learning and attitudinal changes. Through in-depth open-ended interviews, we explored visitors’ attitudes toward and understandings of threats to Madagascar and its animals as well as knowledge of WCS’s conservation efforts on the island. We then scored the interview data using rubrics and compared the achievement of eight objectives by visitors who had not seen the exhibition to visitors who had seen the exhibition.
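
For readers curious what the scoring-and-comparison step can look like mechanically, here is a minimal, hypothetical sketch in Python. The 1–4 rubric scale, the scores, and the use of a rank-based (Mann–Whitney-style) statistic are all invented for illustration; the study's actual analysis procedure is not detailed here.

```python
# Hypothetical sketch of a modified pre-test/post-test comparison:
# each visitor's interview is scored against a rubric, then the score
# distributions of the "not seen" and "seen" groups are compared.

def mann_whitney_u(group_a, group_b):
    """U statistic: count how often a value in group_a exceeds one in
    group_b, with ties counting as half."""
    u = 0.0
    for a in group_a:
        for b in group_b:
            if a > b:
                u += 1.0
            elif a == b:
                u += 0.5
    return u

# Illustrative rubric scores per visitor for one objective
# (1 = interest based on novelty alone ... 4 = specific conservation knowledge)
pre_scores = [1, 2, 2, 1, 3, 2, 1, 2]    # interviewed before seeing the exhibition
post_scores = [3, 4, 2, 3, 4, 3, 2, 4]   # interviewed after seeing the exhibition

u = mann_whitney_u(post_scores, pre_scores)
n_post, n_pre = len(post_scores), len(pre_scores)
effect = u / (n_post * n_pre)  # common-language effect size: P(post > pre)

print(f"mean pre-score:  {sum(pre_scores) / n_pre:.2f}")
print(f"mean post-score: {sum(post_scores) / n_post:.2f}")
print(f"P(post visitor outscores pre visitor): {effect:.2f}")
```

In a real analysis one would use a proper significance test (e.g., `scipy.stats.mannwhitneyu`) and far larger samples; the point here is only the shape of the comparison: score each interview against a rubric, then compare score distributions across the two groups.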

What did we learn?

Findings demonstrate that the exhibition was extremely successful at achieving its goals. Statistically significant findings showed that visitors who experienced the exhibition gained new knowledge, ideas, and beliefs, including: 1) enhanced interest in the animals of Madagascar based on knowledge of their habits, environment, and endangered status (versus interest based solely on novelty); 2) knowledge that Madagascar’s environment and animals are threatened, especially by the loss of trees; and 3) an understanding of why conservation scientists (including those from WCS) are in Madagascar: to study the animals and environment so that they can implement appropriate conservation strategies toward its protection.

What are the implications of the findings?

Even though recent public discourse on global warming has grown substantially, the general public’s familiarity with environmental issues still tends to be vague or even ill-conceived. Yet, findings demonstrate that Madagascar! shifted visitors’ knowledge of conservation science toward a more accurate, specific, and concrete understanding. These positive findings are remarkable when one considers how difficult it is to change people’s knowledge and attitudes, particularly in one relatively short visit to a single exhibition. Through experiences in exhibitions like Madagascar!, visitors assimilate new ideas and perceptions with their pre-existing ideas and perceptions and create new meaning. The exhibition effectively utilized simple low-tech interactive exhibits, large-scale video walls, live interpretation, and intimate, close-up looks at animals to connect visitors to the environments and wildlife of Madagascar. Evaluation results have shown that zoos can be appropriate environments for moving visitors beyond the novelty of seeing wild animals to developing an understanding of where the animals come from, why they are important, and how conservation efforts can protect them.

As I have shared in other posts, I value the concept and four actions associated with Intentional Practice. Of the four quadrants that comprise Intentional Practice—Plan, Align, Evaluate, and Reflect—Align is the most complex, and it comes with baggage; tons and tons of it.

At its essence, alignment requires that staff examine all of their work and actions in the context of the Impact the museum would like to achieve (as depicted in the center of the Cycle). This examination includes considering what they could continue doing because it helps them achieve their intended impact, what they could change because a project falls short of achieving intended results, and what they might stop doing because results do not support the museum’s intended impact. I have witnessed museums struggle with Alignment because it invariably requires some very difficult decisions, and change is the inevitable result of those decisions. Most humans (me included) have trouble with change. Just when things seem to be going well, BAM—something happens and I need to respond by changing something.

Alignment can also become complex and difficult because people’s emotions are involved; and when emotions are involved, decision making becomes a struggle and is met with resistance. Among the three possible actions mentioned above, to stop doing something is the most challenging and truly heart-wrenching. Before people accept that they may need to stop doing something, their first reaction is to dismiss the evidence and exclaim, “That can’t be true; the evaluation must be wrong.” The next response is a very lucid, logical, rational explanation of how great the program really is—it is the public that needs retooling. Then there is panic, and all kinds of thoughts begin to run wild: “How can I stop doing this program (that I love)? How can I stop doing this program that is part of the museum’s tradition? What will I tell the funder? I know this program takes significant resources, but I love doing this program (and so does the funder). If I stop doing this program, what will I do with the void that is created? What will my colleagues say about the fact that I am doing one less program? What will I do instead?” Complicating matters is the feeling of embarrassment that begins to emerge—a very strong emotion.

One reason people become embarrassed is that they think others may perceive that the program failed, and failure is still embarrassing in the museum community, even though so many have written about the value of failure as a way to learn. As an example of how complicated these situations can be: one day an educator called to lament that one of the very important programs the museum had been doing for years was not going as well as it once had. She knew, in her heart, that she needed to reinvent it or drop it altogether. Her greatest fear was her director, who loved the program. The educator was aware that the program attracted a tiny slice of the public the museum intended to serve; her annual review was in a few months, and she feared that dropping the program, or even changing it, would reflect poorly on her, even though it was the right thing to do.

Even though many lament with frustration, “We continue to do things the way we have always done them because it is the way we have always done them,” when there is an opportunity to try something different to reach a better outcome, analyze a situation to stimulate progress, or accept reality and put a program to rest—there is an internal struggle—not in the organization, but a personal struggle. In the Cycle of Intentional Practice, the Align quadrant is the one quadrant that goes deep—becomes personal. Everyone wants to strengthen their museum by aligning what they do with the impact they want to achieve, yet doing so requires a tough-as-nails approach, a relentless focus on the desired result rather than personal feelings about a program, and a recognition that change is inevitable and a complex fact of life.

RK&A’s work with the Perez Art Museum Miami (PAMM) is today’s featured project on the Museum Education Monitor’s (MEM) social media sites! RK&A has been working with PAMM since 2013 to evaluate its Knight School Program, a single-visit program designed to serve all third grade students in Miami-Dade County Public Schools. We began our work together by helping staff articulate and clarify student outcomes and indicators. The program intends to enhance students’ critical thinking skills related to observing and interpreting works of art. We are now in the process of conducting a formative evaluation that will identify the program’s strengths and areas in need of improvement, before finally conducting a summative evaluation in 2015.

Check out the MEM posting for some additional information by visiting these social media sites today!

Web– http://www.mccastle.com/Public/Default.aspx

Facebook– http://www.facebook.com/Museum.Education.Monitor

Twitter– http://twitter.com/mchriscastle

Pinterest – http://pinterest.com/mchriscastle/

YouTube – http://www.youtube.com/user/MChrisC54

FORUM Blog– http://forum.mccastle.com/

 
