Posts Tagged ‘reflect’

I have the honor of writing the last blog post of our 25th year—which just so happens to coincide with the end of the calendar year. It was New Year’s Eve in 1989 when my husband and I arrived in DC after driving from LA. RK&A was born soon thereafter. When I reflect on the last 25 years, I find it impossible not to think about the changes that have taken place in our little evaluation world and the larger museum world. Seriously, a whole new world order has emerged. And all of us at RK&A have tried very hard to move along with those changes so we could continue living our passion—working with museums to help them achieve impact in their communities.

Our intent for this celebratory year was to share our learning, and I hope we have done that for you. Our learning isn’t always linear, obvious, or easy to describe. The very act of writing these 25 posts has helped us process and internalize what we have learned, which helps us continually apply our learning to our practice. Honestly, sometimes we struggled to find a learning topic that we were ready to share. Sometimes the things we were learning felt too new or raw to share; other times we weren’t far enough along in our thinking to have a handle on exactly what we had learned; and sometimes, if we were lucky, through the process of writing, we clarified our thinking and learning. Learning can be a funny thing; new ideas can feel scary—especially if they go against what we are accustomed to thinking or take us out of our comfort zone (see Reflection 24!).

I know learning can be fun (or so I am told), but sometimes learning can be really hard—like the times when we (Okay, I) wrote circles around an idea because the learning hadn’t quite jelled, where I didn’t quite have the words to express my thought, or when my writing sounded murky—obviously not my intent. More and more, though, with each passing year, I have come to respect and take advantage of time—that thing we never seem to have enough of. Time can be my friend if I let it; if I patiently let an idea simmer or if I deliberately take the time to become acquainted with a new way of thinking I can begin to ease into the new idea until it feels a tiny bit more comfortable—comfortable enough for me to begin playing with it. Without any self-imposed pressure (take note—that’s the important part), I just let it roll around in my head until it feels more familiar.

A past New Year’s Day walk; it was chilly but lovely.

So I am working on the obvious emergence of 2015—a seemingly familiar idea, because a new year emerges every 365 days or so, and I’ve lived through enough of them that this shouldn’t be a surprise. The unknown (e.g., the future), like learning, can be scary. And the weather isn’t cooperating either—at this writing it is dreary and gray—not the way I want to end one year and not very welcoming as the start of another. My vegetable garden is soggy and dormant; my front garden lacks interest. But I have hope, because on New Year’s Day I will take my 10-mile walk as I have done for the past many years and ready myself for all kinds of new experiences and learning. I’m getting kind of excited just thinking about it. I do not know what 2015 will bring, but I know my glass will be half full and my learning will be rich. I can just feel it.

Happy holidays to all and a very healthy new year!

Read Full Post »

As Randi has shared in some of her posts, we at RK&A value the concept and four actions associated with Intentional Practice—Plan, Align, Evaluate, and Reflect. A few weeks ago, Randi wrote about Align, which she noted is the most complex. Today I write about Reflect, which is probably the most alluring of the four actions. At the same time, it is also the most easily dismissed—the one swept aside as a luxury. In workshops we facilitate, we usually show museum staff the Cycle of Intentional Practice and ask them what percentage of time they spend on each of the four actions. Inevitably, reflection falls short, usually garnering between 5 and 10 percent. Staff tell us this isn’t for a lack of desire or need; they wish they had more time for reflection. But as is so common in our modern world, we tend to get stuck in the continual act of “doing.” Take, for example, how difficult it has been for me to sit down and write this blog post…. Certainly I am not immune.

I love the literal manifestation of reflection, which is when light strikes a surface and bounces around in unusual ways, making us see something we didn’t see before. For me, the allure of reflection in its literal form is not that different from what happens when we talk about reflection in evaluation. When it comes to evaluation, reflection leads to insights, ah-ha moments, and new ways of seeing, thinking, and knowing. This happens when I invariably ask one of my favorite questions, “What does it mean?” And even when reflection is very difficult—for instance, when I ask, “What does failure mean?”—the pay-off is usually worth the pain.

Billboards at Night (Detroit), Knud Lonberg-Holm, 1942

Despite its allure, we can’t avoid the fact that reflection is easily dismissed, postponed, and overlooked. Why is this? The answer may lie partly in how hard and sometimes downright painful it is to reflect. At times it is too difficult to consider those important questions; it feels easier to ignore them and continue “doing.” In the same way, reflection in its literal form can sometimes be painful, such as the way light reflecting off the hood of a car is blinding or when light bouncing around becomes disorienting. But I don’t think discomfort is the barrier—from my experience, it seems the demands (both internal and external) to produce thwart our intentions of taking the time to reflect.

In our work with evaluation, reflection is critical. Without taking the time to reflect on the meaning of data, evaluation results fall flat and hollow. As evaluators, it is our duty and privilege to ask and try to answer hard questions about what the data mean and what they tell us. And we do that. But even though we possess a valuable outsider perspective and can offer significant insights about evaluation findings, the insider perspective is equally important. Our work is at its best when reflection happens collaboratively between the client and us.

So, we try as often as we can to facilitate a Reflection Workshop at the end of a project. In a Reflection Workshop we meet with staff from across the museum to collaboratively explore the question, “What have we learned from the evaluation?” We don’t simply present findings; rather, we pose questions and facilitate discussion to help staff explore the meaning of the evaluation findings. And we don’t shy away from negative findings; rather, we use those as opportunities for understanding and growth. The purpose of the workshops is to come to some conclusions about ways to improve the effectiveness of a program. But more than anything, the purpose is to simply take the time to ask challenging questions and think deeply.

Read Full Post »

We have been thinking about intentional practice a lot lately.  The article below, written by Randi, appeared in ASTC Dimensions May/June 2008 issue.  If you would like to read more of Randi’s thoughts on intentional practice, be sure to read her 2007 Curator article, “The Case for Holistic Intentionality.”

At museum conferences these days, people are talking about accountability, public impact, and relevance. These ideas are not new. A decade ago, in a 1997 keynote address for the Mid-Atlantic Association of Museums’ 50th anniversary, the late Smithsonian scholar Stephen Weil spoke of the “in-your-face, bottom-line, hard-nosed questions”—the ones that museums often hope to keep under wraps: “Do museums really matter? Can and do museums make a difference?” In arguing that some museums do make a difference, and that all should strive to do so, Weil supported the notion that “the very things that make a museum good are its intent to make a ‘positive difference in the quality of people’s lives.’” He borrowed this last phrase from the United Way of America, which was then challenging its grantees to document the difference a given program had made in participants’ lives.

Today, museums face accountability questions from many directions. In response to the Government Performance and Results Act of 1993, U.S. federal agencies began to articulate the kinds of outcomes they expected grantees to document. Private foundations followed suit, reexamining their own evaluation practices, as well as those of grantees. The effort continues. The National Science Foundation recently published its Framework for Evaluating Impacts of Informal Science Education Projects, outlining five categories of impact it expects grantees to assess. And for those who object that “you can’t measure mission-centered work,” current United Way CEO Brian Gallagher, as reported in the Wall Street Journal, has a succinct reply: “You most certainly can. The question is, ‘Are you committed to do it?’ And then, ‘Are you committed to report on it?’”

As museums begin to grapple with their intent to make a positive difference, they can start by reexamining their missions. Weil believed, as many still do, that a mission is key to an institution’s success. A museum’s mission should be a declaration of its core purpose—clarifying what the museum values, reflecting what the museum embodies, and describing its intent to affect its public and community. Establishing a clear institutional purpose, Weil believed, is the first step toward being able to assess effectiveness in achieving public impact.

From my own experience as an evaluator, I would add this observation: Museums do not, in and of themselves, value, reflect, or intend. People do.

An institution’s mission will not be within reach unless everyone who works in that institution is mission-focused and mission-driven. Before museums can assess their impact, staff must collectively clarify their intent. Public impact, relevance, and value grow from what I have called “intentional practice”—the willingness of everyone in the museum to examine all operational activities through three mission-based filters: clarity of intent, alignment of practice and resources, and reflective inquiry.

  • Clarity of intent. Opportunities for all staff to come together to discuss the core values of their museum are vital. Colleagues should encourage one another to explore their passions and also challenge one another’s thinking as a way of clarifying what is truly of importance. In the spirit of thoughtful inquiry, why not ask a colleague to defend his or her position? Most people appreciate being asked to explain why they think the way they do. This kind of exploration allows practitioners to voice the passion behind their ideas and learn what they, as a group, really care about. Reexamining the essence of the museum together can reinvigorate the collaborative spirit, enabling staff to further their practice with intent.
  • Alignment of practices and resources. Unless the work of the museum is aligned with its intent, staff may spend time and resources on activities that are good in themselves but may not support the museum’s intent. Perhaps staff should determine—through evaluation—which programs yield the highest impact, keep those programs, and either improve or discontinue those that do not deliver impact. Aligning practice—the activities a museum does and how it does them—and resources so they support the museum’s intent requires thinking about what you should be doing and what you need not do any more. Conversations about realignment will deepen staff members’ understanding of the museum’s intent and the ways in which their work supports it.
  • Reflective inquiry. As an evaluator, I frequently see front-end and formative evaluation being used effectively to shape a final visitor experience. The same cannot be said of summative evaluation. By the time a mandated final report is done, practitioners may have little time or motivation to review it. This is unfortunate because much can be learned through reflecting on past work.

I see a strong relationship between taking the time to think about the work you have done and learning from the work you have done. Practitioners who want to be intentional in their practice can use summative evaluation as a way to gain insight and knowledge about visitors’ perspectives and experiences. The outcome of such reflective inquiry is learning about the ways in which their museum is achieving impact. I would encourage all museums to routinely set aside time for staff to use inquiry as a reflection strategy and to discuss their practice in the context of the institution’s intent.

In conclusion, accountability questions are not likely to disappear, but even if they did, museum practitioners would still need to respond to the “To what end?” question. The sustainable health of the museum depends on it.

Most of the workers I encounter in museums are passionate about their work and want to make a positive difference in people’s lives. If practitioners begin collaborating with colleagues to clarify their museum’s intent, realign their practices and resources to support that intent, and engage in reflective inquiry to learn how they can improve their efforts, they will be on their way to achieving that goal.

Read Full Post »

We are excited to introduce a guest blogger this week: Johanna Jones, former Managing Director of RK&A’s San Francisco office. Having been with RK&A for 14 years, Johanna contributed greatly to the company’s learning, and thus, we are happy she agreed to reflect as part of our 25 years of learning series.


As I think back on my work at RK&A, like Stephanie, I am struck by an unintended outcome of the evaluation process; namely, that asking visitors questions about their experience can serve an important interpretive role in museums and for visitors. We don’t often think about the direct value of evaluation on visitors—rather, we focus on evaluation as an essential step in the institutional cycle of learning (see Reflection 2 for the cycle of learning) and in creating an intentional organization, which, of course, benefits visitors. I would argue that the evaluation process itself—the very act of interviewing visitors—serves a powerful interpretive function. When visitors are asked open-ended questions about their visit, they are afforded the time and space to reflect on their experiences. The questions become a framework for thinking about their visit—beyond what they liked and didn’t like—that prompts them to consider, “What does this mean to me?”

I didn’t always see the direct value of evaluation for visitors. When I first started building my evaluation skills, the educator in me worried about imposing on visitors’ time by asking questions. I had some rocky starts—people have been so saturated with market research that they are wary of someone approaching them for feedback. But I quickly learned that if you ask visitors meaningful open-ended questions, you are usually met with meaningful responses. I remember a Vietnam veteran who cried when telling me what the American flag meant to him and the young children who were pumped to save the condors (not exactly warm and cuddly creatures). I recall avid art museum-goers who were amazed to realize that they could interpret works of art for themselves and twenty-somethings who expressed civic pride in a once beleaguered natural history museum. The more I talked to visitors, the more I realized that the evaluations were not only fulfilling the institutions’ need to understand visitors but also visitors’ need to process and make meaning from their museum experience.

Now, as a museum visitor, I still use meaty open-ended questions. I ask them of myself and my family members when we visit museums. And, I must say, I find our conversations richer for it.

Read Full Post »

Welcome to our new Throwback Thursday series, where we take a moment to look back at projects from our archives.  Today we’ll be sharing a case study about our planning and evaluation work with the Science Museum of Virginia and their Sphere Corps Program.  You might recall this particular Science On a Sphere program from one of our prior posts, Learning to Embrace Failure, and today we’ll share a bit more about how we approached the study, what we learned, and the implications of those findings.

Sphere Corps Program [2012]

For this planning and evaluation project with the Science Museum of Virginia (SMV), RK&A evaluated Sphere Corps, a Science On a Sphere program about climate change developed by SMV with funding from the National Oceanic and Atmospheric Administration (NOAA).

How did we approach this study?  

The study was designed around RK&A’s belief that organizations must be intentional in their practice by continually clarifying purpose, aligning practices and resources to achieve purpose, measuring outcomes, and learning from practice to strengthen ongoing planning and actions.  To this end, the Sphere Corps project included several phases of work—a literature review, a workshop to define intended program outcomes, two rounds of formative evaluation, and two reflection workshops.  Formative evaluation data were collected using naturalistic observations and in-depth interviews.  Each phase of work allowed staff to explore their vision for the Sphere Corps program and how it changed over time as they learned from and reflected on evaluation findings.

What did we learn?

SMV staff’s goal was to create a facilitated, inquiry-based Science On a Sphere program about climate change.  RK&A first completed a literature review, which revealed that a facilitated Sphere experience was in keeping with best practices and that using inquiry methods in a 20-minute program would be challenging but worth exploring further.  Staff then brainstormed and honed the outcomes they hoped to achieve in Sphere Corps, which guided planning and script development.  The first round of formative evaluation identified implementation barriers and an overabundance of iClicker questions, all of which created a challenging environment for educators to effectively use inquiry.  Upon reflection, staff reduced the number of iClicker questions and added visualizations and questions that required close observation of the Sphere.

Following a second round of formative evaluation, staff made additional changes to the program script and began to reflect on the reality of using inquiry in a single 20-minute program.  Since the script covered a range of topics related to climate change, staff wondered if they should instead go deeper with one topic while encouraging more visitor observation and interpretation of Sphere data.  Out of this discussion arose the idea of “mini-programs”—a series of programs, each focused on communicating one key idea about climate change, such as helping people understand the difference between weather and climate.

What are the implications of the findings?

Central to the idea of the “mini-program” is the idea of doing less to achieve more.  Impact and outcomes are incredibly difficult to achieve and trying to achieve too much often results in accomplishing very little.  Through a reflection workshop and staff discussion, the SMV team was able to prioritize and streamline the outcomes and indicators originally written for the Sphere Corps program.  Staff also recognized that their primary goal with the Sphere Corps program is to encourage visitors to think more critically about the science behind climate change.  By scaling down the number of topics covered in the presentation, each program could intentionally focus on: (1) one key idea or question related to climate change; (2) achievement of only a few intended outcomes; and (3) implementation of specific facilitation strategies to achieve those outcomes.  Intentionally covering less content also opens up opportunities to more effectively use inquiry methods.

Read Full Post »

Sometimes when learning surfaces slowly, it is barely visible, until one day the world looks different.  Responding to that difference is the first layer of that complex process often labeled as learning.  The Cycle of Intentional Practice was a long time coming—emerging from many years of conducting evaluations, where I worked closely with museum staff and leadership as well as with visitors.  The Cycle of Intentional Practice is an illustration of an ideal work cycle that started to form when I was writing “The Case for Holistic Intentionality.”  I am visually oriented, and I often have to draw my ideas before I write about them; in this case, I was writing about my ideas and then felt the need to create a visualization to depict what I was thinking—in part to help me understand what I was thinking, but also to help others.  I included the first iteration of the cycle in the manuscript for Curator, but the editor said the journal does not usually publish that kind of illustration, so I put it aside.

That original cycle differs from the one I use today—it was simpler (it included “Plan,” “Act,” and “Evaluate”), and while I didn’t know it at the time, it was a draft.  There have been several more iterations over time (one was “Plan,” “Act,” and “Evaluate & Reflect,” for example); as I continue to learn and improve my practice, I change the cycle accordingly.  Most stunning to me was that the first draft of the cycle showed nothing in the center—nothing!  I feel a little embarrassed by my omission, and I am not entirely sure what I was thinking at the time, but I hope my oversight was short-lived.  At some point I placed the word “Intentions” in the center, and as I clarified my ideas, with the hope of applying the cycle to our evaluation and planning work, I eventually replaced “intentions” with “impact.”  I recall how difficult it was to explain the concept of “intentions,” so I eventually needed to remove the word from the center (as much as I loved having it there).  If my goal was to have museums apply the cycle to their daily and strategic work, the cycle needed to represent an idea people found comfortable and doable.  Soon I realized that intentionality was the larger concept of the cycle and that what needed to be placed in the center was the result of a museum’s work on its publics—impact.  So was born our intentionality work with museums.  Then I realized the true power of intentionality—mission could go in the center as well as outcomes, or anything for that matter.  The artist’s rendition below demonstrates the versatility of intentionality as a concept.

Cycle of Intentional Practice

An artistic rendering of the Cycle of Intentional Practice by artist Andrea Herrick

What I find most amazing is that two crucial ideas—reflection and impact—were not present in the first iterations of the cycle, although they were discussed when I talked about intentionality.  Our intentional planning work (which we refer to as impact planning) would be rudderless without the presence of impact and our ability to learn from our work would be weakened without reflection.  And that brings me to another realization, which I am reminded of daily—the never-ending pursuit of achieving clarity of thought, followed by writing a clear expression of that thought.

Today I talk about the Cycle of Intentional Practice as a draft—it will always be on the verge of becoming, but these days I am more comfortable with the idea of the Cycle being a draft—an idea in process—than I was a decade ago; in fact, I have come to realize that all work is a draft and that if one is serious about learning and applying new ideas to work and life, then all ideas, all products, all knowledge are mere drafts because learning is continuous, right?

Humbling?  Yes indeed.

Read Full Post »

This week I’d like to share thoughts about evaluative thinking, in part because two weeks ago I was part of a session at the American Alliance of Museums (AAM) annual conference in Baltimore titled “Evaluation as Learning” (titled as such because learning is the ultimate result of evaluative thinking).  We took a risk: I set the stage by presenting the Cycle of Intentional Practice (see our first blog post) with a distinct focus on the reflection quadrant, and the three panelists were each allotted five minutes to present a “story”; we used the remaining time to ask the audience questions (rather than having them ask us questions).  Over the years, AAM has inadvertently trained session attendees to expect 60 minutes of panelists’ presentations (and sometimes more) and 5 or 10 minutes of Q & A in which the audience poses questions to panelists.  Rarely is a session intentionally flipped so that the bulk of its time (50 of 75 minutes) is used to ask attendees questions.  We all wondered if we should ask our friends to attend the session so our queries wouldn’t be met with silence.

We didn’t surprise the audience with this strategy; we were transparent and gave them a heads-up by saying: “Our intention today is to share brief stories about how we have used evaluation as a learning tool (rather than a judgment tool).  Along the way we will be highlighting and clarifying evaluative thinking, and each presenter will spend 5 minutes doing this.  Our intention is also this: we will spend the remaining time asking you questions, in a sense, to model the kind of inquiry that organizations can use to engage in evaluative thinking.  We want to hear your thoughts and reflections, and we welcome you to challenge our thoughts and push us beyond where we are; then all of us will be using inquiry and reflection to pursue learning—the ultimate goal of evaluative thinking.”

Evaluative thinking (ET) is an intentional, enduring process of questioning, reflecting, thinking critically, learning, and adapting.  While learning is at the essence of ET, adapting (one’s thinking or behaviors) is the challenge.  An underlying thread in our presentation supported a fact about evaluative thinking—evaluative thinking is effective and meaningful when it is ingrained in the organization’s culture and the responsibility of everyone—leadership and staff.

Evaluative thinking is embedded in intentional practice and the reflection quadrant is essential, as learning is not likely to happen without people taking the time to ask the tough questions and reflect on reality (e.g., evidence of performance) and practice.  When evaluation as learning is pursued, it can be a catalyst for personal learning, interpersonal learning, project learning, organizational learning, and field-wide learning.

For more on evaluative thinking, check out:

Preskill, H. and Torres, R. T. (1998). Evaluative inquiry for learning in organizations. Newbury Park, CA: Sage.

Read Full Post »
