Archive for February, 2014

Welcome to our new Throwback Thursday series, where we take a moment to look back at projects from our archives.  Today we’ll be sharing a case study about our planning and evaluation work with the Science Museum of Virginia and its Sphere Corps program.  You might recall this particular Science On a Sphere program from one of our prior posts, Learning to Embrace Failure, and today we’ll share a bit more about how we approached the study, what we learned, and the implications of those findings.

Sphere Corps Program [2012]

For this planning and evaluation project with the Science Museum of Virginia (SMV), RK&A evaluated Sphere Corps, a Science On a Sphere program about climate change developed by SMV with funding from the National Oceanic and Atmospheric Administration (NOAA).

How did we approach this study?  

The study was designed around RK&A’s belief that organizations must be intentional in their practice by continually clarifying purpose, aligning practices and resources to achieve purpose, measuring outcomes, and learning from practice to strengthen ongoing planning and actions.  To this end, the Sphere Corps project included six phases of work—a literature review, a workshop to define intended program outcomes, two rounds of formative evaluation, and two reflection workshops.  Formative evaluation data were collected using naturalistic observations and in-depth interviews.  Each phase of work allowed staff to explore their vision for the Sphere Corps program and how it changed over time as they learned from and reflected on evaluation findings.

What did we learn?

SMV staff’s goal was to create a facilitated, inquiry-based Science On a Sphere program about climate change.  RK&A first completed a literature review, which revealed that a facilitated Sphere experience was in keeping with best practices and that using inquiry methods in a 20-minute program would be challenging but worth exploring further.  Staff then brainstormed and honed the outcomes they hoped to achieve in Sphere Corps, which guided planning and script development.  The first round of formative evaluation identified implementation barriers and an overabundance of iClicker questions, all of which created a challenging environment for educators to use inquiry effectively.  Upon reflection, staff reduced the number of iClicker questions and added visualizations and questions that required close observation of the Sphere.  Following a second round of formative evaluation, staff made additional changes to the program script and began to reflect on the reality of using inquiry in a single 20-minute program.  Since the script covered a range of topics related to climate change, staff wondered whether they should instead go deeper with one topic while encouraging more visitor observation and interpretation of Sphere data.  Out of this discussion arose the idea of “mini-programs”—a series of programs, each focused on communicating one key idea about climate change, such as helping people understand the difference between weather and climate.

What are the implications of the findings?

Central to the “mini-program” is the idea of doing less to achieve more.  Impact and outcomes are incredibly difficult to achieve, and trying to achieve too much often results in accomplishing very little.  Through a reflection workshop and staff discussion, the SMV team was able to prioritize and streamline the outcomes and indicators originally written for the Sphere Corps program.  Staff also recognized that their primary goal with the Sphere Corps program is to encourage visitors to think more critically about the science behind climate change.  By scaling down the number of topics covered in the presentation, staff could intentionally focus each program on: (1) one key idea or question related to climate change; (2) achievement of only a few intended outcomes; and (3) implementation of specific facilitation strategies to achieve those outcomes.  Intentionally covering less content also opens up opportunities to use inquiry methods more effectively.

Read Full Post »


So often we evaluators are asked to measure outcomes or results, which of course align with our expectations.  When we conduct an evaluation and the results are positive, an organization can wave its flag, and ideally the whole museum field benefits from learning why a particular exhibition or program is so successful at achieving its outcomes.  During my time as an evaluator, I have learned that there is enormous value in walking before running.  Because measuring results sounds compelling to museums and their funders, museums often jump over important evaluation processes and rush into measurement.  Accordingly, staff, in a moment of passion, forgo front-end and formative evaluation—those early stages of concept testing, prototyping, and piloting a program—that help them understand the gaps between the intended outcomes for their audience and the successes and challenges of implementing a new project.

So, when we are asked to measure results, we always ask the client whether the project has ever been evaluated.  Even then, we may pull back on the reins to slow our clients down enough to consider the benefits of first understanding what is and is not working about a particular program or exhibition.  More often than not, slowing down and using front-end and formative evaluation to improve the visitor experience increases the likelihood that staff will be rewarded with positive results when they measure outcomes later.  In fact, when an organization’s evaluation resources are limited, we often advocate for conducting a front-end and/or formative evaluation because we believe that is where all of us will learn the most.  It is human nature to want to jump right into the good stuff and eat our dessert first.  We, too, get excited by our clients’ passion and have to remind ourselves of the value of taking baby steps.  So, one of the many lessons I’ve learned (and am still learning) is that when it comes to evaluation, encouraging practitioners to walk before they run (or test before they measure) is key to a successful project and their own personal learning.

Read Full Post »

RK&A’s Stephanie Downey will moderate a session, Getting the Most from Evaluation, for the New York City Museum Educators Roundtable on Wednesday, February 12th at 6:00 pm.  Panelists from The Bruce Museum of Arts and Science, The Wildlife Conservation Society, and The Metropolitan Museum of Art will join her to share their lessons learned and best practices from evaluation projects.

For more information, please click HERE.

Read Full Post »

Sometimes when learning surfaces slowly, it is barely visible, until one day the world looks different.  Responding to that difference is the first layer of that complex process often labeled as learning.  The Cycle of Intentional Practice was a long time coming—emerging from many years of conducting evaluations, where I worked closely with museum staff and leadership as well as with visitors.  The Cycle of Intentional Practice is an illustration of an ideal work cycle that started to form when I was writing “A Case for Holistic Intentionality.”  I am visually oriented and often have to draw my ideas before I write about them; in this case, I was writing about my ideas and then felt the need to create a visualization to depict what I was thinking—in part to help me understand my own thinking, but also to help others.  I included the first iteration of the cycle in the manuscript I submitted to Curator, but the editor said the journal does not usually publish that kind of illustration, so I put it aside.

That original cycle differs from the one I use today—it was simpler (it included “Plan,” “Act,” and “Evaluate”), and while I didn’t know it at the time, it was a draft.  There have been several more iterations over time (one was “Plan,” “Act,” and “Evaluate & Reflect,” for example); as I continue to learn and improve my practice, I change the cycle accordingly.  Most stunning to me was that the first draft of the cycle showed nothing in the center—nothing!  I feel a little embarrassed by my omission, and I am not entirely sure what I was thinking at the time, but I hope my oversight was short-lived.  At some point I placed the word “Intentions” in the center, and as I clarified my ideas, with the hope of applying the cycle to our evaluation and planning work, I eventually replaced “intentions” with “impact.”  I recall how difficult it was to explain the concept of “intentions,” so I needed to remove the word from the center (as much as I loved having it there).  If my goal was to have museums apply the cycle to their daily and strategic work, the cycle needed to represent an idea people found comfortable and doable.  Soon I realized that intentionality was the larger concept behind the cycle, and what needed to be placed in the center was the result of a museum’s work on its publics—impact.  So was born our intentionality work with museums.  Then I realized the true power of intentionality—mission could go in the center as well as outcomes, or anything for that matter.  The artist’s rendition below demonstrates the versatility of intentionality as a concept.

An artistic rendering of the Cycle of Intentional Practice by artist Andrea Herrick

What I find most amazing is that two crucial ideas—reflection and impact—were not present in the first iterations of the cycle, although they were discussed when I talked about intentionality.  Our intentional planning work (which we refer to as impact planning) would be rudderless without the presence of impact, and our ability to learn from our work would be weakened without reflection.  And that brings me to another realization, one I am reminded of daily—the never-ending pursuit of clarity of thought, followed by a clear expression of that thought.

Today I talk about the Cycle of Intentional Practice as a draft—it will always be on the verge of becoming—but these days I am more comfortable with the idea of the Cycle being a draft, an idea in process, than I was a decade ago.  In fact, I have come to realize that all work is a draft: if one is serious about learning and applying new ideas to work and life, then all ideas, all products, all knowledge are mere drafts, because learning is continuous, right?

Humbling?  Yes indeed.

Read Full Post »