Posts Tagged ‘evaluative thinking’

Working in research and evaluation, you become very skeptical of words like “data-driven” and “research-based.” To evaluators, it is quite flattering that these words are so buzzworthy; yes, we want our research and evaluation work to be important, used, and even desired! But even though these buzzwords grab attention, they can be misleading. For instance, when we talk about data and research at RK&A, we mean original, first-hand data and research, such as interviews, questionnaires, and surveys with museum visitors.

This was on my mind as I recently had the opportunity to help my CARE (Committee on Audience Research and Evaluation) colleague Liz Kunz Kollmann review session proposals for the 2015 AAM Annual Meeting. Liz, as CARE’s representative on the National Program Committee, was charged with reviewing sessions in the Education, Curation, & Evaluation track (all 141 sessions!) alongside fellow committee members from education and curation. Given that audience research and evaluation can be part of many AAM tracks (marketing, development, exhibit design, etc.), Liz recruited several CARE members to help review sessions in other tracks and flag any outside our designated track that CARE should advocate for.

I volunteered to review sessions in the Development & Membership and Finance & Administration tracks. I had expected to encounter plenty of buzzwords, since AAM session proposals include a description that must be suitable for display on the AAM website, mobile app, and other published meeting materials. So while I wasn’t surprised, I was struck by the heavy use of terms like “data-driven” and “research-based” (e.g., data-driven strategies for membership recruitment, research-based programming), and I was stymied in trying to determine whether these sessions were relevant to CARE: what data was driving the decisions, and was it really of interest to CARE members?

Certainly I am not dismissive of research or data that isn’t “original.” There are many definitions of research and data, each applicable to certain scenarios and certain fields. For instance, arts-based research, when conducted well, is a completely valid field of research within art education. However, I am biased toward collecting original data from visitors first-hand, which is why terminology like “data-driven” and “research-based” makes my ears perk up: these words prompt many questions for me about the type of data and research involved and its appropriateness for informing the decisions and practices in question. Through our work at RK&A, we truly want practitioners to make decisions that are data-driven; that is the greatest outcome of our work! But we also want our clients to be skilled users and consumers of data and evaluation, so much so that their ears perk up at the very mention of “data,” because hopefully they, too, have become savvy digesters of both the language and the meaning behind the data when talking about research and evaluation.

Check out our Buzzword Bingo below, inspired by Dilbert (http://dilbert.com/strips/comic/2010-10-25/). Warning: this Bingo is informed by RK&A’s professional experience and is not based on original data. Maybe with the help of our museum colleagues, we can make it “research-based.” Please share your buzzwords!

[Buzzword Bingo card]


This week I’d like to share thoughts about evaluative thinking, in part because two weeks ago I was part of a session at the American Alliance of Museums (AAM) annual conference in Baltimore titled “Evaluation as Learning” (so named because learning is the ultimate result of evaluative thinking). We took a risk: I set the stage by presenting the Cycle of Intentional Practice (see our first blog post) with a distinct focus on the reflection quadrant, the three panelists were each allotted five minutes to present a “story,” and we used the remaining time to ask the audience questions (rather than having them ask us questions). Over the years, AAM has inadvertently trained session attendees to expect 60 minutes (and sometimes more) of panelists’ presentations followed by 5 or 10 minutes of Q & A in which the audience poses questions to the panelists. Rarely has a session been intentionally flipped so that the bulk of its time (50 of 75 minutes) is used to ask attendees questions. We all wondered if we should ask our friends to attend the session so our queries wouldn’t be met with silence.

We didn’t surprise the audience with this strategy; we were transparent and gave them a heads-up by saying: “Our intention today is to share brief stories about how we have used evaluation as a learning tool (rather than a judgment tool). Along the way we will be highlighting and clarifying evaluative thinking, and each presenter will spend five minutes doing this. Our intention is also this: we will spend the remaining time asking you questions, in a sense to model the kind of inquiry that organizations can use to engage in evaluative thinking. We want to hear your thoughts and reflections, and we welcome you to challenge our thoughts and push us beyond where we are; then all of us will be using inquiry and reflection to pursue learning, the ultimate goal of evaluative thinking.”

Evaluative thinking (ET) is an intentional, enduring process of questioning, reflecting, thinking critically, learning, and adapting. While learning is the essence of ET, adapting (one’s thinking or behaviors) is the challenge. An underlying thread of our presentation was that evaluative thinking is effective and meaningful when it is ingrained in an organization’s culture and is the responsibility of everyone, leadership and staff alike.

Evaluative thinking is embedded in intentional practice, and the reflection quadrant is essential: learning is not likely to happen unless people take the time to ask tough questions and reflect on reality (e.g., evidence of performance) and on practice. When evaluation is pursued as learning, it can be a catalyst for personal, interpersonal, project, organizational, and field-wide learning.

For more on evaluative thinking, check out:

Preskill, H., & Torres, R. T. (1998). Evaluative inquiry for learning in organizations. Newbury Park, CA: Sage.
