Evaluation Conference Takeaways: What We Learned


In late 2022, three Wilder Research staff attended the American Evaluation Association annual conference in New Orleans. Sera Kinoglu, Jackie Aman, and Alissa Jones all embraced the chance to build their skills, network with other evaluators, and bring back new ideas and methods to share with staff and partners. Here they each share a takeaway from the conference. 

The power of stories: Sera Kinoglu

As a researcher with a bias toward qualitative inquiry, I was excited to learn about a methodology that combines storytelling and participatory action research to illustrate impact. The Most Significant Change (MSC) technique was developed by Rick Davies and Jess Dart and has been in use for nearly 20 years. Unlike more frequently used methods such as interviewing or surveying, MSC seeks the “outer edges” of experience rather than the most common themes among respondents. In this way, MSC captures the diversity of perspectives and experiences within a group of participants and empowers them to tell their own stories. Further, stories have a unique ability to convey complexity and context, and they can carry difficult messages in a digestible way.

The unique value of MSC lies in the dialogue that occurs once stories are collected. Project stakeholders read the stories, then discuss the impact they see; this review of first-hand accounts can provide powerful insight into what impact looks like “on the ground.” Selecting the “most significant change” from a group of stories can also illuminate what stakeholders truly value, or reveal unexpected outcomes. When paired with more traditional methods (it works particularly well alongside monitoring, evaluation, and learning), MSC offers an additional lens through which to gather information about complex outcomes.

Decolonizing quantitative data visualization: Jackie Aman

Aside from the exhilaration of being back together with so many colleagues (Coffee meetings! With humans! In person!), my main takeaway from AEA 2022 was the importance of decolonizing data visualization. I realized that my efforts to decolonize evaluation have largely focused on how we engage communities, honor more types of knowledge and experiences, and build respectful methods and designs. Yet several AEA presenters called us to do better in decolonizing all parts of the evaluation cycle – especially the reporting and visualization of quantitative data. Specifically, I was blown away by the work and insight of Pieta Blakely. Two of her points stood out:

  • Humanize large datasets with jitterplots or beeswarm plots. Solid lines showing trend data over time can mask the different experiences they represent, and our brains can forget that those trend lines stand for the lives of many individuals. A jitterplot or beeswarm plot, which represents each person with a dot, better illustrates both the diversity and the magnitude of the experiences behind the data. Check out Pieta’s article (and stunning visuals) for an example of the same data shown as a bar chart versus a jitterplot.
  • Create separate charts for each racial/ethnic group when disaggregating data by race, to avoid deficit framing and promote targeted universalism. In her portion of the Disaggregating Data Is Not Enough session at AEA (and in her article above), Pieta cautions against displaying data disaggregated by race in a single chart. Combining trend lines for each race into one chart can implicitly center the white data line as the goal for all other races/ethnicities, and it can reinforce negative stereotypes by omitting the context and history needed to understand the data. Instead, create a separate chart for each race/ethnicity that uses an explicit goal line – not, by default, the white data line – as the target outcome. Separating each racial/ethnic category into its own chart also supports the idea that different populations may need varying levels of support or different types of interventions to reach that goal line. (A minimal code sketch after this list illustrates both of these ideas.)
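
To make these two ideas concrete, here is a minimal, hypothetical sketch in Python using matplotlib. It is not drawn from Pieta’s article or the AEA session; the group names, scores, and goal value are all invented for illustration. Each panel shows one group, every dot is one person (jittered horizontally so overlapping values stay visible), and the same explicit goal line appears in every panel instead of any one group’s line serving as the benchmark.

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(42)

    # Synthetic outcome scores for four illustrative groups (not real data).
    groups = {
        "Group A": rng.normal(62, 12, 180),
        "Group B": rng.normal(55, 15, 120),
        "Group C": rng.normal(70, 10, 90),
        "Group D": rng.normal(58, 14, 150),
    }
    GOAL = 75  # a program-defined target, deliberately not any group's average

    fig, axes = plt.subplots(1, len(groups), sharey=True, figsize=(10, 4))
    for ax, (name, scores) in zip(axes, groups.items()):
        # Jitter dots horizontally; every dot is one person, so group size
        # and spread stay visible in a way a single trend line would hide.
        x = rng.uniform(-0.25, 0.25, size=scores.size)
        ax.scatter(x, scores, s=12, alpha=0.5)
        # The same explicit goal line appears in every panel.
        ax.axhline(GOAL, linestyle="--", linewidth=1, color="black")
        ax.set_title(name)
        ax.set_xticks([])
        ax.set_xlim(-0.5, 0.5)

    axes[0].set_ylabel("Outcome score")
    fig.suptitle("Each person as a dot, one panel per group, shared goal line")
    fig.tight_layout()
    plt.show()

Seaborn’s stripplot and swarmplot functions achieve the same effect with less code; this sketch sticks to plain matplotlib so the jittering itself stays explicit.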

Choosing culturally responsive approaches: Alissa Jones

Do you ever contemplate the differences between various culturally responsive and equity-focused evaluation approaches and which will best meet the needs of your project? If so, I have an excellent resource for you! I attended a session by Felisa Gonzalez from The Colorado Trust, and Blanca Guillén-Woods and Katrina Bledsoe from Strategy Learning Partners for Innovation. They developed an interactive tool, the Eval Matrix, that allows users to select and compare evaluation approaches and key principles.

What’s included?

  • The creators completed an extensive literature review to capture the nuances of seven philosophies and approaches in an easily digestible format. For example, culturally responsive evaluation centers culture and the inclusion of community members, whereas culturally responsive and equitable evaluation adds a further layer by explicitly aiming at equity.
  • Key principles provide additional context about how evaluation is carried out under each approach.
  • Focus areas offer considerations for evaluations that examine individual, interpersonal, or structural systems.

Most importantly, the presenters shared that their website is a work in progress and will be updated as concepts evolve in the literature. I highly encourage folx to explore the website and make use of the tool. I think we often default to the approach we are most comfortable and familiar with; this tool reminds us to use the one that best meets our stakeholders’ needs. I wish this tool had been available when I was taking my Program Evaluation Theory and Models course! For all future grad students—you’re welcome.

Sera Kinoglu and Jackie Aman are research scientists at Wilder Research. Alissa Jones is the associate director of operations at Wilder Research.

Photo: The authors, standing in front of a wall of paintings in a New Orleans restaurant, with fellow conference attendee Joanne Moze of Blue Cross Blue Shield Minnesota.