February 23, 2022

Function over form: Applying best-fit evaluation methods to drive learning and action

Written by: Michael Moses

This piece was originally posted on the American Evaluation Association's AEA365 blog.

Hey there, I’m Michael Moses, a Senior Monitoring, Evaluation, and Learning Specialist at EnCompass LLC. We work with U.S. government agencies, private foundations, multilateral organizations, and others that are tackling complex development challenges, from systemic corruption to human trafficking. We help our clients and their partners:

  • Develop and implement strategies for addressing these challenges
  • Collect and use data to assess progress
  • Capture and apply lessons to adapt and improve

Those working in the social sector increasingly acknowledge that there are no out-of-the-box solutions for complex challenges. We know that simply replicating and scaling approaches across contexts is at best not useful and at worst actively harmful. Instead, those who seek change need to be able to work collaboratively and adaptively, hand-in-hand with local partners, and iterate their way to strategies and solutions that best fit the contexts in which they’re working. A utilization-focused, developmental approach to evaluation, which centers on facilitating and connecting just-in-time reflection, learning, and action, is essential to driving change in complex systems.

Lessons Learned

In practice, this sort of action-driven evaluation work looks radically different from client to client, and context to context. But over the years, we’ve seen that if you’re trying to help folks learn how to address complex challenges, the principles you bring to bear are just as important as the evaluation methods you use. Three principles seem especially important.

Function Over Form: Don’t be shy about picking and choosing from different methodologies and combining them in ways that strengthen their collective usefulness, so that you can generate the evidence and learning people need to make decisions. Adaptive bricolage (hat tip to Tom Aston for his great articulation of this sort of approach) may not result in traditional-looking evaluations, but combining different methods—from most significant change to outcome harvesting to participatory action research—can and often does make for a more useful evaluation.

Collective Learning at the Center: Evaluation is about enabling collective learning and action, not just about accountability. Written reports are important, but providing opportunities for processing results and data, including as they emerge, is just as important. Opportunities for collective, reflective learning—workshops, dashboards, before- and after-action reviews, and more—help partners and clients link lessons to potential action and make informed choices about next steps.

Collaboration to Facilitate Inclusive Action: Evaluation decisions—including decisions around whether an evaluation is desirable in the first place, its focus, and intended users—should involve not just funders, but also the partners and colleagues with whom they work to bring about change. Ensuring that a variety of stakeholders are involved in all stages of the process helps to ensure that our evaluation work answers the needs of those most likely to make a difference, not just those paying for the work.

Balancing these principles, and ensuring they’re consistently put into practice, is hard! It takes effort, and often requires navigating tricky conversations with clients and partners. But when we get the balance right, we consistently see that best-fit evaluation work has promise. We’re excited to do more of this work in the future.

How do you balance form and function in your evaluation work? We’d love more partners as we chart the path toward more inclusive developmental evaluation efforts, and would love to hear from you.

Rad Resources

Aston, Tom (April 2020). Bricolage and Alchemy for Evaluation Gold: In this short blog, Tom Aston argues that combining and adapting evaluation methods can make for a stronger, more useful evaluation approach.

Darling, Marilyn. (2018). How Complex Systems Learn and Adapt: In this brief, Marilyn Darling explains complex adaptive systems theory, and explores how learning can speed adaptation.

Burns, Danny and Worsley, Stuart. (2015). Navigating Complexity in International Development. Practical Action Publishing: Burns and Worsley’s seminal book on how to embed learning in complex systems.

Falconer-Stout, Zachariah and Jones, Jonathan. (2020). Utilization-Focused Evaluation: Recommendations: Two of my colleagues at EnCompass explain how to make evaluation recommendations useful and usable.

Michael Moses

Senior MEL Specialist

Michael Moses is a Senior Monitoring, Evaluation, and Learning (MEL) Specialist, working with the EnCompass team for the MacArthur Foundation’s Big Bet On Nigeria and the process evaluation of the Program to End Modern Slavery. He has more than 10 years of strategy and MEL experience. Previously, he was a senior global MEL specialist at DAI, providing strategy, facilitation, and MEL expertise across project delivery, new business, and knowledge management efforts. Mr. Moses also spent more than six years at Global Integrity, where he held several positions, including Managing Director of Programs and Learning. He has experience working in Bulgaria, Nigeria, South Africa, Kenya, Tunisia, Malawi, Georgia, the Philippines, Mexico, Colombia, Ireland, the United Kingdom, and India. He received his MA in International Development from Georgetown and his BA in Philosophy and Political Science from the University of Notre Dame. Mr. Moses is fluent in Bulgarian and functional in Spanish.
