People talking at a lunch table

Evaluation Café 2023-24

September 6, 2023
Chris Lysy
Founder
FreshSpectrum Information Design

Do any of these describe you?

  • Are you a program evaluator, researcher, data analyst, or someone else who regularly shares data and evidence with others?
  • Do you care enough about your work to actually share it with others?
  • Do you want to create better reports that are more accessible and reach more people?
  • Do you feel like your reports are either too long or too short, with absolutely no in between?
  • Does the idea of creating more reports just sound like tons more work?
  • Not sure whether you should create infographics, dashboards, slidedocs, or something else entirely?

Join Chris Lysy, founder of freshspectrum.com and author of The Reporting Revolution, as he discusses why we need to think beyond the PDF and walks you through building your own modern reporting strategy. As a bonus, everyone who attends will also get access to a free modern reporting strategy template you can use to start crafting your own process.

 


September 20, 2023
Rebecca Teasdale
Assistant Professor
University of Illinois Chicago, College of Education

Evaluative criteria represent values about what a high-quality or successful intervention "looks like." These implicit or explicit criteria direct evaluators' lines of inquiry, including which evaluation questions are asked, which data are collected and analyzed, and which conclusions are reached and reported. Community members, program participants, staff, leaders, funders, and evaluators often hold varying values. Thus, evaluators are charged with the complex tasks of identifying relevant values, specifying appropriate criteria, and applying those criteria to direct inquiry. This presentation will introduce an empirically supported model of evaluative criteria developed to guide evaluation practice. Discussion will highlight how the framework can be used to support criteria specification, make criteria more explicit, broaden the range of values that shape evaluative inquiry, and clarify evaluation design and reporting. The presentation will also explore current research on evaluation aimed at refining the model and deepening understanding of practice.

 

October 18, 2023
Mindelyn Anderson
Founder and Principal
Mirror Group
 
 
Culturally Responsive and Racially Equitable Evaluation (CRREE) takes time to consider how diversity, assessment, inclusion, community engagement, and equity all shape what is possible for evaluation to be meaningful, accessible, and actionable for learning organizations and the communities we love (Mirror Group). In this presentation, Mindelyn Anderson, PhD, Founder + Principal of Mirror Group, will walk through a real-life "CRREE Makeover" of a social services program evaluation, in which Mirror Group partnered with nonprofit leadership, program administrators, the funder, and community members to design and conduct an evaluation that met every partner's needs.
 
 

November 1, 2023
Allison Corbett and Laura Gonzales
 
In this conversation on strategies for multilingual research, Laura Gonzales and Allison Corbett will challenge assumptions about how to conduct research with and for multilingual communities. Laura Gonzales will address how to adequately assess linguistic needs within a community by going beyond traditional sources of demographic data, and will discuss strategies for measuring the impact of multilingual initiatives through community engagement. Allison Corbett will introduce the framework of language justice as a means of holistically designing multilingual survey research and will share the approach taken in the 2021 redesign of the New York City Housing and Vacancy Survey as a case study.
 
 

November 15, 2023
Emily Gates
Assistant Professor
Boston College
 
As an evaluator, I've grown up on the "values branch" of the evaluation theory tree. To me, evaluating is all about assessing the value of something. What value means depends on the criteria we use. Criteria support the validity, credibility, and utility of an evaluation. Yet research on evaluation suggests that most evaluators do not set criteria; clients set criteria, and evaluators use them (often implicitly). This is problematic, and in this talk I will unpack why. Criteria are the heart of evaluation: a chance to ensure that evaluative processes center on what really matters. Criteria should be set purposefully, with participatory input from those involved in and affected by an intervention. There are many ways to do this, and I will share a few examples. After making a case for the importance of evaluative criteria, I will address two critiques: one from the evidence-based policy and practice movement, and another emerging from work on systems change and transformation. These critiques, in different ways, raise the question of whether criteria really are the heart of evaluation, a question I believe is worth revisiting as a field.
 
 
December 6, 2023
Nicky Bowman, Michael Harnar, and Bagele Chilisa
 
In July 2023, The Evaluation Center at Western Michigan University partnered with the International Evaluation Academy to publish a special issue of the Journal of MultiDisciplinary Evaluation (JMDE) titled "Decolonizing Evaluation: Towards a Fifth Paradigm." This was considered a watershed for the International Evaluation Academy: its first scientific output, and on such an important topic. The special issue was edited by Bagele Chilisa and Nicky Bowman, with managerial support from Michael Harnar, current co-executive editor of JMDE. The collaboration among the partners was deep and at times intimate, as they negotiated how best to respectfully dive into the challenging topic of "decolonizing evaluation." As Ian Goldman wrote in his Foreword, this was to be "an important contribution to rethinking how evaluation can be fit for purpose in a world of polycrises, where the traditional Western growth models have led us to the brink of catastrophe. We need different ways of looking at the world, and learning from indigenous and decolonized approaches will be key in us garnering the wisdom to learn and transform how we use evaluation in the service of humanity and Nature, rather than for continued exploitation and despoliation" (p. 1). In this Eval Café presentation, Bagele, Nicky, and Michael will reflect on the collaboration that brought this important contribution to the open-access evaluation canon and discuss some of the many values-laden, decolonized, and cultural conversations they had in the process.
 
 

Professional photo of Min Ma
January 24, 2024
Min Ma
Founder and Principal
MXM Research Group
 
Data is more than spreadsheets and dashboards. Contrary to what many of us are taught about research, data always comes from a perspective. This presentation explores what it looks like to create and work with data in a way that centers justice, equity, and inclusion. It introduces the Data Equity Deck, a card deck developed by MXM Research Group to move teams and individuals through reflection and action around data equity. Min will share examples of how the questions and activities posed in the deck have made a difference to her team's evaluation practice.
 
 

Professional photo of Thomas Archibald
February 14, 2024
Thomas Archibald
Executive Director, Center for International Research, Education, and Development
Virginia Tech
 
Practical wisdom, often equated with the Aristotelian intellectual virtue of phronesis, was described by Ernie House as "doing the right thing in the special circumstances of performing the job." According to Tom Schwandt, it "is about knowing what is right, good, or best to do in a particular set of circumstances. It is pragmatic, context-dependent, involves deliberating a course of action based on principles and values, and informed by critical reflection," incorporating analytical reasoning, practical reasoning, deliberation, reflection, and praxis. Practical wisdom is especially important as an antidote to the technical-rationalistic, instrumental style of reasoning and professional knowledge production that sometimes characterizes evaluation. This is why an edited volume on the subject was published in 2023: Practical Wisdom for an Ethical Evaluation Practice (Hurteau and Archibald, editors; Information Age Publishing). This presentation will provide a summary of the book and suggest ways in which all evaluators can further emphasize practical wisdom in their work.
 
 

Professional photo of Bianca Montrosse-Moorhead
February 28, 2024
Dr. Bianca Montrosse-Moorhead
Associate Professor, Research Methods, Measurement, and Evaluation Program
University of Connecticut
 
In a forthcoming chapter, we (Montrosse-Moorhead and Bitar) revisit and visualize Scriven's (1991) extended metaphor, The Country of the Mind, including the architectural blueprints for the Evaluation Building. We use these metaphorical foundations and draw from recent evaluation scholarship to propose an amended definition of evaluation that clarifies the always-present moral dimensions of evaluation. Our amended definition has significant implications for evaluation, which are discussed in the chapter. In this Evaluation Café presentation, Dr. Montrosse-Moorhead will preview the Country of the Mind map, the location of the Evaluation Building on that map, and a close-up of the building itself. None of these have been visualized before in published scholarship. Dr. Montrosse-Moorhead will also share the proposed amended definition and why it is necessary, along with the implications of adopting it: for evaluation practice; for the instruments, methods, and techniques we use; and for evaluation's theoretical and metatheoretical scholarship.
 
 

Professional photo of Felicia Bohanon
March 13, 2024
Felicia R. Bohanon, Ed.D.
Executive Director, Office of Precollegiate Programs
Northern Illinois University
 
The active inclusion of new and emerging perspectives in the evaluation field fosters diversity, sustainability, and the ongoing evolution of theory, methods, and practice. As the American Evaluation Association (AEA) examines its priorities and develops a new strategic plan in 2024, a core consideration is how the field can foster an inclusive environment where all voices are valued. In this session, the AEA President will discuss the organizational priorities that will guide the development of the strategic plan and AEA's 2024 conference, which will focus on mentoring those new to the field of evaluation, elevating fresh perspectives throughout the evaluation community, and creating space for experienced professionals to share and gain knowledge. Join this Evaluation Café session to discuss how AEA can foster collaboration among emerging and established professionals to create a stronger future for the evaluation field.
 
 

Outdoor photo of Tatiana Bustos
March 27, 2024
Tatiana Elisa Bustos, Ph.D.
Researcher and Instructor
RTI International, Transformative Research Unit for Equity, Social Science
 
Meaningful evaluation engagement requires trust-enhancing relational practices and collaborative decision-making that attend to power dynamics. Community-based participatory research (CBPR) principles offer insights that can benefit evaluation practice by deepening and enriching engagement strategies beyond evaluation alone (Israel et al., 1994; Springett & Wallerstein, 2008). Grounded in social equity and rooted in community psychology, CBPR principles call for engaging partners throughout an evaluation, with promising practices that emphasize equitable power-sharing and the quality of relationships (Israel et al., 1998). Evaluators are encouraged to learn about and use CBPR principles in combination with other engagement strategies to enhance the partnership process and their evaluations. This presentation will discuss CBPR principles and their connection to the challenges of program evaluation and engagement practice. This session builds on a book chapter titled "Evaluation Engagement: Historical Perspectives and New Directions with Community-Based Participatory Research (CBPR) Principles," published in The Evaluation Center's Core Concepts in Evaluation: Classic Writings and Contemporary Commentary.
 
 

April 10, 2024
Black and white photo of Jennifer Billman
Dr. Jennifer Billman
Associate Professor
HACC, Central Pennsylvania's Community College
 
Outdoor photo of Bagele Chilisa
Bagele Chilisa
Professor
University of Botswana
 
During this presentation, Bagele and Jennifer will share the origin of their writing partnership, their collaborative writing process, and how they settled on their chapter topic. They will also provide an overview of the key points addressed in the chapter and discuss the necessity of addressing funder, methodological, and pedagogical colonialism in evaluation decolonization efforts. This session builds on a book chapter titled "The Power and Politics of Knowledge Production in Program Evaluation: Funder, Methodological, and Pedagogical Colonialism," published in The Evaluation Center's Core Concepts in Evaluation: Classic Writings and Contemporary Commentary.