We understand that sometimes language in webinars can feel technical or a bit daunting for some audience members. We wanted to help ease this by providing a couple of definitions for key terms ahead of time so that we can all come to the webinar feeling like we’re on the same page!
Over the next few days, panelists and members of the CyberRwanda and A360 teams will be sharing some of the key terms to help prepare you for the upcoming webinar!
Adaptive implementation: Global evidence is clear: interventions benefit from continuous quality improvement during implementation to refine and strengthen them in response to changing contexts and emerging opportunities. Evidence collected in a narrow, relatively optimal set of circumstances, such as the design phase, may not apply in the same way in every implementation context. In other words, design creates an incubator for positive outcomes, but interventions can never be fully “optimized” prior to implementation in a real-world setting. Adaptive implementation is a tool that applies routine analysis of mixed-methods performance and evaluation data within a multi-disciplinary team to identify opportunities for rapid iteration, adaptation, and course correction during routine implementation. In the global community of practice, other terms such as adaptive management may also be used.
It’s also important to note that the term evaluation can apply across multiple different evaluation approaches! Drawing on the example of A360’s experience, evaluation included three separate but interconnected components:
Outcome / impact evaluation: This may be what most people think of when they hear the term ‘evaluation.’ An outcome evaluation considers how the program produces changes in the outcomes of interest within the project’s theory of change / results framework. Outcome evaluations can consider change within the specific groups reached by the project, or at the population level (often called impact evaluations).
Process evaluation: A process evaluation provides a descriptive or analytical account of how implementation has played out over the course of the project lifecycle. In A360’s case, this supported the project in understanding how and why it was effecting change, informing mid-project course corrections and learning within the global community of practice.
Cost-effectiveness analysis / evaluation: A less common, but growing, field of evaluation within public health is cost-effectiveness analysis. This type of evaluation seeks to identify the main cost drivers of project implementation and to understand what it costs the project to achieve its program outcomes. Cost-effectiveness analysis may also attempt to benchmark against other similar projects or global costing estimates for comparable service delivery approaches.
It’s important to note that HCD is not a regulated term, so definitions and processes vary. For example, I never use the six steps you outline.
- My short definition: HCD is a collaborative innovation process that generates new or better solutions while strengthening participants’ creative self-efficacy.
- Centering humans means focusing on people’s emotions, experiences, and context
- With HCD, principles and mindsets matter most: process, tools, practices follow
- The process I use most is a variation on the British Design Council’s “Double Diamond,” but I am increasingly evolving my practice to include trauma-informed approaches, as well as attention to power dynamics and diversity, equity, and inclusion