Principle 5: Integrate primary and secondary learnings and evidence

Please use this thread to discuss your experiences, challenges, and successes with Principle 5 in your current work, and how you hope to use it in your future work! How can we work and advocate to ensure we are achieving this principle in our programs?

Explanation: Project teams should use secondary evidence and consult with technical experts throughout the project life cycle to advance the sector's understanding of an existing challenge. Using evidence enhances design efficacy and ensures that lessons learned from the field of public health are acknowledged.

We had a few questions come in from community members on Principle 5 during the launch event:

  1. What does it mean to have a good, high quality insight?
  2. How do you reconcile research methods bias given that engagement on the ground is qualitative in nature, yet the field is skewed towards quantitative methods?

Q1: What does it mean to have a good, high quality insight?

Dr. Tracy Johnson from the Bill & Melinda Gates Foundation said:

I think there are a number of things to consider, and I'll share these as offerings. One of the things I see the design community struggling with is not dealing with the formative research: not really immersing yourself in it and understanding what is known, what is known a little bit, and what is not known at all. And so how do we really take what is known and build on that?

Part of the reason we may not be hitting the mark there is that we don't spend enough time actually doing analysis; we move very quickly from design research to synthesis because we want to get to creating the solutions. But given the way public health is structured compared with the private sector, with less chance to iterate on solutions once they're done, it behooves us to spend more time in that analysis process and really come out with higher-quality insights.

There's a lot of work out there that talks about what an insight is and what it isn't. One of the things I like to say about insights is that you should never strive for the thing that blows your mind, because if it seems drastically new, it's probably not quite right. An insight is almost a little bit like a journey to the obvious. Instead of "Oh my gosh!" it should be more like, "Oh, you're right. Wow, I hadn't thought of it that way." So there's always going to be a little piece of what we already know, and the insight is going to build on that.

Insights are inherently actionable. You know you have a high-quality insight when it points you towards the solution you should be thinking about.

And the last thing I would say, and this speaks to the iterative process of design: what if we stop thinking about insights as products, insight as noun, and shift our thinking to insight as verb, insight as action? Insights are our compass, but also the thread that loops through the phases of the design process. What that means is that the insight we start a project with is going to, and should, evolve over time, because in our prototyping sessions we're making but we're also continuing to learn. How do we remain open to that learning? And how do we bring those insights on the journey with us?

I invite people to look at this work again. Maybe that's something we should also think about within the community: how we can put out more help and guidance on that.

Q2: How do you reconcile research methods bias given that engagement on the ground is qualitative in nature, yet the field is skewed towards quantitative methods?

Dr. Sandra McCoy from University of California, Berkeley said:

We often need to provide very rigorous levels of evidence to ensure that programmes are funded, can be scaled, and can continue. So I think that qualitative research has a really, really important part to play in evaluation. We now recognise that mixed methods is just how we do business; it's not a special thing anymore. It's the gold standard of how you do evaluation.

The quant side tells us whether something works: yes or no? The qual side tells us why or why not, and how to unpack our results. They're two essential sides of the coin. So I always pause; I never want to give one side more importance than the other, because they each give us different pieces of information, and they're so complementary in terms of evaluation.

But I think there are moments where it is appropriate and important to collect quantitative data to evaluate whether what we're doing works. And there are moments, maybe early in the design process, where insights gained from small qualitative data collection efforts are incredibly important. So I think there's a role for all of these techniques all along the chain. They all work in tandem, and they're quite complementary.