The posts for this week come from the Digital Data & Technology AEA Conference Working Group and share how digital data and technology are reshaping evaluation.
Hi, my name is Nicola Harford, and I am Principal Consultant and Managing Director of iMedia Associates, a consulting company working with donors, non-governmental organizations (NGOs) and local media houses in the Global South on social and behaviour change (SBC) interventions. iMedia started out in broadcast and print media—especially using radio and entertainment-education approaches for change—but with the advent of mobile phones and expanding internet access across the globe our clients are increasingly using digital channels and platforms including Instagram, WhatsApp, YouTube, Facebook, and TikTok.
Currently, iMedia is investigating monitoring, evaluation, research, and learning (MERL) for digital SBC communications, mostly around reproductive health. Our aim is to understand what approaches and methods work, what the challenges are, and how to generate data that can both inform programme design and demonstrate impact. We think the lessons we’ve learned can help the wider MERL sector.
Lessons Learned
Firstly, the range of MERL goals addressed by implementing organisations is expanding in complexity. While monitoring and measuring reach and evaluating intervention impact remain key, social listening and the monitoring of audience engagement and satisfaction are becoming increasingly important, as is a more sophisticated understanding of audiences and the pathways that drive them to engage for change. In turn, this is enabling organisations to shift their implementation strategies and content more effectively and continuously to meet the needs of their audiences, and to adapt to shifts in context.
Secondly, the lines between conventional monitoring and evaluation (M&E) functions and formative research and wider learning activities are ever more blurred, not least because digital platforms and channels are themselves tools for understanding audiences and measuring impact. Data can be collected in real or near-real time as audiences engage with online content. This is where ‘MERL tech’ – the use of digital data and information technologies for monitoring, evaluation, research, and learning – and ‘MERL of tech’ – conducting monitoring, evaluation, research, and learning for digital-based interventions – converge. Leveraging this potential requires methodological versatility and collaboration between programme implementers and those tasked with M&E.
Related to this point, digital measurement tools are playing catch-up with the platforms themselves – constant technological and skills upgrades are needed to keep pace. Yet, however adept, rapid, and automated the tools and processes for gathering and analysing data become – think machine learning and artificial intelligence like that used to generate the art for these blog posts – human input is still needed for modelling, interpretation, and contextual understanding. And good old-fashioned person-to-person qualitative interactions remain essential to ground-truth digital data and to understand how people are engaging in digital spaces.
We’ve also examined the multiple challenges of doing digital MERL. Remote and online data collection poses many of the same risks faced in reaching audiences with online content, relating to privacy, ethics, and safeguarding. Reputational risk arises when digital data collection is extractive and pushes an external agenda, and lack of (consistent) access to devices among the poorest and most marginalised can introduce bias and elite capture. And not only are digital survey samples self-selecting, but digital and analogue audiences are often very different.
At the same time, digital is opening up a world of opportunity for change communications, at a scale and pace not possible in analogue interventions, bringing actionable and positive content to growing audiences. Rigorous, creative, and flexible approaches to (digital) MERL will play a large part in ensuring that this promise is fulfilled.
Rad Resources
Getting it on(line): learning about digital communication for social and behaviour change.
The American Evaluation Association is hosting Digital Data & Technology Week with our colleagues in AEA’s Digital Data & Technology Working Group. The contributions all this week to AEA365 come from working group members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.