Hi, my name is Valentine J Gandhi and I am the Senior Researcher, Evaluator and ICT4D Expert at The Development Café.
In development agencies across the globe, from donors to implementers, there is discussion about emerging technologies such as artificial intelligence (AI), blockchain, big data, and the Internet of Things (IoT). Given the hype, evaluators have a responsibility to ensure these technologies are harnessed to their full potential by helping to tune technology experts into the realities practitioners face in the field. Take, for example, the ‘trust-less’, ‘inclusive’, and permanent decentralized database commonly known as the blockchain. People treat blockchain as a “futuristic integrity wand”: wave a blockchain at the problem, and suddenly your data will be valid. For almost anything people want to be valid, blockchain has been proposed as a solution, yet blockchain-based trustworthiness can fall apart in practice.
It’s true that tampering with data stored on a blockchain is hard, but it’s false that blockchain is a failsafe way to create data integrity. In a recent interview on The Daily Show, whistle-blower Edward Snowden observed that data protection isn’t about what happens after data is collected, but about how it is collected. The same applies not only to blockchain but also to AI, big data, and the IoT. Human development interventions are not always straightforward, which is why the application of emerging technology should be guided by human development experts and evaluators (especially qualitative ones), not purely by tech geeks.
The Principles for Digital Development are a great starting point for this work. While some principles may be more relevant for a particular intervention, focusing on some at the expense of others can be detrimental. Take, for example, the design of a land records system that uses blockchain in rural India: if we Design with the User but fail to Understand the Existing Ecosystem by analyzing dynamics of oppression and systemic inequalities, then our intervention risks amplifying existing inequalities. This is why ground-truthing is important. A user-centred design may look at how and why a product will be used, but not necessarily how it can be abused.
Lessons Learned:
A key consideration for evaluators, therefore, is looking beyond equity, equality, and inclusion to factors of social justice: not just social welfare, but the history of oppression in a given context. A social justice lens that is aware of oppression and of historical and structural inequities is useful when evaluating the benefits of technology within development contexts.
Hot Tips:
- Consider using a checklist of indicators that looks beyond traditional bias or exclusion.
- Seek to understand the technologies themselves, what they are capable of, and the impact they can have in the given evaluation context.
- Ask: “Who designed the technology, and who was it designed for?”
By following these suggestions, evaluators prepare practitioners to thrive in the Fourth Industrial Revolution through what is known as ‘triple-loop learning’: learning about how we learn, so we can focus on transforming lives with technology and giving voice to the voiceless.
The American Evaluation Association is celebrating Integrating Technology into Evaluation TIG Week with our colleagues in the Integrating Technology into Evaluation Topical Interest Group. The contributions all this week to aea365 come from ITE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Excellent – many great takeaways about how to operationalize vague phrases like “social justice.”
My favorite: “…may look at how and why a product will be used, but not necessarily how it can be abused.”