
NA TIG Week: Needs and Artificial Intelligence by Ryan Watkins

I am Ryan Watkins, a professor at George Washington University, where I do research on and teach about needs, needs assessments, and how people collaborate in making decisions with increasingly ‘intelligent’ technologies.

Making judgments about what actions to take (or what actions to recommend that others take) routinely requires complex considerations about the desired and undesired consequences of those actions. These considerations are most commonly derived from a combination of (a) the goals we wish to accomplish but have not yet achieved and (b) which actions are optimal for achieving those goals. Evaluators often refer to the processes associated with these decisions as Needs Assessment. We define the “gaps” between desired and current results as ‘needs’, while the activities to close those gaps can be considered possible ‘satisfiers’.
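
To make the “gap” framing concrete, here is a minimal sketch in Python. It is my own illustration rather than anything from the post or the needs assessment literature, and the names and numbers are hypothetical: a need is simply the measured distance between a desired result and a current result, with candidate satisfiers attached.

```python
# Hypothetical sketch only: a "need" as the gap between a desired result and
# the current result, with candidate "satisfiers" as activities that could
# close that gap. Names and numbers are invented for illustration.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Need:
    result: str        # the result of interest, e.g., a literacy rate
    desired: float     # the desired level of that result
    current: float     # the currently measured level
    satisfiers: List[str] = field(default_factory=list)  # candidate activities

    @property
    def gap(self) -> float:
        """The need itself: the distance between desired and current results."""
        return self.desired - self.current


literacy = Need(result="adult literacy rate", desired=0.95, current=0.78,
                satisfiers=["evening classes", "mobile tutoring"])
print(round(literacy.gap, 2))  # 0.17 -- the measured need, before choosing any satisfier
```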

Along with the rest of the world, needs assessment (as a field) is evolving in parallel with the data-driven technologies that support and guide more and more of our decisions. Within this context, I would like to suggest that the construct of needs, and the processes for assessing needs, can and should be increasingly incorporated into the design and implementation of the advanced technologies grouped under the general label of “artificial intelligence” (AI). Since many of the advances in AI over the last twenty years have come from improvements in machine learning, the technology has increasingly focused on predictive algorithms — which inform only one part of the judgments that people make (and that machines increasingly help guide) on a daily basis. Prediction alone, however, is not enough. The concept of needs (which incorporates the concepts of necessity and sufficiency) is also extensively used in human decision making, along with the closely related concept of wants, though neither is yet part of machine ‘intelligence’.

The more formalized integration of needs, I propose, can be valuable at multiple stages of AI design, development, and implementation. From initial human-centric decisions about what needs a proposed AI system might address, to complementing predictive algorithms when prioritizing the options an AI system might recommend to human users, the fundamental construct of needs, and the underlying measures of need from the needs assessment literature, can help improve the quality of AI-aided decisions in the future.
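
As one hypothetical illustration of what complementing predictive algorithms with needs could look like, the sketch below ranks options by a predicted effect weighted by the size of the need each option addresses, rather than by prediction alone. The weighting rule, option names, and numbers are invented for this example and are not a method proposed in the post.

```python
# Hypothetical sketch: combining a predictive score with an assessed need
# when prioritizing options an AI system might recommend. The weighting rule
# and data are invented here for illustration only.

def prioritize(options):
    """Rank options by predicted effect weighted by the size of the need (gap)
    each option addresses, rather than by the prediction alone."""
    return sorted(options, key=lambda o: o["predicted_effect"] * o["need_gap"],
                  reverse=True)


options = [
    {"name": "A", "predicted_effect": 0.9, "need_gap": 0.05},  # easy win, small need
    {"name": "B", "predicted_effect": 0.6, "need_gap": 0.40},  # harder, but larger need
]

for option in prioritize(options):
    print(option["name"])  # prints "B" then "A": the larger need outweighs the lower prediction
```

The only point here is that a measured need can enter the prioritization explicitly, instead of leaving the ranking to the predictive score by itself.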

Why should evaluators care? I believe that evaluators and our experiences with applying needs assessment from our unique perspective can (and should) be more than just consumers of AI technologies — we can be contributors too. Our foundational constructs and time-tested processes can help inform the development of the ‘intelligent’ tools that people increasingly rely on for support in their decision-making.

Rad Resources:

Article on needs in the development of Augmented Reality (AR): A Needs-Based Augmented Reality System

The American Evaluation Association is hosting Needs Assessment (NA) TIG Week with our colleagues in the Needs Assessment Topical Interest Group. The contributions all this week to aea365 come from our NA TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.

3 thoughts on “NA TIG Week: Needs and Artificial Intelligence by Ryan Watkins”

  1. Hi Ryan!

    I thoroughly enjoyed your article!

    My name is Amanda and I am completing a Professional Master of Education at Queen’s University in Kingston, Ontario. I am currently in a course all about program assessment and found the topic of AI and needs assessment fascinating. In particular, I agree with your final stance “that evaluators and [their] experiences with applying needs assessment from [their] unique perspective can (and should) be more than just consumers of AI technologies — [they] can be contributors too”. Previously, when I have been involved in discussions on AI, the main viewpoint shared has been fear of the technology rather than improvement of it. That, of course, is foolish and never arrives at the question, “What can I do to contribute to this innovation?”. Furthermore, I had never heard needs defined in relation to necessity and sufficiency; what a great way to understand it!

    Overall, thank you for a great read and all your contributions to the evaluation community!
    Amanda

    1. Thanks for the kind comments, and for your patience since it took a while before I got back to your comment. Since this short piece, I have been working with a colleague to write a longer article on needs-aware AI, and that has been submitted for publication. Once we have pre-prints available, I will try to remember to post them here — or you can find them on http://www.NeedsAssessment.org.
      Thanks, Ryan

  2. Hello Ryan Watkins,
    I’m taking a graduate course on evaluation at Queen’s.
    Thank you for introducing the connection between needs and artificial intelligence in assessment. It was a very interesting read and gave me a glimpse of how AI might fit into assessment. AI can help with the design, development, and implementation of needs in evaluation, since its predictive algorithms can inform different judgements. It can also help evaluators become contributors, as our experiences and perspectives can be added into AI technologies.

    I agree that a component of AI in evaluation and thinking would be great, as it can provide a different perspective on the issue. Sometimes it is nice to have technology carry out the evaluation, as it can be fair and more systematic. However, at times it lacks an understanding of each student’s background and situation. I think AI’s algorithms would provide different input and predictions that could be useful in assessment. Are there specific assessments you would recommend for AI to use? I do see AI-aided decisions in the future. Furthermore, I believe AI can help advance human decisions and predictions, as well as influence what we need to focus on and what the needs are, much as assessment does. AI can provide more reliable support in human decision-making.
