A Platform for Understanding Fairness, Equity & Bias in AI Healthcare Products
The Problem
AI is transforming healthcare, but a growing body of research suggests that AI healthcare technologies may unintentionally amplify bias, exacerbating existing inequalities. The fairness and equity of AI applications in healthcare is a complex and nuanced topic.
Few tools exist to educate builders and users of AI tools about how bias manifests in healthcare products. HealthTech Build Studio aims to create a guided interactive platform that gives builders and users of AI healthcare products easy access to key insights from published studies on fairness and equity in predictive algorithms, generative AI, and other AI technologies.
HealthTech Build Studio was searching for a technology innovation partner to help develop a platform that democratizes these research findings through easy access to concise insights.
Light-it Foundation’s Tech Collaboration Grant
At the same time, Light-it was looking for an exceptional nonprofit in the healthcare domain driven by a passion for creating real change with technology. Light-it contributed $50,000 in services, including software development, product design, and expert consultancy, to help HealthTech Build Studio fulfill its goals.
The Solution
We developed an e-learning solution designed to provide a guided experience for users, enabling them to navigate both quantitative and qualitative analyses within each research study. This helps users quickly grasp the most important insights relevant to the AI products they are building, deploying, or using.
The platform includes an administrator module that empowers the HealthTech Build Studio team to independently create new educational modules and comprehensive content. This ensures the platform remains robust and cutting-edge by reflecting emerging research.
Our solution is structured around modules and a slide-builder that supports multimedia elements, including videos, images, activities, and text. This allows for the creation of custom learning experiences tailored to each research study.
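As a hedged illustration of how such a module-and-slide structure might be modeled, the sketch below captures the relationships described above: modules tied to research studies, slides within modules, and multimedia elements within slides. All class and field names are hypothetical, not the platform's actual schema.

```python
# Hypothetical sketch of a module/slide data model; names are
# illustrative only, not the platform's actual implementation.
from dataclasses import dataclass, field
from typing import Literal

@dataclass
class SlideElement:
    # The slide builder supports videos, images, activities, and text.
    kind: Literal["video", "image", "activity", "text"]
    content: str  # media URL, body text, or activity prompt

@dataclass
class Slide:
    title: str
    elements: list[SlideElement] = field(default_factory=list)

@dataclass
class Module:
    study_title: str  # the research study this module covers
    slides: list[Slide] = field(default_factory=list)

# Assembling a tiny custom learning experience:
module = Module(study_title="Fairness in a predictive algorithm")
module.slides.append(Slide(
    title="Label choice & ground truth",
    elements=[SlideElement("text", "Why label choice matters...")],
))
```

A nested structure like this lets each research study get its own tailored sequence of slides while the admin module stays generic.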
Sample content for one educational module includes:
• Guiding users through the evaluation of label choices, ground truth, and data limitations underlying an AI predictive algorithm in healthcare
• Leading users through the statistical metrics to assess bias in a predictive algorithm
• Exploring a framework to evaluate tradeoffs in metric choices & data limitations
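To make the second point above concrete, the sketch below computes two statistical bias metrics of the kind such a module might walk through: a demographic parity gap (difference in positive-prediction rates between groups) and a true-positive-rate gap (one component of equalized odds). The functions and toy data are illustrative assumptions, not the platform's actual content or code.

```python
# Illustrative sketch only: toy fairness metrics of the kind the
# educational module discusses. All names and data are hypothetical.

def demographic_parity_diff(y_pred, group):
    """Gap in positive-prediction rates between groups "A" and "B"."""
    def rate(g):
        preds = [p for p, gr in zip(y_pred, group) if gr == g]
        return sum(preds) / len(preds)
    return abs(rate("A") - rate("B"))

def true_positive_rate_gap(y_true, y_pred, group):
    """Equalized-odds component: TPR difference across groups."""
    def tpr(g):
        # Predictions for truly positive cases in group g.
        preds = [p for t, p, gr in zip(y_true, y_pred, group)
                 if gr == g and t == 1]
        return sum(preds) / len(preds) if preds else 0.0
    return abs(tpr("A") - tpr("B"))

# Toy example: binary predictions for patients in two groups.
y_true = [1, 0, 1, 1, 0, 1, 0, 1]
y_pred = [1, 0, 1, 0, 0, 1, 1, 1]
group  = ["A", "A", "A", "A", "B", "B", "B", "B"]

dp_gap  = demographic_parity_diff(y_pred, group)        # 0.25
tpr_gap = true_positive_rate_gap(y_true, y_pred, group)  # ~0.333
```

The tradeoff the framework explores shows up even here: a model can narrow one gap while widening the other, so metric choice matters as much as the numbers themselves.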
Light-it’s UX/UI team also designed a prototype interface featuring a future community forum, where users can engage in topic-specific interactive learning and discussion.
Tech Stack
Sneak peek of the solution
A user-friendly platform builder
Through the course builder, administrators can efficiently create each course module in an intuitive interface. The e-learning platform supports multiple modalities, such as video, text, and images, for an enhanced learning experience.
In addition to the module builder, they can create the slides that will accompany each class within the HealthTech Build platform.
User management
HealthTech Build Studio administrators can easily manage platform users, depending on whether they are members or open-source users.
Within the User Management section, administrators can both send invitations and accept or reject new member requests.
The registration process for new members is designed to be both visually appealing and intuitive.
Future concepts: HealthTech Build Studio
Our UX/UI team has already designed the interface for this upcoming functionality.
This next step of the project will incorporate forums, including live webinars and other opportunities for community engagement.
Impact and Context
• The global AI in healthcare market was valued at USD 11.2 billion in 2022 and is projected to reach USD 427.5 billion by 2032.
Source: GlobeNewswire, Oct. 2023
• President Biden issued an Executive Order directing agencies to incorporate equity principles into AI-enabled health and human services technologies and to monitor algorithmic performance for discrimination and bias.
Source: Federal Register, EO 14110, Oct. 2023.
• "We do need to have a way of judging what's good and what's not so good, which programs are better than other ones, and which ones are biased. Right now, we don't have in place very much to help".
Source: Isaac Kohane, Harvard University, and New England Journal of Medicine, Jan. 2024
• Confirming that AI healthcare products have been evaluated for bias is crucial to safeguarding the quality, safety, and efficiency of the solutions.
• A platform that provides foundational research-based knowledge will be highly valuable.