The integration of AI into daily life has become inevitable. AI now permeates nearly every aspect of life, from employment and healthcare to social media and transportation. The rapid adoption of AI and the race to innovate these advanced systems put users at risk of algorithmic discrimination: systematic distortions in data and system development that result in unfair outcomes or harm to marginalized groups of people. Algorithmic discrimination disproportionately impacts marginalized communities, including Black people, Indigenous peoples, other people of color, the LGBTQIA+ community, and women. To address this issue and build fairer AI systems, developers often collect user demographic data to assess the impact of AI systems across different identity groups. However, collecting sensitive demographic data from users can itself cause further harm, especially to those already affected by algorithmic discrimination.
To address these challenges, the Partnership on AI has developed the Participatory and Inclusive Demographic Data Guidelines. The guidelines show AI developers, teams within technology companies, and other data practitioners how to collect and use demographic data for fairness assessments in ways that advance the needs of data subjects and communities, particularly those at greatest risk of harm from algorithmic systems. The guidelines are structured around the demographic data lifecycle, identifying at each stage the key risks faced by data subjects and communities, particularly marginalized groups, and the steps organizations can take to prevent those risks. For each stage, they set out baseline requirements, recommended practices, and guiding questions organizations can use to put the recommended practices into action. An accompanying implementation workbook and case studies provide detailed guidance for developers.
Read the guidelines
The guidelines were developed in collaboration with a multi-stakeholder community. A working group of 16 experts representing technology industry, academic, civil society, and government perspectives from six countries (the US, UK, Canada, South Africa, the Netherlands, and Australia) met monthly to draft and review each element of the guidelines. Feedback was also gathered from participants through workshops and a public comment period that ran from May to December last year. Seven equity experts specializing in topics such as data justice, AI ethics in the majority world, racial justice, LGBTQ+ justice, and disability rights advised on the development of the resource.
These resources were first released for public comment in Spring 2024. During the public comment period, we received valuable feedback that led to the following changes.
We further emphasized intersectionality across the resources. We included collective rights in the definition of data justice. We strengthened the definition of accessible consent to emphasize the need to consider a variety of disabilities. We expanded the definition of demographic data to include age. We added context about the intended audience for this resource.
We would like to thank those who submitted comments during the public comment period, the working groups assembled to develop these resources, and the expert reviewers who provided their valuable time. We would especially like to thank Research ICT Africa and the Google Equitable AI Research Roundtable for comments that helped improve the guidelines.
The Participatory and Inclusive Demographic Data Guidelines are part of a larger multi-year workstream on demographic data and algorithmic fairness. While this work is nearing completion, the multi-stakeholder insights and ethical considerations surfaced through the Demographic Data and Algorithmic Fairness workstream will continue to inform PAI's work as a whole, particularly on data supply chains, AI safety, and participatory and inclusive approaches to AI. We look forward to building on these learnings and seeing AI development benefit everyone. To keep up to date with our progress in this area, please subscribe to our newsletter.