Creating unbiased AI systems using participatory and inclusive demographic data guidelines

January 23, 2025

The integration of AI into daily life has become inevitable. AI now permeates nearly every aspect of life, from employment and healthcare to social media and transportation. The rapid adoption of AI and the race to innovate these advanced systems put users at risk of algorithmic discrimination: systematic distortions in data and system development that produce unfair outcomes or harm for marginalized groups of people. Algorithmic discrimination disproportionately impacts marginalized communities, including Black people, Indigenous peoples, other people of color, the LGBTQIA+ community, and women. To address this problem and create fairer AI systems, developers often collect user demographic data so they can assess the impact of AI systems across different identity groups. However, collecting sensitive demographic data from users can cause further harm, especially to those already affected by algorithmic discrimination.
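
To make the idea of such a fairness assessment concrete, here is a minimal sketch of one common approach: comparing a model's positive-outcome rate across demographic groups. It is not taken from the PAI guidelines; the group labels, example data, and the 0.8 rule-of-thumb threshold are illustrative assumptions only.

# Illustrative sketch (not from the PAI guidelines): compare a model's
# positive-outcome rate across self-reported demographic groups.
# Group labels, example data, and the 0.8 threshold are assumptions.
from collections import defaultdict

def selection_rates(predictions, groups):
    """Return the share of positive predictions for each demographic group."""
    totals, positives = defaultdict(int), defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        positives[group] += int(pred == 1)
    return {g: positives[g] / totals[g] for g in totals}

def disparity_ratio(rates):
    """Ratio of the lowest to the highest group selection rate (1.0 = parity)."""
    return min(rates.values()) / max(rates.values())

# Hypothetical example data: model outputs paired with demographic labels.
preds  = [1, 1, 1, 1, 0, 1, 1, 0, 0, 0]
groups = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

rates = selection_rates(preds, groups)
print(rates)                   # {'A': 0.8, 'B': 0.4}
print(disparity_ratio(rates))  # 0.5, below a commonly cited 0.8 rule of thumb

A check like this is only meaningful if the demographic labels were collected responsibly in the first place, which is the gap the guidelines discussed below aim to address.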

To address these challenges, the Partnership on AI has developed the Participatory and Inclusive Demographic Data Guidelines. The guidelines give AI developers, teams within technology companies, and other data practitioners guidance on how to collect and use demographic data for fairness assessments in ways that advance the needs of data subjects and communities, particularly those at greatest risk of harm from algorithmic systems. They are structured around the demographic data lifecycle, identifying at each stage the key risks faced by data subjects and communities, especially marginalized groups, and the steps organizations can take to prevent those risks. Each stage sets out baseline requirements and recommended practices, along with guiding questions organizations can use to put the recommended practices into effect. An accompanying implementation workbook and case studies provide detailed guidance for developers.

Read the guidelines

The guidelines were developed in collaboration with a multi-stakeholder community. A working group of 16 experts representing technology industry, academia, civil society, and government perspectives from six countries (the US, UK, Canada, South Africa, the Netherlands, and Australia) met monthly to review drafts of each element of the guidelines. Feedback was gathered from participants through workshops and a public comment period that ran from May to December last year. Seven equity experts specializing in topics such as data justice, AI ethics in the majority world, racial justice, LGBTQ+ justice, and disability rights advised on the development of the resource.

These resources were first released for public comment in Spring 2024. During the public comment period, we received valuable feedback that led to the following changes:

  • Further emphasized intersectionality across the resources.
  • Included collective rights in the definition of data justice.
  • Strengthened the definition of accessible consent to emphasize the need to consider a variety of disabilities.
  • Expanded the definition of demographic data to include age.
  • Added context about the intended audience for the resource.

We would like to thank those who submitted comments during the public comment period, the working groups assembled to develop these resources, and the expert reviewers who provided their valuable time. We would especially like to thank Research ICT Africa and the Google Equitable AI Research Roundtable for comments that helped improve the guidelines.

The Participatory and Inclusive Demographic Data Guidelines are part of a larger multi-year workstream on demographic data and algorithmic fairness. While this work is nearing completion, the multi-stakeholder insights and ethical considerations that surfaced through the Demographic Data and Algorithmic Fairness workstream will continue to shape PAI's work as a whole, specifically on data supply chains, AI safety, and participatory and inclusive approaches to AI. We look forward to building on these learnings and to seeing AI development benefit everyone. To keep up to date with our progress in this area, please subscribe to our newsletter.
