It's still not racism. The article itself says there is a lack of diversity in the training data. Training data will consist of 100% "obvious" pictures of skin cancers, and most of the books and online image sets I've looked into seem to show mostly fair-skinned individuals.
"...such algorithms perform worse on black people, which is not due to technical problems, but to a lack of diversity in the training data..."
Calling this racist mostly masks what a useful tool it could be for screening skin cancers.
Why is there a lack of robust training data across skin colors? Could it be that people with darker skin have less access to cutting-edge medical care and research studies? That would be pretty racist.
There is a similar bias in the medical literature around gender: many studies only consider males. That is sexist.
I never said that the data gathered over decades wasn't biased by racial prejudice, discrimination, or historical social and cultural norms. I am quite aware of those things.
But if the majority of the data at your disposal is from fair-skinned people, and that's all you have, then using it is not racist.
Would you prefer that no data be used, or that we wait until the full spectrum of people is represented in sufficient quantities, or that they just make things up?
This is what they have. They are trying to help by creating something that speeds up diagnosis for ALL people, and calling them racist for it is unfair.
The creators of this AI screening tool have no power over how the data was collected. They're not racist, and it's quite ignorant to argue that they are.
I would prefer that, as a community, we acknowledge the existence of this bias in healthcare data, acknowledge how harmful it is, and commit adequate resources to remedying the known issues.
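As one concrete example of what a remedy can look like short of collecting new data, here's a minimal sketch of per-group threshold calibration (the function name, the 90% sensitivity target, and the validation-set workflow are my own illustrative assumptions, not anything from the article): pick each subgroup's operating cutoff on held-out data so that sensitivity is equalized across groups.

```python
import numpy as np

def per_group_cutoffs(scores, labels, groups, target_sensitivity=0.90):
    """For each subgroup, pick the risk-score cutoff whose held-out
    sensitivity (recall on malignant cases, labels == 1) hits the target.

    Flagging everything at or above the (1 - target) quantile of that
    group's malignant scores catches ~target of its malignant cases.
    """
    cutoffs = {}
    for g in np.unique(groups):
        malignant_scores = scores[(groups == g) & (labels == 1)]
        cutoffs[g] = np.quantile(malignant_scores, 1 - target_sensitivity)
    return cutoffs

# Hypothetical usage with a model's validation-set scores and skin-tone
# labels (val_scores, val_labels, val_skin_tone are placeholders):
# cutoffs = per_group_cutoffs(val_scores, val_labels, val_skin_tone)
# flag = val_scores >= np.vectorize(cutoffs.get)(val_skin_tone)
```

Equalizing sensitivity this way doesn't fix the underlying data gap — specificity will still be worse for the under-represented group — but it makes the trade-off explicit instead of leaving it hidden in a single global threshold.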
There is a more specific term for it: institutional racism.