ramiza abdool


How AI can perpetuate racist/sexist imbalances

Could AI be perpetuating the imbalances within society? Yes. It can be racist and it can be sexist. But don't get angry at the tech when it's running on algorithms shaped by (mostly) outdated belief systems.

Mike Bugembe, Data Scientist, Author and founder of Lens.ai, stripped back the technical coding jargon for the COLORINTECH Leadership series, which I was fortunate to have been shortlisted to be a part of. There were so many nuggets of wisdom, but here are some thought starters I wish to share, because they need to be more widely understood and explored.

As a start, open your browser on your phone or laptop, then:

Type ‘Cute babies’ into Google. What pictures and forums did it pull up? What were your search results showing?

Type in: ‘Housewife’. What are your results? What images of women did you find?

Are the results you were shown a representation of the world you live in at present? Or is it a reflection of systemic bias?

These are the questions that any solution architect, marketing team, or anyone working with sets of data needs to factor in, in order to make “informed” or “accurate” decisions: do not assume that the machine is right. The machine is making predictions based on its inputs.

In the case of facial recognition, Mike showed us examples where the AI used for identification flagged what is called a “false positive” only when the faces were Black or from other ethnic minorities.

A false positive is when the system matches a face against a database — of criminals, in this instance — when it should not. Facial recognition is not regulated, so this continues: there are no regulatory bodies in place to weed these errors out.
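To make the idea concrete, here is a minimal sketch of how an audit might measure a false-positive rate separately for two demographic groups. The data and the group labels here are invented purely for illustration — they are not from Mike's talk; real audits use large labelled benchmark datasets.

```python
def false_positive_rate(predictions, ground_truth):
    """Fraction of people who are NOT in the database (ground truth False)
    whom the system nonetheless flagged as a match (prediction True)."""
    false_positives = sum(
        1 for pred, actual in zip(predictions, ground_truth)
        if pred and not actual
    )
    actual_negatives = sum(1 for actual in ground_truth if not actual)
    return false_positives / actual_negatives if actual_negatives else 0.0

# Invented example: the same system audited on two demographic groups.
# True = "flagged as a match to the criminal database".
group_a_pred  = [True, False, False, False, False, False, False, False]
group_a_truth = [True, False, False, False, False, False, False, False]

group_b_pred  = [True, True, False, True, False, False, False, False]
group_b_truth = [True, False, False, False, False, False, False, False]

print(false_positive_rate(group_a_pred, group_a_truth))  # 0.0
print(false_positive_rate(group_b_pred, group_b_truth))  # 2 of 7, ~0.29
```

An overall accuracy number can hide exactly this kind of disparity, which is why fairness audits break the error rates down by group rather than reporting a single figure.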

To read more about this topic — including examples of AI as a tool for progress and positive impact, and a valuable resource for leaders of the future — check out Mike’s blog linked below.

Visit Mike Bugembe of Lens.ai