
Coded Bias Documentary Podcast

Algorithms can be unfair to certain groups of people because they learn from data that carries bias. In the documentary, researcher Joy Buolamwini at the MIT Media Lab tested face-recognition software that failed to detect her darker-skinned face until she put on a white mask. This shows that the technology was built and tested by a narrow group of people and does not work equally well for everyone.


Algorithms can also make harmful mistakes, such as wrongly stopping or accusing innocent people, and these errors fall hardest on people of color. This happens because the data used to train the systems reflects old, unfair assumptions about race and gender.
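To see how this happens in practice, here is a minimal sketch (not from the film, and not real face data): it trains one classifier on synthetic points where "Group A" vastly outnumbers "Group B", then measures accuracy separately for each group. All group names, numbers, and features are illustrative assumptions.

```python
# Minimal sketch: an imbalanced training set produces uneven
# error rates across groups. Synthetic 2-D points stand in for
# face data; nothing here comes from the documentary itself.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)

def make_group(n, shift):
    """Generate a synthetic 'group': points around a group-specific
    center, with a label boundary that also depends on the shift."""
    X = rng.normal(loc=shift, scale=1.0, size=(n, 2))
    y = (X[:, 0] + 0.5 * X[:, 1] > 1.5 * shift).astype(int)
    return X, y

# Group A dominates the training data (950 vs 50 examples),
# echoing how the datasets in the film skewed toward lighter skin.
Xa, ya = make_group(950, shift=0.0)
Xb, yb = make_group(50, shift=3.0)
X_train = np.vstack([Xa, Xb])
y_train = np.concatenate([ya, yb])

model = LogisticRegression().fit(X_train, y_train)

# Evaluate on balanced, unseen samples from each group.
Xa_test, ya_test = make_group(1000, shift=0.0)
Xb_test, yb_test = make_group(1000, shift=3.0)
print("Group A accuracy:", accuracy_score(ya_test, model.predict(Xa_test)))
print("Group B accuracy:", accuracy_score(yb_test, model.predict(Xb_test)))
```

Run as-is, the model typically scores high on the majority group and close to chance on the minority group, even though nothing in the code singles anyone out: the skew in the training data alone produces the unequal results.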


