MIT Takes Down Popular AI Dataset Due to Racist, Misogynistic Content

Earlier this week, MIT permanently took down its 80 Million Tiny Images dataset, a popular image database used to train machine learning systems to recognize people and objects in a scene. The reason? The dataset labeled photos with racist, misogynistic, and other offensive terms.

from Gizmodo https://ift.tt/31HlclR