Some of the biases come from the data, but most result from who writes the algorithms, which in WEIRD countries unfortunately tends to be white males. In China, it tends to be Chinese males. So what needs to be sorted out is who writes the algorithms and how the data used for training is screened.
