Jun 27, 2023
Some of the biases come from the data, but most of them stem from who's writing the algorithms, which in WEIRD countries unfortunately tends to be white males. In China, it tends to be Chinese males. So what needs to be sorted out is who's writing the algorithms and how the training data is screened.