Catherine Jonell says it is not only the algorithms that are to blame, because they are still programmed and fed with information by humans. We train them using machine learning, so whether it is deep learning with a neural network or classical machine learning, we are feeding the system data and asking it to form an opinion. And of course, when that opinion is based on data that contains unfair treatment of groups, that unfairness is very much learned.

Translation software is one example: millions of texts teach it which wording is most appropriate, so whatever associations appear in those texts, it picks up. She reminds us that picture recognition algorithms learn in the same way from large picture databases, where you can find many pictures of women in a kitchen but far fewer of men, so the algorithm learns to associate women with kitchens. Another problem is how the algorithms reinforce what they have already learned, to the point that one even identified this man as a woman.

Changing this is difficult and time-consuming. One important thing is to know all of the sources of your data, and to be very aware of that if you are going to collect a lot of data.
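To make the mechanism concrete, here is a minimal sketch, not taken from the interview, of how skewed training data can produce exactly the kind of misclassification described above. The toy caption dataset, the feature words, and the crude count-based (Naive Bayes-style) scoring are all illustrative assumptions: a model trained on captions where kitchen scenes are mostly labelled "woman" ends up labelling a man cooking at a stove as a woman.

from collections import Counter

# Toy dataset of (scene words, label) pairs. The imbalance is deliberate:
# kitchen scenes are mostly labelled "woman", mirroring the skew found in
# real picture databases.
captions = [
    (["kitchen", "cooking", "stove"], "woman"),
    (["kitchen", "sink", "dishes"],   "woman"),
    (["kitchen", "cooking", "apron"], "woman"),
    (["garage", "tools", "car"],      "man"),
    (["office", "desk", "laptop"],    "man"),
    (["kitchen", "cooking", "grill"], "man"),
]

# Count how often each scene word co-occurs with each label.
word_label_counts = {"woman": Counter(), "man": Counter()}
label_counts = Counter()
for words, label in captions:
    label_counts[label] += 1
    word_label_counts[label].update(words)

def predict(words):
    """Score each label by how strongly the scene words co-occur with it
    (crude add-one smoothing keeps unseen words from zeroing out a label)."""
    scores = {}
    for label in label_counts:
        total = sum(word_label_counts[label].values())
        scores[label] = label_counts[label]
        for w in words:
            scores[label] *= (word_label_counts[label][w] + 1) / (total + 1)
    return max(scores, key=scores.get)

# A man cooking at a stove: the skewed co-occurrence counts push the
# prediction toward "woman", as in the example from the interview.
print(predict(["kitchen", "cooking", "stove"]))  # prints: woman

The point of the sketch is that nothing in the code is "prejudiced"; the biased output comes entirely from the composition of the training data, which is why knowing the sources of your data matters so much.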