We check in with Alan Hewitt, who has been working on a lot of issues with harassment, taking a look at AI, bias, and the criminal justice system. She has been pretty well known amongst people who are championing fairness in machine learning, but she became more well known last December when she wrote a Medium post about some of the harassment she has experienced in her career.

Jason: At first blush, people might wonder how a machine can be biased. It goes back to the data sets that people use to build these predictive models that they then apply to real-life settings. For example, they read a paper published by a predictive policing company, and this company makes a product that helps police out on the beat figure out which areas in a city are most likely to have crime. It's not quite like Minority Report, but it is similar. It says, maybe you should check out this intersection at this time of day. All of that output is based off of police records that the police department has been gathering. So she and her coworkers looked at two things: they looked at arrest records related to drug crimes in Oakland, and compared them to
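To make that concrete, here is a minimal toy sketch in Python. Everything in it (the neighborhood names, the rates, and helpers like predict_hotspots and simulate_day) is made up for illustration; it is not the company's actual product or the study's method. It just shows how a model that ranks areas by past recorded arrests keeps pointing patrols at the places that already have the most records, so any historical skew in those records carries straight through to the "predictions."

```python
# Toy sketch: a hotspot "predictor" trained only on recorded arrests.
# Assumed setup: a few city cells with the same underlying crime rate,
# but very different amounts of historical policing.
import random
from collections import Counter

random.seed(0)

cells = ["downtown", "westside", "eastside", "hills"]
true_rate = {c: 0.10 for c in cells}  # same true rate everywhere (assumption)

# Historical records skewed toward the neighborhoods police already patrol.
arrests = Counter({"downtown": 40, "westside": 35, "eastside": 5, "hills": 5})

def predict_hotspots(records, k=2):
    """Rank cells by past recorded arrests and return the top-k 'hotspots'."""
    return [cell for cell, _ in records.most_common(k)]

def simulate_day(records, patrols_per_day=20):
    """Send patrols to the predicted hotspots; each patrol may record an arrest."""
    hotspots = predict_hotspots(records)
    for _ in range(patrols_per_day):
        cell = random.choice(hotspots)          # patrols go where the model points
        if random.random() < true_rate[cell]:   # arrests are only recorded where police look
            records[cell] += 1

for _ in range(100):
    simulate_day(arrests)

print(predict_hotspots(arrests))  # still the historically over-patrolled cells
print(arrests)                    # eastside/hills barely gain records despite equal true rates
```

In this toy run, the model never "learns" anything about the under-patrolled neighborhoods, because no patrols go there to generate records, which is the kind of data-driven skew the conversation is describing.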