According to this article in the Harvard Business Review, bias can creep into algorithms in several ways. Artificial intelligence (AI) systems learn to make decisions from training data, which can encode biased human decisions or reflect historical and social inequities, even when sensitive variables such as gender, race, or sexual orientation are removed.
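The mechanism described above can be illustrated with a minimal sketch: even after the sensitive attribute is dropped, a correlated "proxy" feature can carry the historical bias into a model's predictions. Everything here is a synthetic assumption for illustration (the column names, the 90% group/zip correlation, and the 0.6 vs. 0.3 historical hire rates are all made up):

```python
import random

random.seed(0)

# Synthetic hiring data: 'group' is the sensitive attribute (removed before
# training), 'zip_code' is a proxy correlated with it, and the historical
# label encodes a biased human decision (group B hired less often).
def make_applicant():
    group = random.choice(["A", "B"])
    # Proxy: each group clusters in a different zip code ~90% of the time.
    zip_code = 1 if (group == "A") == (random.random() < 0.9) else 0
    # Biased historical outcome, independent of merit.
    hired = random.random() < (0.6 if group == "A" else 0.3)
    return group, zip_code, hired

data = [make_applicant() for _ in range(10_000)]

# "Fair" model: drop the group column entirely and score applicants by the
# historical hire rate of their zip code alone.
def rate(rows):
    return sum(h for _, _, h in rows) / len(rows)

score = {z: rate([r for r in data if r[1] == z]) for z in (0, 1)}

# The disparity survives: mean predicted scores still differ by group,
# even though group was never an input to the model.
pred_a = sum(score[z] for g, z, _ in data if g == "A") / sum(g == "A" for g, _, _ in data)
pred_b = sum(score[z] for g, z, _ in data if g == "B") / sum(g == "B" for g, _, _ in data)
print(f"mean predicted hire score: group A={pred_a:.2f}, group B={pred_b:.2f}")
```

This is the pattern behind cases like the Amazon recruiting tool below: the model never sees the sensitive variable directly, but it reconstructs it from correlated features in the training data.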
See related story: Amazon scraps secret AI recruiting tool that showed bias against women
California's Fair Employment and Housing Council is conducting hearings to assess bias in the use of computer algorithms in employment practices. For more information, see the press release.