Amazon has scrapped an automated recruiting tool after reportedly discovering that it didn’t like female candidates. The company had tried to mechanize its talent searches by creating a computer program that gave applicants scores ranging from one to five stars, much like Amazon rates its products. However, its models were trained to vet applicants by observing patterns in résumés submitted to the company over a 10-year period—and because most of those came from men, the system effectively taught itself that male candidates were preferable to female ones. It reportedly penalized résumés that included the word “women’s” and downgraded candidates from two unspecified women’s colleges. The company abandoned the project by the start of last year. “This was never used by Amazon recruiters to evaluate candidates,” an Amazon spokesperson said in a statement.