We count on machine learning systems for everything from creating playlists to driving cars, but like any tool, they can be bent toward dangerous and unethical purposes as well. Today’s illustration of this fact is a new paper from Stanford researchers, who have created a machine learning system that they claim can tell from a few pictures whether a person is gay or straight.

The research is as surprising as it is disconcerting. In addition to exposing an already vulnerable population to a new form of systematized abuse, it strikes directly at the egalitarian notion that we can’t (and shouldn’t) judge a person by their appearance, nor guess at something as private as sexual orientation from something as simple as a snapshot or two. But the accuracy of the system reported in the paper seems to leave no room for mistake: this is not only possible, it has been achieved.