Fertik describes Social Intelligence as "not that interesting a tool," since it relies on human analysts reviewing what's been dug up on the Internet to verify, say, that the guy in the Ku Klux Klan outfit really is the same guy being considered for V.P. of Human Resources.
Fertik calls Social Intelligence's searches "just a preview of what's to come, when all this will be done by machine. Then it will get a lot more dangerous"--from the candidate's point of view.
Background checks of tomorrow, he predicts, will use far more sophisticated technology—facial recognition software, for example, that would obviate the need for a human being to vet the Klansman photo. The person being scrutinized will never know his data spoor is being tracked, nor will he know which doors opened for him and which closed because of his social network postings. Consumers, he says, will gravitate to services like Reputation.com because merely changing one's Facebook settings to their highest privacy level won't be sufficient to provide the protection they'll need.
The founder and CEO of Social Intelligence, Max Drucker, maintains that his company's use of human analysts is a strong plus: at present, people are more adept than software at taking into account such nuances as the context of a social media posting. If the company's proprietary software finds an incriminating image, say, the human reviewer may decide it's harmless because of some contextual explanation. For example: "Look at this embarrassing costume my boss insisted I wear to the office Halloween party."
The report given to an employer includes only the kind of information an employer legally can use to evaluate a candidate, as determined by the Equal Employment Opportunity Commission (EEOC). "We redact anything that isn't permitted," explains Drucker. Nor does the search contain only negatives. It also provides positive criteria: any charitable or volunteer work the candidate may have done, any awards they may have won, or anything else that suggests they have displayed leadership in their field.
Employers, he points out, face a dilemma. If they conduct their own search of a candidate by, say, Googling his or her name, they expose themselves to one kind of liability: the possibility of seeing information that legally cannot be used in the hiring decision. If they see, for instance, that the candidate is Catholic and then don't hire him, they open the door to a discrimination suit on the grounds of religion. If, however, they fail to perform a background check and hire somebody with homicidal tendencies who one day kills his co-workers, they're liable for their failure to have performed due diligence.
Why do people post stuff that they must know will only get them into trouble some day? "Why do people get tattoos of swastikas or drive drunk? No one knows why they make stupid decisions. They just do. It's like wearing a T-shirt to a job interview: You're creating a public persona that you're presenting to the world."
And that world, more than ever, includes potential employers.