Monday March 21, 2016
A Closer Look: Can culture fit and personality be accurately determined by analyzing a person’s digital footprint?
Several companies scour the Web looking for the digital trails users leave behind. It makes sense (only 15% of people in the US don’t bother with the Internet). Careersunbound is one of those companies. They collect data from various social media platforms and reformat their findings into a detailed report about a person – personality, culture fit, profiling stuff.
The belief is that these e-footprints provide a truer snapshot of an individual since they were left directly by the source (i.e. the candidate). It’s who they are and what they represent based on likes, dislikes, preferences and words – as compared to traditional personality and job-fit tests, which we all know can be manipulated. Think back to the last time you “thumbs-upped” a controversial column or made a snarky comment on a post or a friend’s wall. Depending on what you’ve written, what you’ve liked and the company’s benchmarks, you either make it through their filter or you don’t. Which raises the question: how reliable is this approach, and is it valid?
So I tried one - sort of. More like a personality test through the analysis of writing – coincidentally (or not) powered by IBM Watson. From it, you also get a detailed report showing where you fall on the spectrum of several traits. It requires a minimum of 100 words to pull results, and for an accurate analysis it recommends 3,500 to 6,000+. Put another way, for statistically significant results, you need over twenty pages of written text. No need to panic, I’m talking double-spaced here, people.
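Those thresholds are easy to sanity-check. Here’s a minimal sketch in Python (purely illustrative – not the actual Watson service or its API) that gates a writing sample by the stated word counts and converts the recommended maximum into double-spaced pages:

```python
# Illustrative thresholds taken from the tool's stated requirements.
MIN_WORDS = 100           # minimum needed to return any result at all
RECOMMENDED_MIN = 3500    # lower bound of the recommended range
RECOMMENDED_MAX = 6000    # upper bound of the recommended range
WORDS_PER_PAGE = 275      # rough average for a double-spaced page

def analysis_strength(text: str) -> str:
    """Classify a sample the way the tool's word-count gate seems to."""
    n = len(text.split())
    if n < MIN_WORDS:
        return "insufficient"
    return "strong" if n >= RECOMMENDED_MIN else "weak"

# How many double-spaced pages the top of the recommendation implies:
pages = RECOMMENDED_MAX / WORDS_PER_PAGE  # a bit under 22 pages
```

At roughly 275 words per double-spaced page, 6,000 words works out to just over twenty pages, which matches the back-of-the-envelope claim above.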
I ran my resume, a cover letter and a blog post, all with approximately the same word count (400), through this wringer. In so many words, it said I was a lunatic. Three samples yielded three different results. My resume had me pegged as inner-directed, restrained and strict, with a 1% on the ‘helping others’ scale and the same score on the introversion/extroversion scale. I wouldn’t hire myself either.
Since all three were scored as a “weak analysis,” I went bigger. I copied and pasted three substantial works exceeding the recommended word count, and the same thing happened - as I expected. I was classified as genial and agreeable in one; shrewd, insensitive and tranquil in another; and in the third, heartfelt and confident. Different topics generated different results. Writing porn is not the same as writing about the Krebs cycle of fungi for the New England Journal of Medicine. The one thing all three works had in common was their “strong analysis” score, which made me wonder how accurately a machine can analyze a person’s personality through their writing... or maybe the problem is me…
Either way, that’s not the only issue. What if the content being analyzed is not even yours?
To be concluded…