Detect signals of hireability
How often do we update our resumes the moment we start considering a job change? Almost never, because:
- we might not want our employer to know that too early;
- we might try a couple of interviews, fail, and reconsider;
- we might discuss our concerns with a manager and get promoted instead;
- we might decide to delay the process for… reasons;
- we might change our minds, again for reasons.
Recruiters hunt for "green badge" specialists, ignoring the fact that this signal is far from perfect: it never appears on some profiles (people get hired without ever turning it on), or it appears too late. By chasing those badges, you doom yourself to working with people who failed to find a job immediately: perhaps their skills are not in great demand at the moment, or their expectations are too high.
The point is that we miss great opportunities by working only with the "explicitly hireable". But sacrificing efficiency by reaching out to passive talent does not sound like a great alternative either. Is there another option?
Yes. DevScanr's team is building a system to detect signals of hireability. The system discovers talent who are considering a job change, sometimes months before it becomes public knowledge. We have analyzed a lot of data to confirm that such changes are indeed predictable, and we share examples and conclusions in this article.
About DevScanr’s Research
The table below (and the other tables further down the page) shares the same format: each row describes one person's profile. All personal data is removed; we operate with just titles, locations, and the starting years of tech careers.
^ this is a tiny sample of the data collected and processed by DevScanr, the smallest volume needed to explain the point. A somewhat larger table is provided further down.
The Signal A, Signal B, and Signal C columns represent signals that DevScanr detected on the corresponding date, or post factum. What exactly these signals are and how we infer them is intellectual property that won't be revealed. For now, assume that on a specific date (or on consecutive dates) a person performed a certain action (or a set of actions) that was captured as a signal. So far we are experimenting with two raw signals (A and C) and one synthetic signal (B) derived from them.
The hypothesis is that detecting such a signal means the person is considering a job change with increased probability. We can treat this hypothesis as confirmed if there is a positive correlation between detected signals and any definite outcome (other than "No change"). And vice versa: no correlation means the hypothesis is likely false.
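For readers who want to see the arithmetic, here is a minimal sketch of how such a correlation check could look. The counts, and the grouping into "signal detected" vs. "no signal", are invented for illustration and are not DevScanr's real numbers.

```python
# Hypothetical check: is a detected signal independent of a definite outcome?
# The counts below are illustrative placeholders, NOT DevScanr's real data.
from scipy.stats import chi2_contingency

# Rows: signal detected / no signal detected
# Columns: definite outcome (change, quit, promotion) / "No change"
contingency = [
    [45, 15],   # signal detected
    [30, 110],  # no signal detected
]

chi2, p_value, dof, expected = chi2_contingency(contingency)
print(f"chi2 = {chi2:.2f}, p-value = {p_value:.4f}")
# A small p-value suggests signals and outcomes are not independent,
# i.e. the positive correlation is unlikely to be pure chance.
```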
These signals were collected over the period from Jan 2024 to Apr 2024. We show historical data because it has clear "inputs" and "outcomes" for analysis, but the algorithm is, of course, intended to operate on fresh data and produce relevant results that you could use today.
Outcomes
The Result column has one of the following values:
- Employment has been changed
- Employment has been stopped, a talent is currently open to work
- No employment yet, looking for a first job
- Talent was promoted within the same organization
- No change (--)
The table above shows signal confirmations in 5 of 6 cases. At such a small scale this could easily be a coincidence, but we have analyzed hundreds of profiles and there is a positive correlation. However, it is hard to say how strong it is, because of the ambiguity of the "No change" outcome.
We do not know how many people canceled their job-change plans when nothing in their profile changed. We do not know how many simply haven't updated their profiles yet. And we do not know how many actually attempted to find a job, failed an interview, and decided it is better to stay where they are, at least for a while.
An intermediate hypothesis: there are quite a lot of such people, so even a weak observed correlation says a lot about our original hypothesis. The nature of the model and the experiments is such that any observed correlation is underestimated, never overestimated.
We don't have enough numbers yet, but, speculatively, the precision of the model exceeds 80% for a single signal (like A or C) and can approach 100% for derived signals (like B). High precision (very few false positives) does not necessarily mean the system is usable; false negatives matter as well. But this article is not aimed at data scientists, so we'll keep it simple...
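To make the precision vs. false-negatives point concrete, here is a tiny sketch with invented confusion-matrix counts (again, not our measurements). It shows how a model can have high precision yet still miss a large share of the people who actually change jobs.

```python
# Illustrative only: made-up confusion-matrix counts, not DevScanr's measurements.
true_positives = 80    # signal fired and the person did change jobs
false_positives = 10   # signal fired but nothing happened
false_negatives = 120  # the person changed jobs, but no signal was caught

precision = true_positives / (true_positives + false_positives)
recall = true_positives / (true_positives + false_negatives)

print(f"precision = {precision:.0%}")  # ~89%: most alerts are real
print(f"recall    = {recall:.0%}")     # 40%: many job changes go unnoticed
```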
DevScanr currently keeps signals for up to 3 months. From the data, we observe that a job change (or a resignation, or a promotion) typically happens 2-3 months after the system catches a signal. This aligns well with the average 3-6 months it takes to find a new opportunity.
Who is sending signals?
Experience
Another observation that supports the accuracy of the method: about 42% of all hireability signals are generated by students looking for internships and other work opportunities after graduation. Most of the intern profiles that DevScanr currently detects hold Bachelor's or Master's degrees. About 36% of all signals come from specialists with five or fewer years of experience. The remaining 22% come from senior developers.
Specialization
DevScanr is a sourcing platform focused on engineering: specialists in Software, Web, ML, Data, Mobile, Gamedev, etc. So far our exploration has been limited to the most popular categories, but there is no reason to believe the approach does not apply to the others.
Geography
The data is worldwide. Countries with larger populations tend to have more engineers, so DevScanr expectedly catches more signals from them, with India and the United States at the top of the list. Yet LATAM, Europe, and other regions are also well represented.
More data for curious minds
Here is a larger data table for your interest (still a small sample of the full data we used). The correlation is easier to see here: at any moment, 90%+ of profiles on LinkedIn are non-hireable, so under the null hypothesis (signals do not work) we would expect to see "No change" in the majority of rows of the last column. That is clearly not the case, hence the alternative hypothesis is more likely to be true.
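The same base-rate argument can be sketched as a one-sided binomial test. The numbers below are hypothetical: we assume roughly 10% of profiles would show a definite outcome by chance and pick an arbitrary count of signaled profiles; only the shape of the reasoning matches the article.

```python
# Sketch of the base-rate argument with invented numbers, not the real table.
from scipy.stats import binomtest

base_rate = 0.10       # assumed share of profiles with a definite outcome by chance
observed_hits = 70     # hypothetical definite outcomes among signaled profiles
sample_size = 100      # hypothetical number of signaled profiles

result = binomtest(observed_hits, sample_size, base_rate, alternative="greater")
print(f"p-value = {result.pvalue:.2e}")
# A vanishingly small p-value means "pure chance" is a poor explanation
# for seeing definite outcomes in the majority of signaled rows.
```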
If signals work, what next?
Something tells us that you (if you are a recruiter) already have an idea of what you could do with such signals. This functionality is yet to be fully integrated with the rest of DevScanr (talent search and other systems), so it will take some time before it is publicly available. As always, we encourage the most enthusiastic readers to contact us and ask for early access.
In our opinion, this information is a treasure for talent acquisition experts and in-house recruiters. But, honestly, it’ll be useful for everyone who searches for and hires engineers and cares about qualified talent pools and important details. DevScanr already provides a wealth of knowledge about developers: per-skill history, rankings, inferences, introspective details.
You can get much further with a kind word and ~~a gun~~ AI & automation than you can with a kind word alone.
– Al Capone