The screen shot below says it all: candidates for data science positions are plentiful, considerably more abundant than job openings: more than 600 LinkedIn users applied for this data science job. Even less prestigious companies routinely attract more than 50 applicants per job opening.

In view of this, you would think that getting an expensive analytic degree is a waste of money (except for a degree from MIT, Stanford, CMU, Northwestern, and a few other select schools), and that applying for a data science position is a waste of time (you would be competing with top candidates who apply to all the advertised positions). The problem is actually a bit more complicated. The root causes and solutions are as follows:

  • Companies want candidates with very deep rather than broad expertise, are not willing to accept telecommuting, will not train a new employee, and in some cases only hire Ivy League candidates. In doing so, they drastically restrict the pool of potential employees.
  • Many university curricula are outdated, so despite the volume of applicants, hiring managers complain that very few have the right skill set. Few candidates are willing to acquire these new skills (e.g. MapReduce), even though they can be learned at no cost if you have an Internet connection and a browser (see the minimal sketch after this list).
  • Data scientists are not properly used or hired. In the case of Facebook, for instance, one might ask how, despite all the great scientists and the great data collected about users, the company generates so little revenue per page impression. It should generate 10 times more revenue if data science were fully and properly leveraged by top management (read the comments here for details). In this case, the issue is probably poor communication between top management and data scientists: a solution that significantly increases ad revenue by optimizing ad relevancy is described in my free eBook, so there is no excuse for poor performance. The same can be said about many other companies (Google under-utilizing its Internet real estate, Microsoft running very poor marketing campaigns and not hiring the right people) and many problems, such as spam detection or fraud detection.
  • There are many alternative options outside traditional employment for data scientists, and as a data scientist, you should consider them.
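
For a flavor of the MapReduce programming model mentioned above, here is a minimal word-count sketch in plain Python (no Hadoop cluster or specific framework assumed), showing only the map and reduce phases; the point is simply that the core idea is easy to self-teach with free resources:

    # Minimal sketch of the MapReduce pattern (word count), plain Python only.
    # The map phase emits (key, value) pairs; the reduce phase groups by key
    # and aggregates. Real MapReduce distributes both phases across machines.
    from collections import defaultdict

    def map_phase(documents):
        """Emit (word, 1) for every word occurrence in every document."""
        for doc in documents:
            for word in doc.lower().split():
                yield (word, 1)

    def reduce_phase(pairs):
        """Group pairs by word and sum the counts."""
        counts = defaultdict(int)
        for word, count in pairs:
            counts[word] += count
        return dict(counts)

    docs = ["data science jobs", "data science skills", "science careers"]
    print(reduce_phase(map_phase(docs)))
    # {'data': 2, 'science': 3, 'jobs': 1, 'skills': 1, 'careers': 1}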

Finally, an unexpected consequence is a rise in sophisticated fraud, as an oversupply of unemployed math PhDs with great expertise end up working for rogue organizations (their only choice), while government and other organizations fail to hire the best people, for whatever reasons.

Comment by Nell ATL on May 18, 2013 at 12:47am

I can't always agree that managers aren't willing to listen.  I believe more often than not, the problem comes from not being able to communicate your idea in a persuasive way.  So many really smart people suck at this and then blame it on their managers or their customers.

That's why one of the most sought-after skills is the ability to take complex ideas and math and break them down to a point that even a child could understand.

As much as smart guys hate salespeople, you have to accept the fact that you're a salesperson whenever you're presenting an idea. If you get good at selling, you'll notice that you have fewer complaints about people not understanding.

Comment by Vincent Granville on July 4, 2012 at 3:00pm

@Mike: The test was performed according to design-of-experiment best practices. The test is one element among many others (e.g. number of applicants per data science position posted on LinkedIn, number of resumes that we receive each week, number of satisfied clients when we post a data science job ad on their behalf, number of applicants we receive when we advertise a position for ourselves) clearly indicating that for the average person, acquiring an analytic degree or applying for an analytic job in the US is a waste of time, possibly worse than playing the lottery in terms of expected (negative) return. Of course there are exceptions, e.g. if you have a recent CS master's from Stanford and good experience with processing large data sets.

On the plus side, we propose very interesting and challenging alternatives to analytic candidates interested in leveraging their skills, alternatives that don't require sending resumes. Read this article and all the comments to learn about the alternatives - if you are a candidate. If you are a recruiter complaining about lack of analytic talent, talk to us - we'll quickly find great talent for your data science positions.

Comment by Mike O'Neil on July 4, 2012 at 1:54am

Vincent,

I don't think your test proves anything at all. There is a raft of biases, not the least of which is the exact reverse of what you did: the posting of bogus jobs by recruiters looking for ammunition for their putty cannons. I personally have had no problem recruiting analytics-capable individuals. I have experimented with a number of different styles of advertising, different sites, and different levels of role. You just have to look at the job postings that come into my email inbox, where the same headline for generic industry roles, with exactly the same text, has not changed for months; that shows recruitment companies think spamming sites like this one is productive. They wouldn't have a clue about what works.

Comment by Vincent Granville on July 3, 2012 at 5:25pm

@John: Our test proves that the way many hiring managers recruit data scientists does not work. It proves that many great job applications go into black holes. It does not explain why: it could be due to poor keyword-filtering algorithms (a toy illustration follows the list below), incredibly long response times to an application, email not working (e.g. the recruiter contacts an applicant via email but the message ends up in the applicant's junk folder), or other reasons:

  • requiring candidates to send a resume to jobs@yourcompany.com: very few will do so, only those desperate to find any job
  • requiring applicants to fill in an application web form that is broken (404 error upon submission), is too long, or asks questions about gender or race, scaring candidates away (a very common practice)
  • eliminating candidates who are not local
  • eliminating candidates without a cover letter
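
As a toy illustration of the keyword-filtering failure mode (hypothetical keywords and resume text, not an actual recruiting system), a naive filter that screens on exact keyword matches will silently reject strong candidates who describe the same skills in different words:

    # Hypothetical keyword filter: it requires exact tokens, so a candidate who
    # writes "Hadoop" instead of the literal string "mapreduce" is rejected.
    REQUIRED_KEYWORDS = {"mapreduce", "sql", "python"}

    def passes_filter(resume_text):
        words = set(resume_text.lower().split())
        return REQUIRED_KEYWORDS.issubset(words)

    resume = "PhD in statistics, built Hadoop pipelines, uses Python and SQL daily"
    print(passes_filter(resume))  # False: rejected for lacking the token "mapreduce"
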
Comment by John Stites on July 3, 2012 at 3:18pm

@Vincent, I'm not sure that submitting a few hundred bogus responses to job postings is the appropriate test of the thesis. A real test of the thesis would be to put forth several false open positions and then look at the respondents. What you conducted was a test of the efficacy of the on-line job application process. The failure lies in the methodology used in e-recruiting; consequently, no valid conclusions can be drawn about the quantity of qualified applicants.

Comment by Vincent Granville on July 2, 2012 at 2:57pm

@John: We have strong data to back our claim, including hundreds of made-up, high-quality, anonymous, varied, well-targeted applications to more than 200 advertised positions (all requiring at least a BS in an analytic field), with an abysmal response rate from hiring managers. This was part of an experiment to check whether the claimed lack of analytic talent was bogus or true.

Also, for any open position advertised on LinkedIn, anybody with a LinkedIn account can easily check the number of applicants who applied. Most data scientist / analytic positions (and you can check it yourself since this is public information) have more than 50 applicants. Additional anecdotal evidence (when we tried to hire analytic talent) shows that there is no shortage of highly qualified applicants (although we are a bit biased in the sense that we were accepting 100% telecommuting).

Any shortage would have to be in very specialized fields (positions requiring US citizenship plus clearance), but quant is not one of them. Unless what you mean by quant is a cheap Java/C++ coder who is also a true data scientist; indeed, these people don't exist and never will.

Comment by John Stites on July 2, 2012 at 1:18pm

If I were to form an opinion based on both the question and the responses, I would be led to believe that there are very few quants out there. Neither the original blog nor any of the responses attempts to supply any "data driven evidence." The thesis of the original blog (and all subsequent discussion) is based on anecdotal evidence. Anecdotal evidence is subject to a number of fallacies, including non-representative sampling and hasty generalization. In this context it takes the form of "casual observations or indications rather than rigorous analysis." If it was the author's intention to generate discussion, then he was successful. If he was looking for a logical, data-driven response, the author failed.

Comment by jwinburnnc on June 17, 2012 at 4:29pm

It's completely amazing to me that in a world where these methods are applied so ubiquitously (and successfully), there remains such a huge population, especially in management, that has no idea of their existence. I guess everything has a bright side, in that I rarely hear anyone connect "credit default swaps" or the 2010 flash crash to quant methods ...

... we're on the same page Mike ...

I'm very careful even with simple concepts like Monte Carlo. A significant amount of what I do involves some form of MC .... I just rarely say it...

One of my exec supporters, meant as a term of endearment, even calls it "Voodoo" math! ;-)
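
A minimal Monte Carlo sketch (illustrative only: estimating pi by random sampling, not any particular business model) shows how un-mysterious the "voodoo" really is:

    # Minimal Monte Carlo example: estimate pi by sampling random points in the
    # unit square and counting the fraction that land inside the quarter circle.
    # The simulate-and-average idea is the same one used in business models.
    import random

    def estimate_pi(n_samples=1_000_000, seed=42):
        random.seed(seed)
        inside = 0
        for _ in range(n_samples):
            x, y = random.random(), random.random()
            if x * x + y * y <= 1.0:
                inside += 1
        return 4.0 * inside / n_samples

    print(estimate_pi())  # roughly 3.14; accuracy improves with more samples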

Comment by Mike O'Neil on June 17, 2012 at 1:24pm

Jwinburnc - it is even more distressing when a direct report to the CE does not understand the difference between a stock and a flow! We are having the same issue with understanding. We find we cannot use some better-performing models because the business cannot intuitively grasp them. But hey, we have come a long way. Back in my early days, Box-Jenkins techniques were a black art and frowned upon, and data mining was the devil's spawn. If you could not justify the model within a range of currently accepted economic theory, you got no traction at all. Nowadays at least we have some acceptance on the basis of ROI.

Comment by jwinburnnc on June 17, 2012 at 7:21am

Even companies that get "it" at the senior executive level still struggle. As an example: recently, my group produced a predictive model for a resourcing problem that, when back-tested, showed an 80% improvement over current methods. When presented to a group of managers in finance, one of the managers responded, "I don't understand the math, so I don't believe it." Fortunately others did "get it," and it was applied. This is our reality. ;-)
