trey causey

What it's like to be on the data science job market


Update: I was remiss in not originally including Erin Shellman's excellent Crushed it! Landing a data science job, which is probably the best guide to preparing for the actual data science interview that I've ever read. It's full of good resources, so read it first and then head back here.

Sooner or later you're going to find yourself looking for a data science job. Maybe it's your first one or maybe you're changing jobs. Even if you're fully confident in your skills, have no impostor syndrome, and have tons of inside leads at great companies, it's a tremendously stressful experience. The process of looking for a new job is often one that occurs secretly and confidentially and then is so exhausting that discussing the process is the last thing you want to do. I hope to change that.

I recently went through this myself and thought I'd record my thoughts on the process while they're still fresh. I interviewed a lot. Some went well, some didn't go well at all. The reason for this was sometimes me, sometimes them, often both. Sometimes I didn't get selected for an on-site interview. Other times I withdrew from the process after seeing that it wouldn't be a good fit for me. I took notes throughout, though, and here they are.

Warning: What follows are my personal thoughts, extrapolating from a small sample, and generalization from anecdotes. Precisely the kind of thing that data scientists hate! But, despite the frequent misquotation, the plural of anecdote is data, so this discussion should start somewhere.

Interviewers: What do they want?

Many companies still have no idea what they're looking for when they're looking to hire a data scientist. As Robert Chang, a data engineer at Airbnb (formerly a data scientist at Twitter), lays out in this superb post, there are two kinds of data scientists -- those who are stronger at analysis (type A) and those who are stronger at building things (type B). As things stand today, there seems to be a strong bias in hiring requirements toward type B data scientists. Quite frequently, I encountered interviewers who were essentially looking for software engineers who knew a little stats / ML.

And that's fine, if the role requires you to mainly be a software engineer that knows some stats / ML. However, I think many interviewers default to this profile because they've been hiring software engineers for years and "know" how to do it (more on that below), so they fall back on that process when it comes to hiring data scientists. Simply put, however, if the job is analysis heavy, a technical screen that is almost entirely software engineering questions is not a good idea and won't select individuals who are good for the actual work involved.

Everyone wants a "full-stack" data scientist, but few have really reflected on whether or not this is what they actually need. Don't assume that what the recruiter or hiring manager says they need is what they actually need for the role -- this is especially true if you're the first data scientist being hired.

No one knows everything and everyone has strengths and weaknesses. It should be fine to admit these weaknesses and identify the areas in which you'd like to grow. Good organizations will welcome this as both a sign of self-awareness and an opportunity to grow. Others will say "maybe we can work with that" and not call you back while they look for someone who either doesn't need that help or doesn't admit to it.

Interviews: The standard process

The interview process is mostly the same everywhere, with slight deviations. You'll usually have some kind of initial phone call with a recruiter who will ask you some general questions about your skills and background. They may try to get you to offer a salary number at this stage. There are different takes on this, but I lean towards not discussing salary at this stage. They'll say they just want to make sure that you're in the same ballpark, but if they're a reasonable company, they already know what market value is (better than you do, most likely). If you're being referred by an internal employee, this call may or may not happen.

Next, you typically do a technical phone screen. This may or may not involve writing code over the phone or in a screen sharing environment. You may or may not have access to your usual development environment. It may literally just be a shared Google document. If you're anything like me, this is pretty unnerving. If you don't pass this screen, you won't be advanced to the on-site interview.

It's especially unnerving because interviewers often resort to 'classic' programming interview questions that are drilled into computer science undergrads but are often quite puzzling to non-CS-trained data scientists. Many will recommend reading and memorizing Cracking the Coding Interview for these types of questions, but if you are a type A data scientist, it's also worth asking yourself whether questions like these are a good signal that the company actually wants someone with your skills. These kinds of questions include "reverse a linked list" or "invert a binary tree". These are also the kinds of questions that will be done on a whiteboard during the on-site at some companies.
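For readers who haven't run into these before, the iterative linked-list reversal really is only a few lines; here is a minimal Python sketch (my own illustration, not a question from any particular company):

```python
class Node:
    """Minimal singly linked list node."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def reverse(head):
    """Iteratively reverse a linked list: O(n) time, O(1) extra space."""
    prev = None
    while head is not None:
        # Re-point the current node backwards, then advance.
        head.next, prev, head = prev, head, head.next
    return prev

# Build 1 -> 2 -> 3, reverse it, and read the values back out.
rev = reverse(Node(1, Node(2, Node(3))))
values = []
while rev is not None:
    values.append(rev.value)
    rev = rev.next
print(values)  # [3, 2, 1]
```

In an interview, being able to state the time and space complexity of this loop matters at least as much as writing the code itself.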

If you're very nervous about this kind of question or don't perform well in this kind of environment, offer to provide code samples that better reflect your ability and style.

Assuming you pass this screen, prepare for the on-site interview, which will be anywhere from 3 to 7 hours of sitting in a single room and talking to people for 30-60 minutes each. I think the uncertainty of this portion of the process is very unsettling, but there's not much to be done about that. The best places to interview prepare the candidate for what they will be talking about and tell the candidate what to expect in terms of the number and length of meetings.

Prepare to interview a lot if you're really focused on finding a good fit for you.

Interview questions: What to expect

Prepare to be asked terrible questions. Some of the worst questions I've been asked:

The last kind of question can actually be really fun if it's proposed as an honest-to-goodness brainstorming question and not one that you're expected to "solve" by finding the same solution as the interviewer.

There may or may not be whiteboard coding questions (if they've read my post, maybe there won't be). You should be prepared to talk about the complexity of your solution in terms of time or space. Depending on the interviewer, this may range from talking about what you need to persist in memory or keep track of, all the way to working out the big O notation for your solution. If it's a model, is it slow at training time? At prediction time? What are the tradeoffs of your approach?
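As a toy illustration of the kind of time/space tradeoff interviewers tend to probe (a hypothetical example of mine, not from any specific interview), consider two ways of checking a list for duplicates:

```python
def has_duplicate_quadratic(items):
    """O(n^2) time, O(1) extra space: compare every pair."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_linear(items):
    """O(n) time, O(n) extra space: trade memory for speed with a set."""
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False

print(has_duplicate_quadratic([3, 1, 4, 1]))  # True
print(has_duplicate_linear([3, 1, 4]))        # False
```

Neither answer is "right" in the abstract; being able to say which you'd pick given the data size and memory budget is usually what the interviewer is listening for.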

You should spend some time memorizing formulas such as binomial probabilities, Bayes' Rule, and so on. Be acquainted with the most common probability distributions. Understand and be capable of explaining model fitting procedures like stochastic gradient descent and maximum likelihood as well as model evaluation metrics. Be prepared to answer how you would implement these things 'from scratch' without the help of a package or library. William Chen's probability cheatsheet is quite good for this.
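To make the 'from scratch' expectation concrete, here is a minimal Python sketch of two of the formulas mentioned above; the screening-test numbers are invented purely for illustration:

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p), straight from the formula."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def bayes_posterior(prior, likelihood, marginal):
    """Bayes' Rule: P(A|B) = P(B|A) * P(A) / P(B)."""
    return likelihood * prior / marginal

# Binomial: probability of exactly 2 heads in 4 fair coin flips.
print(binomial_pmf(2, 4, 0.5))  # 0.375

# Bayes: 1% base rate, 90% sensitivity, 9% false positive rate.
prior = 0.01
p_positive = 0.90 * prior + 0.09 * (1 - prior)
posterior = bayes_posterior(prior, 0.90, p_positive)
print(round(posterior, 3))  # 0.092
```

The base-rate example is a classic for a reason: interviewers like to see that you notice how a small prior drags the posterior down even with a sensitive test.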

Be prepared to talk in detail about any of the projects listed on your resume. At a recent interview, I spent the bulk of one session answering questions about the oldest project listed on my resume, and I was pretty rusty on the details. However, I really enjoy both asking and answering questions like this, as they give you a chance to demonstrate your mastery of a topic. The one caveat, though, is that you will always know more about these things than the person asking about them, so don't gloss over "obvious" details.

Depending on the kind of role you're applying for, there may also be more product-focused questions. This is something that's more difficult to prepare for but may be very important, especially if the role is highly integrated with a product team. Spend time using the product that the company produces (when possible), and think about it as both a user and a data scientist. What is good or bad about the experience? How would you know, from the data you collect, whether the experience was good or bad? What are the kinds of things you would want to optimize for? What are both the short-term and long-term consequences of doing so? Is there existing instrumentation?

Questions you should ask

Through all of this interrogation and puzzle-solving, it's important to remember that you are interviewing them as well. You need to find a good fit. You need to want to work there. I've been in several bad fits which has been difficult, but it's also provided me with very specific questions to ask. Offer to sign an NDA if they say they can't give you details on these answers. You don't want to go into a new job uncertain about what you'll actually be doing and working on. These questions include:

One note -- if the company you're interviewing with doesn't leave ample time for questions and answers with each and every interviewer, this is a red flag. They do not see this as an opportunity for you to assess fit, only an opportunity for you to demonstrate your worth. Be very, very wary of this. I once had an interviewer at a prestigious company say to me, "OK, we have three minutes left. I'm trying to decide if I'll ask you another question or if I want to leave time for you to ask questions. I think I'll ask you another question." Instant bad experience.

The troubling reality of interviews

Most companies are bad at hiring. They'll treat you skeptically and make you prove "you can code" as if an existing body of work isn't enough. They'll make you solve problems by hand that you haven't solved by hand since your undergrad days and probably wouldn't solve by hand today, because that's how stupid mistakes get made. They'll make you solve ridiculous problems that don't reflect the actual day-to-day work of the position. They'll say they do this to see "how you think" or "how you approach a problem", but no one has any idea whether these exercises are actually valid measures of those skills. The assumption that these kinds of questions measure the attributes they claim to is unspoken but widespread. Very little empirical work is done to see if they are actually good predictors of good employees.

For a great in-depth take on this, Ann Harter is a must-read.

It can be easy to internalize your performance in interviews as an overall reflection of your abilities as a data scientist and even as your worth as a person. I've done this. There is some signal in there -- you can identify gaps in your area of knowledge, or at least identify things that you need to focus on learning in order to pass the interview stage. It may be mostly a waste of time for your day-to-day work, but you have to play the game on some level. It's unfortunate, but that's where we stand right now.

As I've gotten older and more experienced, I push back in interviews. I ask questions about what the purpose of a problem is, or state that I don't think it's a good evaluation of my skills or abilities. Some people probably see this as me thinking I'm "too good" to answer the questions everyone else has to answer, but I see it as doing my part to be a critical thinker about evaluation, prediction, and hiring. Hopefully you'll do this too and, as more of us end up in positions where we are building teams and hiring, we'll think more carefully about what we're trying to accomplish and how we can get there, instead of just copying the same patterns that have been around for years.


Copyright ©2022 Trey Causey