There's been some discussion in the Facebook group from a bunch of people who wanted various info about jobs in the tech industry.
A lot of people expressed interest in asking questions and a lot of people expressed interest in answering them, so I thought maybe the forum would be a better venue for this than a Facebook comment thread. So, ask away! I'll come back to answer stuff later today. (And hopefully some other folks will too so that it's not just my perspective!)
What are people's stories about how to get a job as a data scientist (or data analyst)? This is a fairly new career, and although a quantitative background is obviously key, it's not obvious how to get your foot in the door if you haven't done it before.
I didn't feel like I had my "foot in the door" when I applied for, and got, data science jobs. For instance, I hadn't done any data science projects and had only 1-2 related courses on my transcript. (To be fair, I was only applying for an internship and later converted to full-time, so perhaps that's your answer?)
Same here.
The old joke is that a data scientist is a better statistician than a programmer and a better programmer than a statistician. That's what you need -- statistics and programming. You don't have to be world class in either, though it helps. You just need both.
I did not have a degree directly related to data science. I had studied political science and psychology, when most data scientists study statistics or computer science. (But both poli sci and psych do involve statistics.)
I had always been a hobbyist programmer, but I took time to learn R, which is a very common language for data scientists. (Python is also popular.) I did this through Coursera.
I had also learned Ruby, and I got my start as a software engineer intern after graduating college. I then transferred to the data science department.
Another big benefit for me was that the head of the data science department is a friend of mine, who also helped me get the internship. Skills matter, but so do internal referrals. ;)
It may have become more difficult. At my company, I believe we interview a lot more people for data science than we do for engineering. We seem to have a lot more difficulty finding engineers. That said, this could be partly because our data science work seems more interesting than our engineering work.
What is the difference at your company between data science and 'engineering'?
Software engineers do computer programming and are expected to know a lot about a programming language (stereotypically Ruby), but are not expected to know any math or statistics.
Data scientists are expected to both know how to program (typically Python or R) and to know a lot of statistics (and some math), but generally are not expected to know how to program nearly as well as software engineers.
Data engineers (my profession) are in the middle ground and are expected to know how to program just as well as a software engineer, just in a data-relevant language (typically Python or R). Data engineers are also expected to know some stats (much more than a software engineer) but not nearly as much stats as a data scientist.
What skills/experience do you think will be useful to have in 3-5 years, either in general or for EA plans?
Lots of different skills for lots of different careers: In general, as you advance your career, management and sales skills are fairly useful and transferable. Being experienced and expert in any domain is useful. If you want to do EA research, then academic skills are handy. In tech and research, programming looks useful. If you care about tech and know some maths, then machine learning looks like a good and growing area. That's just off the top of my head.
A lot of the discussion is about web development, and most bootcamps focus on that. What other fields are particularly interesting? Most job descriptions I find are about neither web development nor data science. How much does the European job market differ from the US one?
Areas I personally think are interesting include programming languages, databases, machine learning, cryptography/security, and networks, off the top of my head. Though I suspect that these don't make up a very large fraction of job postings! Lots of jobs are just writing miscellaneous tools to automate various parts of other businesses.
A concern mentioned on 80000hours.org is a possible oversupply due to bootcamps. MOOCs might also contribute to this - you can learn programming anywhere without formal education. To what extent is this true?
I think it's highly unlikely that macro-level job prospects for generic software development will continue to look as good as they do right now:
Venture investment in software is large and growing extremely quickly (it grew more than 50% last year). Annual venture capital investment is now equal to about 5% of technology industry revenue (~$50b on ~$1T) and probably has an outsized effect on jobs, so a slowdown could put downward pressure on salaries. (A rough extrapolation of these numbers is sketched below.)
Right now, bootcamps are small (~10% of newly-educated entrants into the tech industry). I'm not sure on what timescale 80k is worried about a "short-term oversupply"--bootcamp graduates won't be a large fraction of tech industry workers for at least 10 years, since college grads are growing as well.
I'd be more worried about a long-term equilibration of supply: right now there appears to be a substantial amount of money lying on the ground (as bootcamps demonstrate), which suggests the market is not in equilibrium and we should expect equilibrium wages of tech workers to be lower.
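To put a rough number on why that growth rate can't last, here's a quick back-of-the-envelope sketch using the ~$50b / ~$1T figures and ~50% yearly growth mentioned above, plus an assumed ~5% yearly growth in industry revenue (that last number is my own guess, purely for illustration):

```python
# Back-of-the-envelope: how long could ~50%/yr growth in VC investment continue?
# Figures from above: ~$50B/yr venture investment, ~$1T/yr tech industry revenue.
# The 5% revenue growth rate is an illustrative assumption, not a sourced figure.
vc = 50e9
revenue = 1e12
vc_growth, revenue_growth = 0.50, 0.05

for year in range(1, 10):
    vc *= 1 + vc_growth
    revenue *= 1 + revenue_growth
    print(f"year {year}: VC ~${vc / 1e9:.0f}B, ~{vc / revenue:.0%} of industry revenue")

# Within about nine years, venture investment alone would exceed total industry
# revenue -- so the current growth rate (and the hiring it funds) has to slow.
```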
For people who have worked in the technology sector, what form has the most useful learning come in (e.g. learning from school, learning while working on a problem independently, learning while collaborating with people, learning from reading previous work/existing codebases, etc.)?
When first starting out: learning while collaborating with people.
When going from beginner to intermediate: learning while working on a problem independently.
When going from intermediate to expert: learning from reading previous work/existing codebases.
That's almost my experience too, but for me "learning while working on a problem independently" applies about equally in all phases. I haven't tried "learning while collaborating with people" at the novice level, but it may well be useful.
I learned a ton of useful statistics and machine learning by reading textbooks. So far that's been my best source.
It seems like the way to make the most money from working in tech jobs would be to identify startups/companies that are likely to do well in the future, work for them, and make money from the equity you get. For example, Dustin Moskovitz suggests that you can get a better return from trying to be employee #100 at the next Facebook or Dropbox than by being an entrepreneur. Any thoughts on how to identify startups/companies likely to do well/be valuable to work for, or at least rule out ones likely to fail? (It seems like the problem of doing this from an investor standpoint is well investigated, and hard to do, but the employee standpoint is different.)
It seems like the correct approach would be to make predictions on the future performance of a bunch of startups and track the results, in order to calibrate your predictive model, but one would need time to build up a prediction history. Short of this, there might be heuristics that are somewhat helpful; e.g. I'd guess that startups with more funding or more employees are more likely to succeed, since more people have confidence in them and they've already survived for some period of time, but this also indicates that you are likely to get less equity.
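If anyone wants to try the tracking approach, here's a minimal sketch of what "calibrate your predictive model" could look like in practice: record a probability for each startup, record the outcome later, and score yourself (e.g. with a Brier score). The company names and numbers below are made up purely for illustration.

```python
# Minimal sketch of tracking and scoring startup predictions.
# Each entry: (predicted probability of "success", actual outcome).
# Company names and numbers are made-up examples.
predictions = {
    "ExampleCo A": (0.7, True),
    "ExampleCo B": (0.2, False),
    "ExampleCo C": (0.6, False),
}

# Brier score: mean squared error between forecast and outcome (lower is better).
brier = sum((p - int(outcome)) ** 2 for p, outcome in predictions.values()) / len(predictions)
print(f"Brier score: {brier:.3f}")  # always guessing 50/50 would score 0.25
```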
Although I'm a fan of this attitude in general, venture investment is not the ideal candidate for the efficient market hypothesis, and investors have very different deal structure from employees. Some notes:
VCs manage other people's money, which means they're basically buying options on the startups, not the startups themselves. As a result they do a lot of variance-chasing.
The market for venture investments is incredibly illiquid. It's virtually impossible to short sell them, for instance, which inflates valuations.
The things that get traded in a venture deal are not just cash. A company that got valued at $10m by Sequoia is likely more valuable than one that got valued at $10m by a first-time investor. Similarly, a company that got valued at $10m in a round where the VC got a 3x liquidation preference is much less valuable than the equivalent with no liquidation preference.
Anecdotally, investors often do not know very much about the businesses they invest in and do not understand them well. My impression is that most venture investors are not much better than an index of startups, but mostly profit/stay in business because (a) the entire sector is growing and (b) they're the ones with access to dealflow.
In summary, investor valuations are biased high, probably by a large factor IMO, and also have incredibly high variance. I would use them only with extreme caution.
Employee #100 seems a bit implausible. If you had joined Dropbox as employee #100, it would have been in early 2012, at which point they had just gotten a $4B valuation. It's only gone up 2.5x since then--a mere 35% per year--so you probably wouldn't have done better than a founder over the equivalent timespan. Especially once you take into account the many worse options that were in Dropbox's reference class in 2012, like Fab.
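For reference, the "mere 35% per year" is just the annualized rate implied by 2.5x growth over roughly three years (early 2012 to around the time of writing; the exact span is my assumption):

```python
# Annualized growth implied by a 2.5x increase over ~3 years
# (early 2012 to roughly when this was written; the span is an assumption).
annualized = 2.5 ** (1 / 3) - 1
print(f"{annualized:.1%}")  # ~35.7% per year
```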
That said, I agree that trying to forecast startups is probably a useful exercise--and maybe even possible to do historically, if you're interested in ones as high-profile as Dropbox. It's an open question to me how efficient the market is here (i.e., are companies with semi-obvious predictors of success likely to offer less equity).
I heard a rumor suggesting Dropbox was slower to hire than the typical tech company (i.e. a $4B company with <100 employees is somewhat atypical even in tech), though this may be what the norm is trending towards.
Great point. I would add that 35% annual raises are completely within the realm of possibility in direct employment as well.
What are the minimum skills or experience necessary to get hired as a full time web developer?
I don't know what the bare minimum to get hired anywhere is, but I know that most medium-sized and up places that you might want to work will hire an entry level employee who looks smart but has a very small amount of actual experience.
A good applicant can write a simple program on a whiteboard and has a project on GitHub, or a past internship, or a dynamic website that they run, to point at. If you think you're on the edge now, these accomplishments shouldn't be too far away.
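For a sense of what "a simple program on a whiteboard" means in practice, think something on the order of FizzBuzz (my example; not a question from any particular company):

```python
# FizzBuzz: a classic screening-level whiteboard exercise.
# Print 1..100, replacing multiples of 3 with "Fizz", multiples of 5 with "Buzz",
# and multiples of both with "FizzBuzz".
for i in range(1, 101):
    if i % 15 == 0:
        print("FizzBuzz")
    elif i % 3 == 0:
        print("Fizz")
    elif i % 5 == 0:
        print("Buzz")
    else:
        print(i)
```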
BSing interviews and lying on your resume? Or even less if you can rely on nepotism? :P
Seriously, there are places that are basically just looking for warm bodies, but I doubt that you'd actually be interested in working there. Perhaps you could be a bit more concrete?
This question is abstract enough that I'm having a hard time coming up with a meaningful answer. Perhaps someone with more experience hiring for webdev roles could fill in? (I only interviewed for webdev jobs, so I don't really know what their minimum was!)
It's tricky as I'm just starting to consider this career, so I may not be familiar enough with it or far enough along with my planning to be usefully concrete. It partly depends on where sensible places to start are with my level of professional experience and knowledge (not negligible, but never full-time webdev). To pick an example: a junior job at a webdev agency which builds websites for hire. The requirements for that might be illuminating.
I would second Ben's statement – if you have actual experience coding you're probably overqualified for "a junior job at a webdev agency which builds websites for hire."
A clarifying question: when you say "builds websites for hire" I think "set up a boilerplate WordPress installation with some stock photos to impress the rubes". Is that what you mean? Or do you mean "create highly interactive single-page websites that need to scale to millions of concurrent users"? Those are very different things.
Maybe if you gave a salary target that might help us calibrate.
Wow, weird.
No, not static WordPress sites - more like the second, or something in between, though as a junior webdev I wouldn't be the one taking care of the scaling (setting up the server with Varnish, etc.), apart from avoiding direct database queries where possible.
Again I run into the problem of not knowing enough about the industry, but how about €35,000 in a place where you could relatively quickly head up towards €50,000?
This may be highly dependent on your location, but the average starting salary for a computer science grad in the US is greater than €50,000.
Maybe I'm completely miscalibrated, but if you know words like "Varnish" and realize that they apply to scaling, then I think you are qualified to be a junior web developer. I would recommend applying to some jobs and seeing what happens. Let us know either way!
If you can complete this, you can probably get an internship somewhere, and from there you can easily transition to a full-time job.