The Computer Science Degree: Baptism by Fire

Breaking down the barriers to entry in the tech industry is in our best interests

Mon Apr 04 2022

An Analogy

Imagine you are a recruiter working at one of the top supermarket chains. You need to hire a new shelf stacker. You'll be offering a top salary for this position and will consequently get a very high number of applicants. Because of this, you decide to be very selective about who you actually employ. You decide that all applicants will take part in a weightlifting contest, and whoever can lift the most gets the job.

Word gets around about this, and soon enough everyone who wants to be a shelf stacker, of course wanting to earn the top salaries at the top supermarkets, is in the gym, training with weights. As a result, all supermarkets, even the smaller, cheaper ones, can now easily demand an 80kg deadlift as a minimum requirement.

It seems a bit unfair, doesn't it?

98-99% of able-bodied applicants are more than capable of stacking shelves. The over-the-top selectivity of the top supermarket drove the whole industry into selectivity madness, and it's now very difficult to get a job as a shelf stacker without being able to lift heavy weights, even though that should by no means be a requirement.

The supermarkets are the tech companies and the weightlifting competition is the computer science degree.


The Tech World

As much as I myself enjoyed my computer science degree from a level of academic interest, I, like many other software developers, don't think it was a necessary prerequisite for my day job. Information theory, quantum computing and discrete mathematics are all fascinating topics, but not ones I use when rolling out deployments to our Kubernetes clusters, designing and creating new backend APIs or reorganising the layout of a frontend page.

The expectation that software developers hold a computer science degree comes from the big companies erecting unneeded barriers to entry in order to whittle down applicants.

Here's a quick look at where the big tech companies currently stand on degree requirements:

  • Meta: Most jobs seem to require a degree.
  • Google: About a 50/50 split of jobs listing a degree requirement; many of the remaining listings, however, describe a degree as preferred.
  • Netflix: No jobs seem to list a degree as a requirement.
  • Apple: Most jobs seem to require a degree.

Is there a better way now? Requiring a degree is an age-old hiring technique, but can we do better in the modern world?


What's Wrong With Degrees?

Nothing!

Degrees are a great form of education. I enjoyed mine and many people enjoy theirs. If you want to go into academia, then go for it. I won't expand on the higher education system here, but academia is a great path for some people.

I should also drop a pragmatic point here and say that as much as I don't wish it to be this way, most companies, as shown above, do still require a degree. So if you are looking to get into the top tech companies, I still recommend you go for one.

I Can't Afford the Gym!

Reform of the financial requirements for higher education is a hot topic, but in the current world not everyone can afford to get a degree. Requiring degrees of all your applicants is therefore a real barrier to entry. Higher education in the UK and US is rife with underrepresentation of many minorities, so any company requiring certification from these institutions is only going to reflect this.

Modern Tech Education

Google is a fantastic place to learn anything in tech. We all know that. Want to learn how to spin up a Kafka cluster? Google it. Want to learn how to obtain an SSL certificate for your website? Google it. Want to learn the absolute basics of a particular programming language? Google it. This is how I and many others learnt to be software developers: not by studying at university, but by using freely available internet content to drive our own personal projects and interests.

There are also paid platforms out there like Udemy and Coursera that provide courses on computer programming and many other topics. These are great, but they shouldn't be used as requirements on job descriptions either; they effectively become mini-degrees that add more financial barriers to entry.

Personal projects always have been, and always will be, the best way to teach yourself tech. They force you to think for yourself, come up with your own designs and develop your own style as a programmer. When I look at CVs, a project done out of personal interest speaks 10x more than a Udemy course certificate.

Choosing Applicants

In a hypothetical world where no developers have computer science degrees, how can companies decide who to hire from applicant pools? Here are a few ideas, each with its own pros and cons:

  • The interview stage: Problem solving and coding interviews are another controversial topic, but I like them. They give the company a better understanding of how “smart” you are, how good you are at solving problems and how you tackle unknowns without them needing to check any educational certificates. The obvious con to this method is that a company won't be able to spend time interviewing all candidates in this manner.
  • Random sampling: When companies get too many applicants for a position, they will often just reject some percentage of them with no screening whatsoever. This may sound unfair to applicants, but it's actually quite the opposite. If the company were to try and assess all applicants, they wouldn't be able to do so fairly since there are so many. By random sampling, you allow a fair assessment of all chosen candidates. As a candidate, if you get rejected like this, you just apply to more jobs. Software engineers will rarely apply to a single job anyway, and there's always more than one great opportunity out there. The cons to this approach are that companies could miss out on a top employee and that applicants aspiring to a single dream position could get dropped without any consideration.
  • Junior and inexperienced hires: Many jobs that require a degree probably also require some number of years of industry experience. This isn't intrinsically a bad thing; sometimes you need someone with experience and many lessons learned. However, many companies fall into the trap of requiring some number of years of experience for all jobs apart from the explicitly junior or apprentice-style ones. If they were to step away from this and be more willing, when a space opens up in a dev team, to recruit a fresh-faced, inexperienced candidate, then the company creates more of a culture of internal mentoring and training. This has many benefits, such as ease of internal promotion and better employee retention, which in turn means you end up hiring less frequently, so the hiring process can become more thorough. The con to this approach is that hiring junior, inexperienced developers is always going to be more difficult and risky than recruiting experienced developers.
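The random-sampling idea above is simple enough to sketch in a few lines. This is a minimal illustration, not a real recruiting pipeline; the applicant names and the review capacity of 40 are assumptions made up for the example.

```python
import random

def sample_applicants(applicants, capacity, seed=None):
    """Randomly choose which applicants advance to full review.

    Every applicant has an equal chance of being picked, so the
    subset the company can afford to assess is chosen fairly.
    """
    rng = random.Random(seed)  # seed only to make the example reproducible
    if len(applicants) <= capacity:
        return list(applicants)  # small pool: review everyone
    return rng.sample(applicants, capacity)  # sample without replacement

# Hypothetical pool: 500 applicants, but the team can only
# properly interview 40 of them.
pool = [f"applicant-{i}" for i in range(500)]
shortlist = sample_applicants(pool, capacity=40, seed=42)
```

The point of the sketch is that fairness comes from the uniform draw: no CV screening happens before the cut, so no applicant is advantaged by credentials at this stage.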

Time to Play Devil's Advocate

It would be remiss of me to make such strong arguments without also mentioning some counter-arguments here.

There are some very good reasons why a high-paying job would traditionally require a degree. A degree not only shows technical competency in a subject, but also a level of diligence and maturity. Spending three years working hard at an academic institution is great proof to an employer that you're driven and hard-working.

In many other industries like medicine, veterinary sciences and dentistry, degrees are far more appropriate. There's a lot of domain knowledge that simply can't be acquired so easily outside of academic institutions. One could argue that if degrees are required for many other industries, maybe we need a single standard instead of tech engineers getting a different hiring treatment.

One of the nice things many companies get out of checking your degree is proof of secondary competencies; e.g. if you went to an English-speaking university, your degree acts as proof of your English language skills. This can be helpful when the company can't afford to spend time verifying these secondary competencies with private institutions littered around the world.

Conclusion

In my humble opinion, the tech world is still too reliant on higher-education certification. It's time to recognise that our industry is different; programming and software development is not a matter of textbook knowledge recall. It is instead a heavily opinionated, community-driven art form. It is therefore our responsibility to make this community as open and as accessible as possible, or we risk stagnation of true development and innovation.