Imagine you are a recruiter working at one of the top supermarket chains. You need to hire a new shelf stacker. You'll be offering a top salary for this position and will consequently get a very high number of applicants. Because of this, you decide to be very selective about who you actually employ. You decide that you will make all applicants take part in a weightlifting contest, and whoever can lift the most gets the job.
Word gets around about this, and soon enough everyone who wants to be a shelf stacker, naturally wanting to earn the top salaries at the top supermarkets, is in the gym lifting weights. As a result, all supermarkets, even the smaller, cheaper ones, can now comfortably demand an 80kg deadlift as a minimum requirement.
It seems a bit unfair, doesn't it?
98-99% of able-bodied applicants are more than capable of stacking shelves. The over-the-top selectivity of the top supermarkets drove the whole industry into selectivity madness, and it's now very difficult to get a job as a shelf stacker without being able to lift heavy weights, even though that should by no means be a requirement.
The supermarkets are the tech companies and the weightlifting competition is the computer science degree.
As much as I enjoyed my computer science degree out of academic interest, I, like many other software developers, don't think it was a necessary prerequisite for my day job. Information theory, quantum computing and discrete mathematics are all fascinating topics, but not ones I use when rolling out deployments to our Kubernetes clusters, designing and creating new backend APIs or reorganising the layout of a frontend page.
The requirement to have a computer science degree to be a software developer comes from the big companies erecting unnecessary barriers to entry in order to whittle down applicants.
Here's a quick look at where the big tech companies currently stand on degree requirements:
Is there a better way now? Requiring a degree is an age-old hiring technique, but can we do better in the modern world?
First, let me be clear about what's wrong with degrees themselves: nothing!
Degrees are a great form of education. I enjoyed mine and many people enjoy theirs. If you want to go into academia, then go for it. I won't expand into a discussion of the higher education system here, but academia is a great path for some people.
I should also drop a pragmatic point here and say that as much as I don't wish it to be this way, most companies, as shown above, do still require a degree. So if you are looking to get into the top tech companies, I still recommend you go for one.
Reform of the financial requirements for higher education is a hot topic, but in the current world not everyone can afford to get a degree. Requiring degrees of all your applicants is therefore a real barrier to entry. Higher education in the UK and US is rife with underrepresentation of many minorities, so any company requiring certification from these institutions is only going to mirror that underrepresentation.
Google is a fantastic place to learn anything in tech. We all know that. Want to learn how to spin up a Kafka cluster? Google it. Want to learn how to register an SSL certificate for your website? Google it. Want to learn the absolute basics of a particular programming language? Google it. This is how I, and many others, learnt to be software developers: not by studying at university, but by using freely available internet content to drive our own personal projects and interests.
There are also paid platforms out there, like Udemy and Coursera, that provide courses in computer programming and many other topics. These are great, but they shouldn't be used as requirements in job descriptions either. They effectively become mini-degrees that add yet more financial barriers to entry.
Personal projects have always been, and will always be, the best way to teach yourself tech. They force you to think for yourself, come up with your own designs and develop your own style as a programmer. When I look at CVs, a project done out of personal interest speaks 10x louder than a Udemy course certificate.
In a hypothetical world where no developers have computer science degrees, how can companies decide who to hire from applicant pools? Here are a few ideas, each with their own pros and cons:
It would be remiss of me to make such strong arguments without also mentioning some counter-arguments here.
There are some very good reasons why a high-paying job would traditionally require a degree. A degree not only shows technical competency in a subject, but also a level of diligence and maturity. Spending three years working hard at an academic institution is strong evidence to an employer that you're driven and hard-working.
In many other industries, like medicine, veterinary science and dentistry, degrees are far more appropriate: there's a lot of domain knowledge that simply can't be acquired easily outside of academic institutions. One could argue that if degrees are required in so many other industries, maybe we need a single standard instead of tech engineers getting different hiring treatment.
One of the nice things many companies get out of checking your degree is evidence of secondary competencies. For example, if you went to an English-speaking university, your degree acts as proof of your English language skills. This can be helpful if the company can't afford to waste time following up certifications for these secondary competencies from private institutions scattered around the world.
In my humble opinion, the tech world is still too reliant on higher-education certification. It's time to recognise that our industry is different: programming and software development are not a matter of textbook knowledge recall. They are instead a heavily opinionated, community-driven art form. It is therefore our responsibility to make this community as open and as accessible as possible, or we risk the stagnation of genuine development and innovation.