The DEI Issue No One Is Talking About

You can do everything else right, but if your DEI initiative doesn’t also address this one thing…well, you’re doing it wrong.

Spend any time job hunting and you’re likely to see one requirement pop up over and over again, mocking you like the dog from Duck Hunt. Sometimes it’s pinned to the top of the job description, sometimes it’s buried at the bottom in the small text, but it’s usually there. Finding a job listing without it is incredibly rare these days, and the competition for those kinds of jobs is extremely tough. And, for a specific subset of the American workforce, seeing these few words in a job description can be one of the most disheartening parts of searching for their next opportunity.

Those words?

Bachelor’s or Master’s degree in software engineering, computer science or related field

“But wait,” you might say, “don’t you want people who are qualified for the job to apply?”

Of course I do. I think everyone does. But the mistake here is conflating education with qualification: graduating from a four-year college program in any discipline doesn’t make you qualified to go practice that discipline; it means you took some classes and have some (mostly theoretical) knowledge about a subject. If a degree equaled qualification, doctors wouldn’t spend four or more additional years after graduation continuing to learn and hone their craft; the same goes for lawyers, pilots, and many other professions. If we want a candid conversation about what it means to be truly qualified for a position, we have to divorce the notion of education from qualification, especially in industries like technology, design, and various media verticals, where the lines between personal and professional experience are blurred and growing more so each year.

So why are companies insisting on at least a four-year degree to be considered for their positions? That question isn’t so easy to answer. The reasons can be as numerous as the companies themselves, but I do want to address a few of the more common ones I’ve come across.

Foundational Knowledge

For many, a degree in computer science (CS) is proof that a candidate has the foundational knowledge necessary for a successful career in one of the various CS disciplines. While there certainly are disciplines within CS where someone would benefit from that sort of formal, institutional knowledge, the same does not hold true for the entire spectrum of possible career paths within CS and technology as a whole. Understanding the intricacies of how processors cache and retrieve data from the various caches and registers is good knowledge to have if you’re working on high-performance computing applications or software that runs close to the hardware, but your average web or API developer using a high-level language like JavaScript, Python, or Java doesn’t need it. The foundational knowledge required to perform at a high level in a typical product development role, one where business logic and external integrations matter more than complicated memory management or sub-millisecond latency, can be acquired in a multitude of other ways and, I would argue, can’t be learned through a college course.

It Shows Commitment to Something

This reason is the one that annoys me the most by far, as if the only way to show that someone can start and stick to something is by going to college. Commitment and the ability to drive a project to completion can also be demonstrated in other ways outside a formal education background. Candidates who volunteer their time outside work, are involved in the local community, or have a second “job” at their church all show commitment. Candidates who have a hobby that they’ve worked at for years show commitment. Candidates who opted to join the military instead of going to college after high school and have spent the last four years doing the exact job being advertised show commitment. A bachelor's degree is not the be-all, end-all of showcasing dedication to a cause.

Pre-Filtering

Probably the most frustrating reason for a degree requirement is sheer laziness. Hiring managers, recruiters, HR administrators, and whoever else uses schooling as a pre-filter for applications are taking a shortcut, and they run the risk of rejecting exceptional candidates for arbitrary reasons not grounded in reality.

The DEI Problem

So what does all of this have to do with diversity, equity, and inclusion (DEI)?

Everything, because when you screen for a college degree you’re not really screening for educational background, you’re screening for socioeconomic background.

The cost of higher education in the US has increased 130% since 1990, outpacing inflation by 171.5%. The average annual tuition for a 4-year public college was around $9,300 in 2019, and $32,700 for a private college, meaning a 4-year degree (not counting expenses like books, room and board, meals, and a host of other money drains) can cost anywhere from roughly $37,000 to $131,000. That number is astounding by itself, and it is wholly out of reach for a lot of Americans regardless of which route they take. This, combined with the ongoing student debt crisis, has many more young Americans opting to forego college altogether and instead focus on a career or trade earlier, but that endeavor can be hampered by unnecessary and discriminatory degree requirements.
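The tuition-only totals above are simple multiplication; here is a back-of-the-envelope sketch using the article’s 2019 figures (books, room and board, and other expenses deliberately excluded, just as in the text):

```python
# Rough 4-year degree cost from the article's 2019 average annual tuition.
# These are tuition-only figures; books, room & board, etc. are excluded.
PUBLIC_ANNUAL = 9_300    # avg. annual tuition, 4-year public college (2019)
PRIVATE_ANNUAL = 32_700  # avg. annual tuition, 4-year private college (2019)
YEARS = 4

public_total = PUBLIC_ANNUAL * YEARS    # the article's ~$37,000 floor
private_total = PRIVATE_ANNUAL * YEARS  # the article's ~$131,000 ceiling

print(f"4-year public:  ${public_total:,}")   # prints "4-year public:  $37,200"
print(f"4-year private: ${private_total:,}")  # prints "4-year private: $130,800"
```

The exact products ($37,200 and $130,800) round to the $37,000–$131,000 range quoted above.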

As minorities in the US tend to sit at the lower end of the wealth spectrum, by requiring expensive degrees we are essentially pre-screening for a whiter, wealthier candidate pool than we would have otherwise. Identifying and hiring candidates from all walks of life is hard enough; artificially restricting the candidate pool only makes the job harder, and the numbers back this up. According to the US Bureau of Labor Statistics, black and Latinx people are less likely to obtain a 4-year degree and are less likely to work in management or professional (i.e., “white-collar”) positions. Until we drop degree requirements, especially for positions like software development, we will continue to fail to represent the diversity of America in our tech workforce.

So What Then?

“Okay,” I can hear you asking, “what do we use to screen candidates instead of degrees?”

I’m glad you asked.

Experience.

For positions that aren’t entry-level, you screen on experience instead of education. There is a vast chasm between knowledge and experience that is only bridged by living through the shit. Knowledge can tell you that it is not a good idea to give developers production database access, but the experience of waking up to a dropped table in production will reinforce that knowledge and give context as to why.

People have an incredible, almost miraculous, ability to learn and to do so in a multitude of ways. So why should we limit everyone to just one way of learning?