A quarter of a century ago, a single developer could build a decent website using only HTML and CSS. Recruiting developers was fairly straightforward because the skills, while scarce, were easy to name and recognize. And aspiring engineers could learn the bulk of the skills they’d need from standard college computer science coursework.
Today, top-tier websites require collaborative efforts from a team of engineers with increasingly specialized skill sets. Front-end developers, who design the pages users interact with, work alongside back-end engineers, mobile engineers, platform engineers, and information security specialists, each of whom has spent years mastering a specialty.
Employers are, in turn, no longer wrestling with just one “technical skills gap,” but hundreds of discrete shortages and surpluses. “Developer” is no more apt a job description than “doctor.” And yet, most computer science degree programs look much like they did 25 years ago.
The challenge stems from the fact that while faculty do their best to keep pace, software development evolves constantly. Entire segments of the sector can be unrecognizable year to year. It’s unrealistic to expect institutions, or faculty, to stay apprised of the industry’s exponential growth curve amidst the daily responsibilities of teaching and research.
As a result, many computer science students, while well-versed in the fundamentals of operating systems or data analytics, barely touch on new technologies in areas like DevOps, cybersecurity, and artificial intelligence. So, even while more students enroll in computer science programs, each graduating class is more and more misaligned with the industry’s fastest-growing specialties.
The disconnect is also grounded in the way computer science students learn. At most major firms, codebases run to millions of lines of code, yet most college students never work with more than 1,000 lines at a time. New engineers begin their careers with no sense of how to troubleshoot problems in codebases of that scale. Even worse, graduates often lack the soft skills necessary for success in the industry. The industry's best engineers are prodigious collaborators and communicators, but college computer science programs consist almost entirely of solo work.
College computer science programs were not designed to teach software engineering. Most were created to teach the theory behind computer science, and are taught by academics decades removed from the industry, which means the gap between engineering education and practice widens every year. Meanwhile, the tech talent shortage is expected to reach one million engineers by 2024. To close the gap, the tech industry will have to take a more active role in computer science education.
The good news is that a growing number of employers are taking matters into their own hands by deepening relationships with colleges to build programs that equip students for a tech-driven future of work. And they’re finding that colleges are up for the task.
In 2017, Google launched a pathbreaking partnership with Howard University that gives students access to industry-relevant software engineering training at Google’s headquarters. Originally dubbed “Howard West,” the three-month summer residency provided rising juniors and seniors in the university’s computer science program with dedicated workspaces at Google’s Mountain View campus, a stipend to cover housing and other costs, and access to courses taught by Howard faculty and Google engineers. Last year, Howard and Google expanded the program to cover a full academic year, and widened it to include students from other historically Black and Hispanic-serving institutions.
City College of New York recently partnered with Facebook to develop a cybersecurity graduate program, drawing upon Facebook’s insights and experts to give students an edge in one of the industry’s most in-demand, fastest-growing specialties. The social media company has also partnered with several community colleges to help develop industry-informed digital marketing and coding programs designed to meet the specific employment needs of area businesses.
Northern Virginia Community College has teamed up with Amazon to create an innovative degree program focused on cloud computing. Google has developed an IT Support Certificate that is now available through 25 community colleges. And, last year, several institutions in the Washington, D.C., region—including Georgetown University, the University of Maryland, and the University of Richmond—partnered with companies like Capital One, JPMorgan Chase, and Under Armour to form an alliance focused on developing stronger technology talent.
Replicated at scale, these partnerships will lay the foundation for a new sort of relationship between colleges and corporations. It is a relationship that challenges colleges and universities to come to the table and build industry-relevant coursework that will set students up for success in an increasingly specialized marketplace. But it also challenges employers to share their needs and expectations with greater precision.
Rather than play a passive role on the receiving end of talent development, employers can take an active role in identifying the granular skills they need and partnering with the colleges and universities charged with building them. Together, they hold the potential to build a tech talent pipeline that evolves as fast as the industry.