We may be on the verge of a recession, or we may already be in one. After decades of slack labor markets and anemic wage growth, for an all-too-brief period the post-COVID-lockdown world saw a tight job market and employers actually competing for workers. But with inflation high and the Fed having raised rates aggressively, many are projecting a serious downturn that will surely hurt workers. Under those conditions, it’s more important than ever that college students take practical majors. Like French.
Yes, French. The major that’s so often derided as the height of impractical folly, the interest of people who want to fritter their time away reciting poetry and watching New Wave cinema, in fact revolves around a skill that has a great chance to be invaluable in the coming half-century: the ability to communicate in one of the fastest-growing languages in the world. Though it’s barely discussed in American news and commentary, central and west Africa — that is to say, Francophone Africa — has seen a population explosion in recent decades that’s arguably the biggest in the world. And while birth-rate growth in the region has started to level off, declining birth rates or outright declining populations across the world mean that the French-speaking part of Africa will play a huge role in determining humanity’s future. The French language rises with it. To put things in relative terms, the Francophone world, where as many as 525 million people live, is larger than the entire European Union. And where population growth happens, economic importance tends to follow.
There’s an obvious reason why the population boom in West Africa isn’t filtering into our perception of world economic trends. It’s racism! (A friend of mine who’s an African history professor has a handy abbreviation, ADC — Africa Doesn’t Count.) One consequence of this myopia is that the study of a language that will grow tremendously in international importance comes pre-mocked. Mandarin and other Chinese languages and dialects have been considered serious, practical majors for some time because of the potential professional value of speaking in China. But why would the ability to speak in Francophone Africa be less valuable, unless you think Africa will never produce economic muscle to match its population?
The broader point here is simple: We have a prevalent concept of the “practical college major” in our society, but that concept is vague, not buttressed with evidence, and shifts according to whim and prejudice. And the ultimate point of stressing the practicality of certain majors while denigrating the frivolity of others is to blame people for economic conditions they can’t control.
The first and most basic problem with the notion of the practical major is that practicality is not a static, timeless quality. Consider the story of the pharmacy major in the mid-2010s. As a very telling New Republic story from 2014 spells out, the popularity of pharmaceutical studies could stand as a cautionary tale when it comes to the very concept of the practical major, of the educational “safe haven.” In the 2000s and 2010s, dozens of new schools of pharmacy were opened thanks to the perception that pharmacy was a safe field for young graduates. Thousands of newly minted pharmacists flooded the market. Somehow, administrators in higher education were surprised to find that these new graduates had a harder time finding a good job than previous generations. But this is an inevitable outcome of telling young people an academic field is a practical choice, since you’re making that field more attractive and thus increasing the competition they have to face in the labor market.
The point isn’t that the pharmaceutical industry became a uniquely bad field to be in — it wasn’t. The point is that a supposedly safe field became less friendly to new entrants over time. And it happened fairly quickly, in a world where economic data is often lagging and where it can take four or more years to get credentialed into a given field. What were the pharmacy majors of the time supposed to do when it became clear there would be a lot of competition for jobs after all? Quit halfway through their majors, after investing years and tens of thousands of dollars?
For another example of the folly of practicality, look at the major of business, a serious field for serious people — or maybe not. People are often surprised when I tell them that many of the career-outcome metrics for business majors are middling at best. After all, what could be a more intuitively practical major than business? The problem is that business is by far the most popular major in American higher education; each year, we graduate something like 350,000 students with bachelor’s degrees in the field. That means that, if you’re one of those students, you’re graduating into a labor market where you have an immense amount of competition. That inevitably depresses your career prospects. (Supply and demand applies to educated labor.) “Practicality” has nothing to do with it.
Or we might look at petroleum engineering, where the job market tracks the notoriously volatile price of oil. Sample 2015 headline: “Petroleum engineering degrees seen going from boom to bust.” Working for oil companies seems like the definition of a practical, even mercenary ambition to me. And yet that superficial practicality is no match for macroeconomic conditions individuals can’t control.
The height of this style of thinking lies in the expression “learn to code.” For decades, “learn to code” has been shouted at everyone who’s suffered in the job market. “Hey, if you wanted financial security, you should have just learned to code!” But there are some glaring problems here. First, not everybody is going to be good at coding. All human beings have skills and abilities, but not all human beings have the same skills and abilities. “Learn to code” and the broader genre of blaming people for their job prospects depend upon the false notion that everyone is equally gifted in remunerative skills like programming. More to the point, even the mighty computer-science degree is not immune to the forces of industry contraction and fierce competition. Major technology firms have made huge job cuts in the face of rising interest rates and shrinking pools of loose cash, and the number of computer-science majors has grown substantially in the past decade. A New York Times article from December of last year laid out the inevitable consequences: an increasingly difficult job market for recent graduates.
Again, the point is not that computer science is suddenly a bad field to be in. The point is that programming, like all skills, is subject to the simple constraints of supply and demand, and thus the practicality of studying the major is a moving target.
I have a Ph.D.; my graduate work focused on assessment of learning and education policy. I’ve been active in political and policy discussion of education for years. I’ve written about education for major publications. My first book was all about education and meritocracy, with a particular focus on the role of college in our socioeconomic system. I’ve done higher-education policy work for think tanks like the New America Foundation. And I have never — never — found a consistent and coherent definition of a “practical major,” anywhere. The meaning of the term floats around depending on the whims of the person using it, and those whims are usually dependent on mockery. The entire concept seems to exist simply to serve as an instrument to blame people for their own economic misfortune. In the years following the financial crisis of 2008 and the recession that followed, it was common to hear people complain that the bad labor market was a reflection of the poor academic choices of 18- and 19-year-olds. Which is lunacy! We had just seen the global economy crushed by the machinations of a few thousand bankers. But the urge to blame people for their own financial misfortune was just that strong.
Some will say that a practical major is one that gives you the best opportunity for secure employment. Setting aside the fact that a life spent in singular pursuit of money is soul-deadening, this strikes me as great advice for people in late adolescence who are in possession of a time machine. For the rest of us, perhaps we should build a society where the educational path chosen early in life is less consequential for lifetime economic security, and where we’re all freer to study what we actually care about.
The wisdom of that approach may have recently grown more obvious, as there’s a chance that the working world will change considerably once again. I’m something of a skeptic about the degree to which machine learning–derived AI systems will change the world. But either way, their rapid development and expansion, and the potential attendant effects on the labor market, should make it clear: Technology can change the economy faster than any person can reasonably be expected to keep up with. Nobody knows for sure which fields might be disrupted by AI, which skills rendered unmarketable. But if the effects are as big as some predict, a lot of people are suddenly going to find their once-practical path has become fraught and unsustainable. The question is, are we callous enough to blame them for it?