Premium wages. Company picnics and events. Subsidized housing. And brand recognition at some of the hottest companies to work for.
I’m not describing tech companies. I’m talking about blue-collar jobs in the 20th century.
The rise and fall of early private welfare capitalism in the United States, viewed through a little economic history, offers key parallels for where the programmers and software engineers of today may be headed tomorrow.
Private Welfare Capitalism
Private welfare capitalism is the development of welfare policies (e.g. health insurance, subsidized food, retirement benefits) within for-profit private business enterprises, generally above and beyond what is legally mandated.
Dr. Chiaki Moriguchi, an economic historian, is known for her work on the emergence of private welfare capitalism in the early 20th century. In particular, Moriguchi has focused her efforts on contrasting the evolution of private welfare capitalism in the United States and Japan, from the Roaring Twenties into the depths of the Great Depression.
The welfare policies of the 1920s were driven by implicit contracts; employees were largely non-unionized, and employers weren’t legally obligated to give lots of benefits.
While some of the motivation for these benefits may have been genuine generosity, Moriguchi’s research makes clear that strong pragmatic forces also sustained these implicit contracts. Health and fitness programs improved workers’ lives overall, but also ensured they stayed fit for factory work. Subsidized cafeteria meals, tenure-based benefits, and higher wages overall (like Ford’s famous “$5 a day” wage floor) encouraged retention. The promise of these benefits also enabled companies to recruit from a surplus of interested workers and hire top talent.
Most importantly, implicit contracts reduced the incentive for workers to unionize, mitigating the threat of explicit collective bargaining that had been simmering since the Progressive Era. After all, through private welfare systems, workers had their loyalty rewarded with increasingly lucrative benefits. In return, employers retained the option to hire and fire as needed.
What happens when good times end?
By the end of the 1920s, and throughout the 1930s, the Great Depression severely impacted American life, including companies that had previously provided generous perks. To stem losses, many programs were cut, thereby repudiating the implicit contracts between employers and employees.
Moriguchi observes that, for the first couple years of the Depression, firms focused on maintaining employment and wage levels. By 1932, however, many companies that had led the charge on private welfare capitalism began major reductions. Companies like Ford, GM, and GE cut 50-80% of their workforce, reduced wages, and cut or suspended most of the benefits that employees had become accustomed to.
As cushy job perks faded, collective bargaining and a push for unionization reached critical mass (such as the 1933 Ford plant strike). Moriguchi’s work, however, is more nuanced than “companies cut all benefits, and clashes between employees and employers became the norm”. In particular, she finds that firms that repudiated fewer commitments during this period were also less likely to shift into adversarial relations.
Although lower repudiation is most notable in Japan, where the effects of the Depression were less severe, US companies like Procter & Gamble and IBM also fell under the “low repudiation” category, since they were able to keep programs and employment mostly intact. Companies like Ford and GM, however, fell into the “high repudiation” category, cutting many welfare programs and failing to provide relief efforts to laid-off employees.
The results of these choices continue to reverberate nearly a century later. IBM’s US union suspended its operations in 2016, falling from a peak of over 70,000 paying members in the 1980s to a few hundred in the 2010s. Meanwhile, Ford and GM regularly contend with high-profile union strikes, with their remaining blue-collar employees leveraging deals negotiated across companies. More broadly, blue-collar manufacturing jobs have suffered a precipitous decline since the 1980s, as adversarial labor relations pushed employers toward mass offshoring and automation. Over the past four decades, these job levels have plummeted further after each recession, despite population growth over time.
The setup for tech to fall into adversarial relations in a major recession or depression is remarkably similar: as perks disappear, and wages and employment are cut, the incentive to unionize grows. Broader developments in employee/employer relationships will shape the short- and long-term effects of these crises.
Over the past couple of decades, tech companies have evolved their own form of private welfare capitalism. Generous perks include 26 weeks of paid parental leave (Salesforce), an on-site medical/vision/dental center (Facebook), and, most famously, free food throughout company campuses.
The current global pandemic, and subsequent recession, threaten to upend these perks (and the implicit contracts behind them). Already, the “work from home” model means no free food at work for employees, with Google having to explicitly tell its employees that groceries aren’t a “business expense”. If perks are cut more permanently, and layoffs continue to roll through, many of these companies will have to contend with long-term adversarial relations that they have avoided so far.
A switch to unionization may not be the win that some programmers perceive, either, as evidenced by the many American blue-collar jobs that have been automated or outsourced to global supply chains. With advances in remote work, domestic programmers are possibly even more exposed to long-term outsourcing, especially as countries like China and India rise as tech powerhouses in their own right. The megatrends of machine learning/AI may ironically automate many of the current programming jobs as well.
How can programmers prepare?
For proactive programmers and savvy software engineers, investing in human skills will help. Computer science knowledge atrophies faster than critical thinking, communication, teamwork, and general problem-solving skills, which are far more in demand than you may think. If moving into a leadership position isn’t to your liking, then make sure that your human skills allow you to be adaptable through rapid workforce changes.
Broken implicit contracts aren’t an inevitability, but they are a possibility to keep in mind. As the current COVID-19 crisis shows, hard times reveal the truth of company leadership: Airbnb CEO Brian Chesky’s masterclass in empathy on one side of the spectrum, and Bird’s mass layoffs via a two-minute Zoom call on the other. Tech unicorns have begun sizable layoffs, including Lyft (17% of staff), Uber (14%), and Airbnb (25%). Tech giants Facebook, Google, and Microsoft have slowed or frozen hiring across departments, and only time will tell if they, too, begin laying off workers.
If good times come to an end in tech, the jobs of today will be saturated, limited, or perhaps entirely automated. With the parallels of quintessential blue-collar jobs as a guide, hope for the best and plan for the worst, upskilling and reskilling accordingly to prepare for what the future holds.