The Decline of American Manufacturing and Why Today’s Leaders Should Care


Structural economic changes led to America's manufacturing decline, but the root cause is not what you think.



Three well-established economic views are commonly credited with explaining the decline of manufacturing in America. All three are wrong. Over the last 50 years, structural changes in the American economy harmed our manufacturing base because it sat outside our innovation ecosystem and was left vulnerable as a result.

In the aftermath of World War II, Japan, Germany, and later on, China made fundamental investments in their industrial bases, specifically through manufacturing. America, on the other hand, was already a world leader in manufacturing and focused on constructing a new system of innovation.

But the system of innovation America built did not include a vision for long-term support and sustained investment in domestic manufacturing. This left America's manufacturing base isolated. After decades of decline, American manufacturing entered its worst decade at the turn of the century. Between 2000 and 2010, capital investment, output, and productivity all decreased, and manufacturing jobs in America plummeted 34%, a loss of more than 5.8 million jobs. Here's the real reason why.


The American System of Innovation

After World War II, America constructed a new system of innovation that would be long-lasting and the first of its kind. The architect of this system was Vannevar Bush, an MIT engineering dean, co-founder of Raytheon, and head of the US Office of Scientific Research and Development. Bush believed that basic research was the cornerstone of new technology and innovation.

As President Roosevelt's and then President Truman's science advisor, he forged a marriage between government funding and basic scientific research. The relationship scaled up during World War II with the development of radar and the atomic bomb, programs Bush oversaw. After these wartime successes, Bush pushed President Truman to institutionalize the relationship and expand federal funding. This became the backbone of the new American system of innovation.

Ultimately, world-leading institutions like the National Science Foundation (NSF), the National Institutes of Health (NIH), the Department of Defense (DOD), and the Department of Energy (DOE) would fund research and carry out Bush's vision.

What Bush constructed was a front-end innovation system. This approach directed federal funding toward basic research at the front of the innovation pipeline. It is a technology-push model that MIT Professor William Bonvillian calls the "pipeline model."





This model is capable of creating breakthrough innovations and has done so repeatedly throughout American history, including semiconductors, high-speed computers, computer graphics, broadband communications, mobile telephony, the internet, and genomics (Pisano & Shih, 2012, p. 18).

Around this system, America developed a robust entrepreneurial ecosystem of technology licensing, venture capital funding, and corporate mergers and acquisitions. As a result, America became the global leader in commercializing scientific discoveries, much to the envy of the rest of the world.

But the build-out of the American system of innovation paid little attention to the role of manufacturing. America dominated manufacturing at the end of World War II and had already built the world's largest and best mass production system. It didn't appear to need much support or stimulus.

Manufacturing was left adjacent to America's new system of innovation because its strength was assumed to be permanent. It was not, and decades later, this would prove to be a meaningful oversight. Leaving manufacturing outside a federal support structure isolated it and left it vulnerable. Over the second half of the century, structural economic changes occurred, and our isolated manufacturing base declined disproportionately as a result.


Asset-Light Corporations

American corporations were diversified and vertically integrated in the first half of the 20th century, and they contributed generously to economic development in America. They funded research and training, diffused new technologies to suppliers, and helped attract government support for investments in infrastructure. But that began to change in the second half of the century (Berger, 2013, p. 19).

In response to Japan's manufacturing quality revolution in the 1970s, financial pressures mounted to make American corporations more competitive. The idea that companies exist to create value for their shareholders became the dominant management philosophy, and increasing shareholder value became the primary driver of corporate strategy.

Public markets demanded that companies focus on core competencies and divest their non-core competencies. Shareholders rewarded corporations that did so, and their market values increased. This resulted in the proliferation of asset-light organizations that had fewer employees, were less diverse, and were less vertically integrated.

Asset optimization led American corporations to transfer manufacturing functions outside the organization and, in many cases, abroad to lower-cost regions. As a result, the number of manufacturing plants in America employing more than 5,000 employees dropped from 192 in 1977 to 49 in 2007. In addition, the number of plants with more than 1,000 workers fell by half over the same period (Berger, 2013, p. 43).


The Industrial Commons

In the following decades, this shift not only pushed manufacturing out of the corporation; it also diminished corporate contributions to everything outside the core business. Harvard Professors Gary Pisano and Willy Shih call that everything else the industrial commons, a concept they lay out in their book, Producing Prosperity.

In the book, the authors liken the industrial commons to the shared pastures of earlier times, when farmers grazed their livestock on a local commons held as a community resource. The commons were no single individual's responsibility, but if they fell into disrepair, everyone suffered.

The "pasture" of a modern industrial commons includes a labor pool, enabling technologies, and partners that support innovation from early-stage research to commercial production. This stands in contrast to a business cluster that aims to promote innovation and growth in one industry. Instead, a robust industrial commons serves many industries and is more innovative and resilient.

In our modern economy, the ingredients of an industrial commons are more complex but still surprisingly localized and geographically bound. In their simplest form, they are an interconnected ecosystem of academia, government, and business.

In America, our industrial commons have eroded significantly over the last 50 years, leaving small and mid-sized manufacturers with few external resources to draw from. As MIT Professor Suzanne Berger explains, "All growth depends on internal resources. They do not find any complementary capabilities they can draw on in the industrial ecosystem" (Berger, 2013, p. 13).


Banking Consolidation

In the 1990s, the American banking industry also went through a period of significant change – specifically consolidation. This occurred due to several fundamental policy changes to bank industry regulations.


  • The Riegle-Neal Interstate Banking and Branching Efficiency Act of 1994 allowed branch banking beyond one state and throughout the United States.
  • The Gramm-Leach-Bliley Act of 1999 allowed banks to enter other financial markets and provide additional financial services.
  • The Glass-Steagall Act's separation of commercial and investment banking, imposed after the Great Depression, was effectively repealed in 1999, allowing banks to conduct both under one roof.

These policies allowed banks to operate across state lines, expand their services, and combine commercial banking with investment banking. This created an attractive environment for mergers and acquisitions.

As a result, independent local banks were acquired by national and international banks. In 1980, there were over 19,000 banks operating in the United States. By 2010, that number had fallen by more than 60%, to roughly 7,000 institutions. These local banking relationships were the lifeblood of small and midsized manufacturing firms.

The majority of the US manufacturing sector consists of 250,000 small and midsized firms, which employ 86% of our manufacturing workforce and produce 46% of private sector non-farm output. These companies relied on local banking relationships as their primary source of capital, and that capital was no longer readily available. As a result, investments in new technologies and capital equipment stagnated, slowly eroding manufacturers' global competitiveness (Bonvillian & Singer, 2018, p. 59).


The American Narrative

As these structural changes occurred, economists coalesced around three arguments to explain why manufacturing was in decline. But, as MIT Professor William Bonvillian explains in his book Advanced Manufacturing, none of these arguments proved to be true.

The first argument is that the decline of American manufacturing was an inevitable symptom of emerging economies entering global markets with cheap labor. Emerging economies did, of course, enter the global market with cheap labor. Yet Germany, where 20% of the workforce is employed in manufacturing and wages are over 60% higher than in the United States, runs a major trade surplus in manufactured goods.

The second argument is that the decline resulted from our economy reaching a point of post-industrial maturity, where services would take over our gross domestic product. America did build a trade surplus in services, which has grown as a share of our GDP. But this surplus has not come close to replacing the manufacturing trade deficit. America's trade deficit in manufactured goods increased from $25 billion in 1980 to $922 billion in 2020, while the surplus in services grew only from $6 billion to $245 billion over the same period.

And the third argument is that the decline was simply the result of productivity gains from technological advancement. But productivity, a topic of much debate, has not driven the decline in manufacturing employment either. On the contrary, productivity stagnated in the manufacturing sector for decades and declined during the Great Recession.

Even though these arguments were wrong, they fed a false narrative that became entrenched in American society. A 50-year-old American has been immersed in a negative narrative about manufacturing for their entire life. It was not only their guidance counselor telling them manufacturing was a dead-end career; for half a century, our culture romanticized leaving rural America and escaping the production floor.




These criticisms were once valid. Our prosperity gains from scaled manufacturing eventually reached a point of diminishing returns, and corporations sought only to improve efficiency and reduce the cost of manufacturing. We treated manufacturing and its people as costs, never as benefits, and our relationship to manufacturing became reductive. But today, this is a false narrative and a relic of the Industrial Revolution.


Why Today's Leaders Should Care

America led the world in manufacturing less than a century ago. Since then, we've excluded manufacturing from our innovation ecosystem, allowed our industrial commons to deteriorate by promoting asset-light corporations, and restricted access to capital. Manufacturing did not decline due to economic evolution or other externalities. Instead, manufacturing declined due to conditions we created in the United States.

Ultimately, the root cause of the decline in American manufacturing is that it was left adjacent to the new American system of innovation after WWII. But perhaps the costliest mistake was convincing ourselves that the decline in manufacturing was natural or even good. Deindustrialization and the loss of manufacturing jobs in America have, in fact, not been a healthy transition into a modern, knowledge-based economy.

The narrative we have accepted fails to celebrate the vital role manufacturing plays in our innovation ecosystem, our job market, and our national defense. Perhaps more importantly, it obscures the challenge we must overcome to bring manufacturing back into favor. This narrative could keep us from capitalizing on our future, where manufacturing will play an increasingly important role in innovation.

Today, most Americans still think of manufacturing as a long process of transforming raw materials into products through fabrication, assembly, and distribution stages. Even our most modern view of manufacturing is of a process performed in faraway low-cost countries. At the same time, we view American companies as innovators who know better than to manufacture themselves. This view is flawed and outdated.

Markets have evolved because of new technologies, and a modern advanced manufacturing system is emerging. Despite decades of decline, America is still well positioned to capitalize on it. But to do so, today's leaders need to understand the modern economy in which we now operate, and focus our manufacturing investments on that future.





Adams, R. (2012). Consolidation and merger activity in the United States banking industry from 2000 through 2010 (Finance and Economics Discussion Series). Federal Reserve Board.

Berger, S., & MIT Task Force on Production in the Innovation Economy. (2013). Making in America: From Innovation to Market. The MIT Press.

Bonvillian, W. B., & Singer, P. L. (2018). Advanced Manufacturing: The New American Innovation Policies. The MIT Press.

Pisano, G. P., & Shih, W. C. (2012). Producing Prosperity: Why America Needs a Manufacturing Renaissance. Harvard Business Review Press.

