It wasn’t always this way. For most of the 20th century, the U.S. ran a modest surplus in both goods and services. The U.S. manufacturing sector was also a much bigger part of our economy than it is today. In 1960, about 25% of all American employees worked in manufacturing; today the share is under 10%. Manufacturing’s share of total U.S. economic output has declined just as sharply over the same period.
Throughout the 19th and into the 20th century, the political consensus was to protect and nurture American manufacturing. It’s a little-known fact that in the 19th century, tariffs on imported goods were one of the federal government’s largest sources of revenue. While academic economists today disparage “protectionism,” the truth is that the U.S. in the late 19th century became the world’s largest economy and its largest manufacturer, and paid the highest average wages of any country in the world. America achieved this incredible success despite taking actions that directly contradict the trade theories taught today by many academic economists.
The problem is not just academic trade theory. Today, manufacturing suffers from a perception problem, especially among the most well-educated Americans. Factory jobs are often seen as dirty, relatively low-paid work — an artifact of the past. I saw this firsthand at the elite universities I attended. The default career paths are law, medicine, banking, tech, and academia. Manufacturing is nowhere on the agenda, save perhaps for a limited segment of students in engineering fields.
In fact, the poor perception of manufacturing has little basis in reality. In the Chicago area, still one of the nation’s great manufacturing centers, average earnings for manufacturing jobs are over $67,000 a year — 16% higher than the average job in the region. The same is true nationwide. Manufacturing jobs pay more because, in many cases, it takes years of training and expertise to do the job well. You can’t afford to lose an employee who is critical to the process.
Mondelez recently announced it was closing its Oreo cookie plant on Chicago’s economically depressed South Side and moving the jobs to Mexico. At the plant, many factory employees made $25 an hour with good benefits. These are employees who, for the most part, do not have a post-secondary education. Many of them will be lucky to find a job which pays half as much.
So-called “free trade” deals shoulder a substantial share of the blame for the decline in U.S. manufacturing. For example, research shows that trade with China has cost the U.S. three million manufacturing jobs — more than the entire population of Chicago. NAFTA has also cost the U.S. over 600,000 manufacturing jobs. Those were, by and large, good jobs which supported families. The result is plain to see: many American cities, small and large, look like bombed-out war zones, with empty factories, vacant homes, and crumbling infrastructure. Detroit is a classic example — going from one of the highest-income regions of the U.S. in the 1950s to a shadow of its former self, having lost over half its population. It’s very difficult to conclude this is an acceptable outcome of “free” trade.
As a graduate student at the London School of Economics in the late 1990s, I was thoroughly exposed to the “consensus” Anglo-American view on free trade. For the most part, the view is that trade is good even if one side manipulates its currency or otherwise tries to put a thumb on the scale. This is the American consensus on trade, but it’s hard to believe that elites in Germany, China, Japan, and South Korea — countries with strong manufacturing sectors and big trade surpluses — share this “consensus” view. Their actions suggest they don’t put much stock in our current academic trade theories. And perhaps they are right. There’s an old saying: “If you’re at a poker game and you don’t know who the sucker is, the sucker is you.”