Behind Biden’s policies is a sea change in the way economists think


WASHINGTON, DC, August 16, 2022: US President Joe Biden signs H.R. 5376, the Inflation Reduction Act of 2022 (the climate change and health care bill), in the State Dining Room of the White House on Tuesday, August 16, 2022. (Photo by Demetrius Freeman/The Washington Post via Getty Images)

This article is adapted from The Middle Out: The Rise of Progressive Economics and a Return to Shared Prosperity, published this month by Doubleday.

When the Inflation Reduction Act was signed into law last month, even many longtime observers were taken by surprise. Since the start of the Biden administration, it had been apparent that no meaningful legislation could pass the Senate without the support of West Virginia Sen. Joe Manchin, who has historically shown little appetite for the kind of climate provisions that form the heart of the bill. But Larry Summers calmed his inflation-related nervousness, the bill contained some treats for West Virginia, and Manchin came around.

But perhaps equally surprising was the ambition of the bill’s economic provisions, which are aimed primarily at making life easier for the middle and lower classes. It promises to increase union jobs in the manufacture of electric vehicles and other clean energy technologies; it promises to enforce prevailing wages; it promises a fairer tax code and a crackdown on wealthy tax cheats; and it promises to lower health care costs for millions of Americans.

One would have to go back at least to the 1960s to find a single piece of economic legislation so radical and so explicitly pro-worker. But while it may look like a new political development, Biden’s economic policies reflect a sea change in economic thinking that has been building for decades. To put it bluntly, the often murky world of economics has finally caught up with reality and is now focused on inequality in ways that would have been inconceivable a few years ago. And these changes in the economics profession are setting the stage for economic policymaking in Washington.

Consider that in 1993, economists David Card and Alan B. Krueger published a pioneering study of the minimum wage. Two years later, they published a book expanding on the journal article, titled Myth and Measurement: The New Economics of the Minimum Wage. They held an event at the Brookings Institution, the famous Washington think tank, laying out their basic argument: they had found no evidence to support the idea that a higher minimum wage leads to job losses. An economist in the audience raised his hand to object to all this talk of evidence, saying, “Theory is also evidence.”

Card and Krueger’s work presented a challenge to their field precisely because it was based on evidence and data. They looked at what actually happened in low-wage workplaces on both sides of the New Jersey–Pennsylvania border when New Jersey raised its minimum wage, and they found that there was in fact no resulting reduction in low-wage employment in New Jersey compared with Pennsylvania. In the jargon of economics, what Card and Krueger had conducted was a “natural experiment.” Their method was a challenge to the way most economics was done at the time, when most research was based on theoretical models rather than real-world evidence. And their conclusion went against mainstream theory, which had held for a century or more that when a price (here, a wage) goes up, demand (for workers) goes down.
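For readers who want to see the mechanics, here is a minimal sketch of the difference-in-differences logic such a natural experiment relies on. The employment figures are invented for illustration – they are not Card and Krueger’s numbers – and the state names simply mirror their comparison.

```python
# A minimal difference-in-differences sketch of the natural-experiment logic
# described above. The employment figures are made up for illustration; they
# are NOT Card and Krueger's data.

# Hypothetical average employment per low-wage workplace, before and after
# New Jersey's minimum-wage increase (Pennsylvania made no change).
nj_before, nj_after = 20.0, 20.5
pa_before, pa_after = 23.0, 21.5

# Each state's change over the same period.
nj_change = nj_after - nj_before
pa_change = pa_after - pa_before

# The difference-in-differences estimate: how employment changed in the state
# that raised its wage, relative to the comparison state next door.
did_estimate = nj_change - pa_change
print(f"NJ change: {nj_change:+.1f}, PA change: {pa_change:+.1f}, "
      f"DiD estimate: {did_estimate:+.1f}")
```

The comparison state acts as a baseline for whatever else was happening in the regional economy, which is what lets the difference of the two differences stand in for the causal effect of the wage increase.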

So it was very controversial, and many people, especially Republican politicians, still don’t accept it. If you’re an ordinary human being, you might think it obvious that real-world evidence and data should be at the heart of any kind of analysis, whether in the hard sciences or the social sciences. But economics, as Paul Krugman wrote in a 2009 essay, had become in the latter part of the 20th century more reliant on theories and models than on evidence. The models appealed to many people, Krugman wrote, because they were elegant and built on increasingly complex mathematics. They earned their creators Nobel Prizes. And they were reassuring because they rested on the long-standing neoclassical assumption that actors behave rationally – that people didn’t act irrationally, didn’t make bad decisions, and so on – and that the system was therefore protected against undue risk.

At the time, there was some justification for this reliance on theoretical models, explained Jesse Rothstein of UC Berkeley, home to one of the leading economics departments in the United States and one that challenged traditional thinking. “Before the ’90s, generally, if the theory conflicted with the empirical data, you ignored the empirical data and focused on what the theory was saying,” Rothstein told me. “And it was probably the right thing to do, because the empirical data wasn’t very good. We didn’t have a lot of data. We didn’t have very good methods for sorting out all the different causal factors. And so you were probably more right to do that than not.”

But in the 1990s, something changed, Rothstein said, something that has only grown since: “We have better data. We have better computers. We have better empirical methods. Economics kind of became known as a field that took causal inference really seriously.”

Author Michael Tomasky (courtesy Penguin Random House)

This idea of causality is essential, as Heather Boushey wrote in a 2019 essay in the journal Democracy explaining to lay readers the radical changes that had taken place in economics. The techniques pioneered by Card and Krueger and quickly adopted by others “allowed economists to estimate causality, that is, to show that one thing caused another, rather than simply to be able to tell that two things seem to go together or move in tandem.” Causality meant that, on the basis of all this newly available data, economists could seek explanations for problems such as inequality, wage stagnation, or global poverty in a way that was not possible in earlier eras. This was driven mainly by access to new data, and it was a profound change.

As Boushey wrote, “While in the 1960s about half of the articles in the three major economics journals were theoretical, about four out of five now rely on empirical analysis – and of these more than a third use the researcher’s own data, while almost one in ten are based on an experiment.”

The most striking example of work in this new empirical vein that has had enormous real-world impact is that of Thomas Piketty and his sometime collaborators Emmanuel Saez and Gabriel Zucman, along with the pioneering inequality researcher Tony Atkinson. Piketty’s most famous work, the book Capital in the Twenty-First Century, sold millions of copies worldwide and was made into a movie. He argued that the return to capital (profits, dividends, rents, and other income from capital) is greater than the growth rate of national income (total economic output), which over time has the effect of concentrating wealth enormously in the top 1%, the top 0.1%, and even the top 0.01%. His conclusions were driven by mountains of US income tax data spanning decades, data that showed how the rich pulled away from everyone else and how the super-rich pulled away from even the rich.
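A toy calculation makes the mechanism concrete. The rates and starting values below are assumptions chosen only to illustrate the arithmetic of a return on capital that exceeds growth; they are not Piketty’s estimates or data.

```python
# A toy illustration (not Piketty's model or data) of why r > g concentrates
# wealth: capital compounds at rate r while national income grows at rate g,
# so wealth held largely as capital outpaces the economy as a whole.

r = 0.05   # assumed annual return on capital
g = 0.02   # assumed annual growth rate of national income

top_wealth = 30.0        # hypothetical wealth of a top group (arbitrary units)
national_income = 100.0  # hypothetical national income (same units)

for year in (10, 30, 50):
    w = top_wealth * (1 + r) ** year
    y = national_income * (1 + g) ** year
    print(f"After {year} years: top wealth / national income = {w / y:.2f} "
          f"(started at {top_wealth / national_income:.2f})")
```

With those assumed rates, the ratio of top wealth to national income roughly quadruples over fifty years, which is the compounding dynamic Piketty argues drives concentration at the very top.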

The impact of the book is hard to overstate. It moved economic inequality to the white-hot center of economic debates and political conversation. Writing together, Piketty, Saez, and Zucman examined tax and other data going back to 1913 to compare pre-tax and post-tax growth rates for different segments of the U.S. population (the pre-tax/after-tax distinction is important because it tells us whether government policies on income taxes and other matters help shift wealth in one direction or the other). They found that since 1980, eight percentage points of national income have shifted from the bottom 50% of the population to the top 1%. They also found that “government redistribution offset only a small fraction of the increase in pre-tax inequality.” In other words, tax policy has not kept pace with the movement of wealth toward the rich.

Another highly publicized example of the impact of data analysis comes from the field of development economics, the study of global poverty. Here, Esther Duflo, Abhijit Banerjee, and Michael Kremer are among the best-known practitioners. They introduced randomized controlled trials (RCTs) into the study of various aspects of global poverty – essentially, a way to randomly assign people or entire villages to a “treatment group” or a “comparison group” in order to determine the impact of particular interventions. The practice has earned these development economists the nickname “randomistas.” The trio won the Nobel Prize in 2019 for their work. Another development economist wrote of the announcement: “Over the past fifteen years, the work of Abhijit, Esther and Michael has truly revolutionized the field of development economics by changing our view of what we know – and what we can know – about when and why some policy interventions work and others do not.” RCTs have come in for heavy criticism: that their results cannot be generalized, and that by focusing so intensely on small questions, their adherents ignore the big, important ones. But RCTs have helped development efforts determine how best to improve outcomes in education or health care, for example, in many poor parts of the world.
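The core logic of an RCT can likewise be sketched in a few lines. The villages, outcomes, and five-point effect below are simulated assumptions, not data from any actual trial; the point is that random assignment is what lets a simple difference in averages be read as a causal effect.

```python
# A minimal sketch of the randomized-controlled-trial logic described above:
# randomly assign units (here, hypothetical villages) to a treatment or
# comparison group, then compare average outcomes. All data are simulated.

import random

random.seed(0)
villages = list(range(200))

# Random assignment: each village has an equal chance of receiving the intervention.
treatment = set(random.sample(villages, k=100))

def outcome(village_id: int) -> float:
    """Simulated outcome (say, a test score): a noisy baseline plus a
    +5-point effect for treated villages. Purely illustrative numbers."""
    base = 50 + random.gauss(0, 10)
    return base + (5 if village_id in treatment else 0)

scores = {v: outcome(v) for v in villages}
treated = [scores[v] for v in villages if v in treatment]
control = [scores[v] for v in villages if v not in treatment]

# Because assignment was random, the two groups are comparable on average,
# so the difference in means estimates the causal effect of the intervention.
effect = sum(treated) / len(treated) - sum(control) / len(control)
print(f"Estimated treatment effect: {effect:.1f} points (true simulated effect: 5)")
```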

The IRA will not, of course, end inequality on its own or any time soon. But the fact that it passed makes it very likely that more bills reflecting this new economic thinking will follow.

