Whither modern economics? Subjective semi-sociological observations
Rostislav Kapeliushnikov‡,§
‡ Primakov National Research Institute of World Economy and International Relations, Russian Academy of Sciences, Moscow, Russia
§ National Research University Higher School of Economics, Moscow, Russia

Abstract

The paper summarizes the main recent sociological, epistemological, methodological and ideological trends in modern economics and tries to evaluate its current state and future prospects. Special attention is paid to a change in economists’ methodological ideal: economic science began by trying to become like physics but has actually come to resemble medical statistics. The paper’s general conclusion is that what we are witnessing today in modern economics is simply an ordinary working state rather than a triumph or a crisis. However, that state is not very promising, since the period of new large theoretical ideas seems to be over for economics, its atheoretical tendency is growing stronger, and in the very near future economics is most likely to become more and more interventionist.

Keywords

econometrics, epistemology, experiments, behavioral economics, methodology, ideology.

JEL classification: A1, B22, B23, B4, B5, C9.

1. Introduction

The Great Recession of 2008–2009 triggered an endless flood of publications, both in the mass media and in academic journals, on the deplorable state of modern economics. Its models are far from reality; it is over-mathematized and blind to the most urgent problems now confronting mankind; it suffered a devastating fiasco in failing to predict the global economic crisis; its recipes are mostly counterproductive, only paving the way for even greater disasters; it is divided into several competing schools that are unable to reach agreement even on the most fundamental concepts, etc. Such loud invectives have been voiced not only by politicians, journalists, pundits and the general public, but also by many professional economists. However, after the global economy successfully avoided turning the Great Recession into a severe protracted depression, public discourse changed noticeably and other voices emerged. Economics was credited with the fact that events did not follow the worst scenario: economists had learned well the lessons of the Great Depression of the 1930s, and this enabled them to offer governments policy measures that successfully averted the threat of an overall economic collapse.

A natural question arises: is it a deep crisis, as some think, or a triumph, as others believe? I must admit that I do not closely follow the specialized literature that analyzes the recent evolution of economic science and evaluates its current state. All I can offer is to share my subjective observations on what seem to be the most significant and noteworthy developments that have occurred within it in recent decades. Undoubtedly, such observations have, by definition, to be subjective, fragmented and selective. It is also evident that any scholar can cover only a tiny part of the voluminous current economic literature, so the question always remains open as to what extent an evaluation of the state of affairs in a particular branch of a theory can be extrapolated to that theory as a whole. This is why I would like to warn the reader: the following notes are not an academic study with all of its prescribed attributes, but merely a kind of “traveler’s impressions” that do not pretend to be comprehensive, objective or systematic. I decided to call them “semi-sociological” since, in discussing the state of modern economics, I will try to proceed from some obvious but crucial characteristics of its functioning as a certain institution or as a certain social phenomenon.

2. A bit of sociology

Perhaps the most fundamental social fact about modern economics is that an immense number of scholars are today involved in the “industry” of economic research. Some estimates suggest that the number of economists is second only to the number of psychologists and far exceeds the number of persons engaged in any other social discipline. The “massification” of the economic profession has several important implications.

First, the role and significance of formal criteria and procedures increase sharply under such conditions. This seems absolutely inevitable when one deals with a vast anonymous mass of potential authors and ever-increasing competition between them. We are witnessing an unstoppable advance of “formalism,” growing more aggressive every year, at all stages of the education, research and publication processes (up to stringent unified requirements imposed on the composition of academic articles). Various types of indices and ratings are now calculated for universities, journals, individual researchers and even university graduates (Fourcade et al., 2015); economic research funding is allocated mainly on this basis. It is far from clear, however, that this expansive formalization of everything is neutral with respect to the progress of scientific knowledge. For example, I find it hard to imagine papers as out-of-format as Ronald Coase’s articles being published anywhere today. Even if they did show up in some third-rate journal, I am sure no one would notice them and Coase’s ideas would be lost to economic theory.

Second, the “overpopulation” of the economic profession is changing (and has changed) the key social role of academic journals. Whereas they were formerly the means of channeling scientific information, they now serve, in fact, as a certification filter for research products. Nowadays, seven, eight or even more years may pass from the time a paper is written to its publication in a journal. During that period, the author has time to present it at several conferences and to publish it several times as a working paper. As a result, when it finally appears in a journal, its basic concepts and findings may have long been well known to everyone working in the same field. The final publication simply indicates its successful certification. This is important because today there is a huge gap, if not a chasm, between “certified” and “uncertified” papers. One could go so far as to say that publication in leading journals primarily certifies whether a piece of research belongs to the economic mainstream.

Third, thanks to its massification, economics now finds itself in a situation that nominally might look like the Hundred Flowers. Every tiny field of research and every unorthodox school establishes its own association and journal, and sometimes several associations and journals. There are now journals for schools of thought such as econophysics, bioeconomics, socioeconomics, evolutionary economics, the Austrian school, old institutionalism, Post-Keynesianism, the public choice school, Marxism, Neo-Marxism, radical political economy, feminist economics etc. Unfortunately, when examined more closely, the situation of the Hundred Flowers turns out to be an illusion, as there is essentially no actual dialogue between the mainstream and the heterodoxy, and it seems that blame can be laid on both sides. The mainstream simply ignores what goes on in the unorthodox schools since, for a mainstream economist, devoting his or her attention to them would be a waste of time that could only diminish his or her publication output. As for the advocates of the unorthodox approaches, it is true that they have to respond to new developments in the mainstream, if for no other reason than to criticize them. However, a sectarian spirit is quite prevalent within the heterodoxy and, I would even say, is cultivated there. There are many examples of how the most broad-minded mainstream economists have tried to start a dialogue with the unorthodox and of how those attempts ended. In most cases, they evoked overly aggressive reactions from non-mainstream economists. As a result, unorthodox theories are now doomed to exist in isolation and “stew in their own juice,” becoming a kind of intellectual ghetto.

Here, however, I would like to point to a phenomenon which has never been discussed in the academic literature and which, to a certain extent (though an insignificant one, but still!), offsets the trends mentioned above. I refer to the explosive development of the economic blogosphere, which has flourished in recent decades. Some of the more open and active economists have been creating their own websites, where they speak about their recent studies, comment on papers by others, share their views on economic policy issues etc. What does this imply?

First of all, professional economists and the general public are engaging in an actual live dialogue, which undoubtedly has an enlightening effect, because this dialogue translates scientific concepts from the over-formalized lingo used by modern economists into ordinary “human” language. The general public also gains the opportunity to look into the professional economist’s laboratory and to get acquainted with new ideas at their developmental stage. Indeed, new ideas are rarely born already formalized. Very often they come up during informal discussions between economists in a cafeteria, on a walk, at the gym etc. (For example, the idea for one of the famous joint papers by Paul Samuelson and Franco Modigliani, 1966, first emerged on a tennis court.)

Secondly, when comments are exchanged on the Internet between mainstream and unorthodox scholars, they too become engaged in a meaningful real-time dialogue. It should be noted that mainstream views are not necessarily the most prevalent in the blogosphere: among the most popular economic websites, those created and maintained by heterodox economists account for a disproportionately large share. In addition, since discussions in the blogosphere use minimal maths, for quite obvious reasons, we find ourselves in a kind of time machine, traveling all the way back to the era which preceded today’s pervasive formalization of economic analysis.

Of course, this does not mean that the blogosphere has already had any significant impact on academic economics. (The situation is different in some sister disciplines: social psychology, for example, is undergoing a serious replication crisis and a deep internal reconstruction driven by processes in the blogosphere.) The only example that comes to my mind is the work of Scott Sumner, leader of the “market monetarism” school. A decade and a half ago, he started his own blog1 for the purpose of disseminating a single idea. His essential point is that inflation is a bad target for monetary policy and that focusing on it may generate additional economic fluctuations. In his opinion, an alternative indicator, nominal GDP, would be vastly more effective and reliable as a target for monetary policy. I do not know whether Sumner’s efforts have had any impact, but the fact remains that, recently, even leading figures in monetary theory, such as Ben Bernanke and Janet Yellen, have expressed interest in the idea of targeting nominal GDP.

3. “Econometric idolatry”

It would probably be wrong to restrict the discussion exclusively to the sociological characteristics of modern economics, ignoring some of its specific epistemological features. One of the distinctive marks of its current state seems to be a general attitude that I would dub “econometric idolatry.” It implies that, for the “typical” modern economist, econometric estimates are the higher reality, bearing the status of ultimate truth. They are the “trump card” that beats all other considerations, be they general theoretical principles, intuition, practical experience, common-sense arguments or anything else. The following are only a few of its most visible manifestations.

In whatever field of research, if econometric estimates contradict general theory, modern economists, first, experience no intellectual discomfort in this regard and, second, unconditionally favor the econometric estimates, assuming that general theoretical principles are a convention with no direct relationship to reality. A striking illustration is provided by many new studies (not all of them, of course) on the minimum wage. A great number of econometric estimates have appeared in recent decades showing that a rise in the minimum wage either has no effect or even has a positive effect on the employment of low-skilled workers. If I am not mistaken, this is the only exception to the law of demand actively and thoroughly discussed by modern economists. Apart from low-skilled labor, I know of no other product, service or production factor with an extensive empirical literature devoted to proving that demand for it would not decline, and might even rise, if its price increased. The fact that this contradicts the basic concepts of economic theory is most often simply neglected. Most modern economists (not all of them, of course) do not pay much attention to such discrepancies: they are little worried about whether or not econometric estimates fit into any theoretical framework. If econometrics tells us a certain story, then it must be true. If econometric estimates run contrary to a theory, so much the worse for the theory.2

Econometric idolatry also results in the “typical” modern economist feeling no need for an internally consistent, comprehensive worldview. He perceives reality as a quilt, where each section of economic analysis forms its own special worldview. I refer again to some papers on the minimum wage (Caplan, 2013). As noted above, many of them infer that a rise in the minimum wage either fails to affect the employment of low-skilled workers or has a positive effect on it. In other words, the elasticity of demand for this kind of labor is close to zero. This means that the demand curve for low-skilled workers is either a vertical line or may even slope slightly upward.

At the same time, most studies on immigration demonstrate that an active inflow of low-skilled immigrant workers into the country’s labor market has almost no impact on wages of native low-skilled workers (Pekkala Kerr and Kerr, 2011). This means that the demand for low-skilled labor is highly elastic: in the extreme case, its curve may be a nearly horizontal line.

The “typical” modern economist feels no apparent discomfort with the fact that econometric estimates show one thing in one subfield and quite the opposite in another, and sees no problem in trusting both. Some commentators suggest a politico-psychological explanation for this willingness to combine incompatibles. They interpret it as a manifestation of the ideological preferences of economists with progressive political views, since it is left-wing intellectuals who are inclined to support both a higher minimum wage and fewer restrictions on immigration. It is quite possible, however, that it has less to do with ideology than with epistemology.

The typical modern economist lives in a fragmented, “balkanized” reality, where each fragment exists largely apart from the others. If studies on minimum wage demonstrate that the demand for low-skilled labor is inelastic, then it must be so; if studies on immigration show that demand for low-skilled labor is highly elastic, then it must also be the case. Each field of research has its own econometric estimates and its own worldview. Given no need for an integrated picture of the economic universe, this hardly comes as a surprise.

Still another manifestation of econometric idolatry is that, whenever estimates have been obtained both with simpler methods and with more sophisticated, advanced, recent ones, the typical modern economist will always choose the latter over the former. Here is an example (Das and Polachek, 2017). For the United States, estimates of returns to education, i.e. the percentage change in wages associated with one additional year of schooling, obtained using simple OLS fall within the range of 5% to 15%. At the same time, the same estimates obtained using instrumental variables vary from 4% to 94%. I suspect that, when asked which set of estimates is more realistic and trustworthy, an economist and a non-economist would give different answers. In any case, a paper limited to OLS estimates will not be published under any circumstances today, whereas a paper using instrumental techniques has a good chance of publication, especially if it offers a new instrument never used before.
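
To make the contrast concrete, here is a minimal sketch (in Python, on simulated data; all numbers and variable names are hypothetical, not drawn from the studies cited) of why OLS and IV estimates of returns to schooling can diverge: unobserved “ability” raises both schooling and wages, biasing OLS upward, while an instrument that shifts schooling but affects wages only through it recovers the causal return, at the price of much greater sampling variance.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

ability = rng.normal(size=n)          # unobserved confounder
z = rng.normal(size=n)                # instrument (say, distance to college)
schooling = 12 + 0.5 * z + ability + rng.normal(size=n)
log_wage = 1.0 + 0.08 * schooling + 0.10 * ability + rng.normal(scale=0.5, size=n)
# true causal return to one extra year of schooling: 8%

def slope(x, y):
    """Slope of a bivariate OLS regression of y on x."""
    c = np.cov(x, y)
    return c[0, 1] / c[0, 0]

beta_ols = slope(schooling, log_wage)                             # absorbs the ability premium
beta_iv = np.cov(z, log_wage)[0, 1] / np.cov(z, schooling)[0, 1]  # Wald/IV estimator

print(f"OLS: {beta_ols:.3f}")   # ~0.12: biased upward by ability
print(f"IV:  {beta_iv:.3f}")    # ~0.08: causal, but far noisier in small samples
```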

4. Science without theory

One of the most important recent trends has been the appearance of numerous experimental and quasi-experimental studies and the sharp rise in their scientific status. Such purely factual, atheoretical analysis is concerned essentially with a single question: whether some A is the cause of some B, irrespective of whether the result obtained fits into any conceptual framework and whether it is amenable to any theoretical interpretation (De Vroey and Pensieroso, 2016). The soaring popularity of experimental and quasi-experimental methods has been observed in development economics, macroeconomics, financial economics, the economics of education, health economics and labor economics;3 behavioral economics relied upon them from its very outset.4 It is such studies that now set the standards of scientific rigor and are regarded as the cutting edge of modern economic analysis. Researchers developing and using experimental and quasi-experimental methods make up the highest caste in today’s economic profession because, in terms of design, their studies are closest to how studies are structured and organized in the natural sciences.

Historically, economics suffered from a kind of inferiority complex with regard to the natural sciences, because it was thought incapable of conducting experiments (Kapeliushnikov, 2015). Only in recent decades, when experimental and quasi-experimental methods began to find their way into research practice, was economics finally liberated from this old psychological complex, obtaining the long-awaited status of an experimental (i.e. “real”) science. Experiments in empirical economic research brought about what Angrist and Pischke (2010) dubbed a “credibility revolution”: the new methods produced quantitative estimates of the highest quality, leaving far behind everything that traditional econometric analysis could achieve. Now economists could reliably identify the presence of causal — not just correlational — relationships and precisely measure the impact some observed phenomena have on others.

The general idea behind the experimental approach is quite simple. As the subject of analysis, situations are selected or constructed in which the researcher (in laboratory or field experiments), nature (in natural experiments) or the government (in social experiments) exerts a certain impact (A) which affects one part of the population under review (the experimental, or treatment, group) but does not affect another (the control group). If individuals are assigned to the experimental or the control group in a purely random manner (e.g. by flipping a coin), this provides an effective solution to the endogeneity problem, which has been, and still is, the biggest stumbling block for traditional econometric analysis.5 We then only need to measure changes in some characteristic (B) for both the affected and the unaffected group. If it changes more in the experimental group than in the control group, A (the impact) has caused B (the change in the characteristic). Thus, a correct design for an experimental study provides an unambiguous answer to the question of whether A is the cause of B, and no theory, in its traditional sense, is needed in this case.
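
The mechanics are indeed simple enough to fit in a few lines. Below is a minimal sketch (Python, with made-up numbers) of the whole procedure just described: random assignment of the impact A, a true treatment effect on characteristic B, and the difference in group means that recovers it together with a conventional standard error.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Random assignment of impact A, e.g. by flipping a coin
treated = rng.integers(0, 2, size=n).astype(bool)

# Characteristic B: a baseline plus a true effect of 2.0 for the treated
true_effect = 2.0
b = 5.0 + true_effect * treated + rng.normal(size=n)

# The difference in means identifies the causal effect of A on B
ate = b[treated].mean() - b[~treated].mean()
se = np.sqrt(b[treated].var(ddof=1) / treated.sum()
             + b[~treated].var(ddof=1) / (~treated).sum())

print(f"estimated effect of A on B: {ate:.2f} +/- {1.96 * se:.2f}")  # ~2.00
```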

As an illustration, we can refer to a study with a quasi-experimental design which has gained wide recognition and acclaim (Almond and Mazumder, 2011). The data used in the study included information on differences in the health of individuals born in different years and covered the Muslim populations of Uganda, Iraq and the state of Michigan (United States). The starting point for the analysis was the fact that, during the holy Muslim month of Ramadan, which lasts 29 to 30 days, worshippers are forbidden to take food from dawn until sunset. However, the exact start and end dates of Ramadan are not fixed, as they are determined by the lunar calendar. As a result, in some years Ramadan falls on months with longer daylight hours and, in others, on months with shorter daylight hours. This situation is close to the conditions of a randomized experiment, since pregnant women are distributed randomly across years with different lengths of daylight hours during Ramadan. The study found that women whose pregnancies occurred when Ramadan fell on months with longer daylight hours gave birth to less healthy babies than those whose pregnancies occurred when Ramadan fell on months with shorter daylight hours. (An especially strong negative effect was observed for the earlier stages of pregnancy.) In other words, longer breaks in the nutrition of expectant mothers resulted in poorer health for their children. The quasi-experimental nature of the data allowed the researchers to identify a direct cause-and-effect relationship between these two phenomena with a high degree of certainty.

We can refer to another example, this time from the field of development economics. In developing countries, school teachers are known to often neglect their responsibilities, either failing to appear at school at all or staying there only perfunctorily for the required hours without actually teaching any classes. In their experiment, which has come to be considered a classic, Duflo and Hanna (2005) demonstrated how poor incentives and inefficient monitoring can cause teacher absenteeism. The experiment was conducted in small villages scattered across a mountainous area of the Udaipur district in the state of Rajasthan (India), where a single teacher usually teaches classes for all grades. In that part of India, the level of teacher absenteeism (the proportion of school days on which teachers failed to attend at all) was estimated at 44%. A total of 120 villages were sampled, of which 60 were randomly assigned to the treatment group and 60 to the control group. Monthly wages for teachers varied between 500 and 1300 rupees in the first group and were a flat 1000 rupees in the second. Under the conditions of the experiment, teachers in the first group had to document, with date-stamped photographs, the times they arrived at and left the school each day. For every day on which they spent at least 5 hours at school, they were paid a bonus of 50 rupees (around USD 1 at the official exchange rate). This scheme, combining pecuniary incentives with effective monitoring, led to a reduction in teacher absenteeism in the treatment group by almost half, to 22%. Thus, thanks to the experimental design, the researchers managed not only to reliably identify the causal mechanisms involved, but also to precisely evaluate their impact. The experiment demonstrated how teacher absenteeism can be successfully mitigated in developing countries.6

How is analysis of this kind related to economics? Strictly speaking, it is not. It could have been conducted with no less efficacy by a demographer, a physiologist, a nutritionist or a medical statistician with no idea about economics whatsoever, but with a good command of the respective statistical techniques. How is it related to economic theorizing? Again, strictly speaking, it is not, as neither the initial hypothesis nor the interpretation of the results requires any theory.

One could say that, in the first example, a theoretical foundation is provided by another discipline, namely physiology, with economic analysis becoming its “sidekick.” (Similarly, in the renowned series of experimental studies on the impact of class size on pupils’ achievements, it becomes the “sidekick” of educational psychology.) In this regard, behavioral economics provides a purer case, since for each experiment and for each behavioral anomaly it tends to construct a separate formal model (in fact, a separate “theory”). The problem with such an approach is that, for each empirical case, it is possible to construct a dozen formal models and to describe it in terms of a dozen “theories.” De facto, the research practice of behavioral economics implies a negation of the hypothetico-deductive understanding of the nature of scientific knowledge which prevailed in the philosophy of science after Karl Popper. (In the simplest terms: some general theoretical proposition is put forward that is not subject to direct empirical testing; next, some lower-level proposition is derived (deduced) from it, which can be tested empirically; if test results are consistent with that empirical proposition, it is accepted, along with the general theoretical proposition from which it was deduced.) Behavioral economics is essentially returning to the more primitive, pre-Popperian inductivist philosophy of science, where the filter for selecting between hypotheses is substantially less tight, because under the hypothetico-deductive approach an empirical proposition is “confirmed” not only by “facts” but also by its consistency with higher-level theoretical propositions and with other empirical predictions derived from the same theory. (I do not advance here the strong thesis that economics actually was a hypothetico-deductive science; suffice it to say that it tried to be one and that most economists agreed on that.)

Under the experimental approach, as the examples above show, it is not the choice of problems that directs the choice of method but vice versa: the choice of method begins to direct the choice of problems. The main goal of the researcher is to find natural, or to create artificial, situations that reproduce, in more or less detail, the conditions of a randomized experiment. The search for quasi-experimental cases is becoming the leading motivation for research activity. However, since the number of such cases is limited, the economic profession becomes stratified according to whether the “hunt” for them has been successful or not. The banality, or even the plain meaninglessness, of a topic is no longer a drawback. There is no room left for theory under these conditions; if preserved at all, it remains only as a relic. In this situation, it would be natural to expect that, with every year, the number of experimental studies will grow ever faster and their attraction for new generations of economists will become ever greater. However, the total domination of the experimental approach would mean the death of economic theory in the traditional sense (although, of course, not necessarily the death of economics itself).

We should add that the randomized experimental approach with treatment and control groups is not used in the natural sciences such as physics or chemistry. It found its way into economic studies from medical statistics, where it is used to test new drugs and treatment methods. One could say that the price economic science paid for obtaining the desired status of an experimental science was giving up the methodological ideal towards which it had originally gravitated. Historically, economists’ dreams of turning their discipline into a “real” science were associated with physics which, beginning at least from the second half of the 19th century, they perceived as the supreme standard of precision, rigor and scientific quality. Today, we seem to be attending the funeral of the old methodological ideal: economic science began by trying to become like physics and ended up resembling medical statistics.

5. The decline of theoretical innovations

Now I will state the thesis of which I am least sure. It is my impression that the era of new large theoretical ideas has passed for economics. I do not mean the elaboration of ever more sophisticated econometric methods (research activity is exceptionally strong in this field), or the building of new, more sophisticated and technically complicated formal models (which are also in abundance today), or the production of umpteen empirical studies on important and interesting applied topics (such as the dynamics of economic inequality, the polarization of the job structure, the significance of cognitive and non-cognitive skills for individual productivity, the impact of robotics on employment, the comparative contribution of geography, institutions and culture to economic development and many others). Although impressive progress can be observed in all of the above fields, no groundbreaking theoretical innovations can be seen behind it.

I came to this (arguably mistaken) conclusion when I began to analyze the reference lists of the articles I came across in journals on labor economics. The papers on econometric techniques included in such reference lists are usually the latest publications, relating to the most recent period. The same holds for papers on the empirical analysis of various concrete problems. On the other hand, the papers that form the theoretical framework of a study nearly always date back to the early 1990s or earlier. Indeed, during the three decades from 1960 to 1990, a real breakthrough occurred in labor economics, giving birth to human capital theory, theories of discrimination, internal labor market theory, signaling theory, search theory, matching theory, efficiency wage theory, the idea of deferred compensation, tournament models etc. However, the stream of new large theoretical ideas seems to have started diminishing from the first half of the 1990s.

Of course, I am not ready to assert that the same situation developed in other areas of economic research. The knowledge I do possess, however, suggests a more or less similar picture.

Behavioral economics? Its basic ideas were formulated during the 1970s and 1980s and its subsequent development consisted mostly of the mechanical accumulation of new cases of cognitive or behavioral anomalies and the construction of formal models for them.

Macroeconomics? Its Sturm und Drang period also occurred in the 1970s and 1980s. Within macroeconomics, only the analysis of the ZLB (zero lower bound) problem could arguably be claimed as a serious innovation of recent years. Modern macroeconomists came to the conclusion that, if the natural interest rate drops far below zero while the nominal rate remains significantly higher (since it cannot fall far below zero), this may result in secular stagnation. However, the ZLB is more a new important and interesting analytical problem than a principally new theoretical idea. Smith and Ricardo had already considered some theoretical aspects of such a situation, though they did not believe it possible in real life. Moreover, the very use of the concept of the natural interest rate in modern macroeconomics is nothing more than a return to the Wicksellian theoretical scheme rejected after Keynes’s General Theory. Additionally, this return did not begin today: Milton Friedman seems to have been the first to speak of its necessity, in his famous presidential address to the American Economic Association.

Finally, whether accidentally or not, the shallowing of the stream of major theoretical innovations chronologically coincides with the so-called “empirical turn” in economics, which occurred at the turn of the 1980s–1990s and which was so enthusiastically welcomed by the most authoritative experts on the methodology of economic analysis (Colander, 2009).7

6. Behavioral dualism

Serious changes have also taken place in the behavioral (“anthropological”) foundation of economics, formerly provided by the model of rational choice. The situation changed drastically with the emergence of behavioral economics. All of the leading economists today are unanimous in recognizing the exceptional importance of its ideas and approaches and are willing to take them into account in their research practice. The popularity of models of bounded rationality that focus on various behavioral anomalies is increasing every year.

The incorporation of behavioral economics ideas by the economic mainstream provoked a sort of “schizophrenic” split: economists began to move freely between models with fully rational agents and models with boundedly rational or even irrational agents, feeling no intellectual discomfort from such swings (Kapeliushnikov, 2015). In most cases, the incorporation of “behavioral” components boils down to postulating the co-existence of two classes of agents, one with full and one with bounded rationality (many commentators believe the construction of models with heterogeneous economic agents to be a significant step forward in formal economic analysis). Although behavioral economics has demonstrated that real people are rather far from the hyper-rational homo oeconomicus, modern economists see no serious problem here and express no concern about the current state of affairs. Decisions on whether or not to introduce deviations from the rationality principle into their models are determined by the nature of the problems under review.

Interestingly, in the long run, such a behavioral split appears neither new nor unique (Kapeliushnikov, 2017). The idea that the model of rational choice has always been the “hard core” of the economic orthodoxy is a historical aberration. The current situation is rather similar to the one that developed in the mid-20th century. A fairly clear division existed at that time: microeconomics proceeded from a presumption of the rationality of economic agents, whereas macroeconomics proceeded from a presumption of their irrationality. It is enough to recall the basic ideas of the original Keynesianism. It regarded literally all categories of economic agents as non-rational beings: workers suffer from the money illusion; consumers are driven by a propensity to consume that has nothing in common with optimizing behavior; investors are prone to frequently alternating irrational waves of optimism and pessimism (animal spirits) etc. In a similar manner, classical monetarism, with its concept of adaptive expectations, also suggested that economic agents are incapable of learning from their own mistakes and are doomed to repeat them again and again. As a result, for many decades of the 20th century, economics existed in a very similar state of schizophrenic duality: in microeconomics, actors were represented by rational economic agents; in macroeconomics, by non-rational ones.

In a sense, the whole evolution of macroeconomics after Keynes can be described as a steady “rationalization” of the main blocks of his original analytical scheme, i.e. as a step-by-step substitution of elements of non-rational behavior with elements of rational behavior. This process culminated in the rational expectations revolution, which put an end to the “dual world” of micro and macro: from that moment on, both began to rest on a common behavioral foundation, the model of rational choice. From that time, the last remnants of irrationality were expelled from economic analysis (to be more precise, from the economic mainstream).

However, the unified behavioral foundation of economic science did not last long. In fact, behavioral economics returned it to its previous state of “anthropological” duality. Of course, the analogy with the earlier episode is far from complete (Kapeliushnikov, 2017). First of all, the non-rationality of economic agents originally existed in macroeconomics in an implicit form and was not recognized by most economists; now they adopt this assumption quite consciously. Second, at that time the line between rational and non-rational behavior coincided with the line dividing micro from macro, whereas currently models with rational and non-rational economic agents co-exist within both microeconomic and macroeconomic analysis. Third, the concepts of non-rational economic behavior used in Keynesianism and monetarism were of a rather arbitrary nature, without any strong empirical foundation; behavioral economics provided that foundation. Nevertheless, this earlier experience clearly shows that a state of “behavioral duality” is not unusual for the economic orthodoxy.

How persistent might the current state of behavioral dualism prove to be? Some authors believe it to be only temporary and expect the canonical rational choice model to be finally squeezed out of modern economics before long (Hands, 2014). This outcome seems unlikely (Kapeliushnikov, 2017). The reason is quite simple: it is difficult to imagine what a general theory of non-rational behavior, comparable to the model of rational choice in scope, completeness and plasticity, might look like. As the research practice of behavioral economists clearly shows, the analysis of cognitive and behavioral biases inevitably boils down to the investigation of a multitude of isolated particular cases. Upon closer examination, every model of bounded rationality turns out to be simply the canonical model of rational choice with a “cherry on top” in the form of this or that behavioral anomaly. In terms of analytical coherence, behavioral economics is clearly inferior to the traditional approach. Therefore, one can assume that, in the future, the methodological foundation of economic analysis will consist of a core featuring the model of rational choice and a periphery featuring multiple deviations from it. It is unlikely that the principle of rationality will lose its traditional status as the basic methodological landmark for economic research. In other words, the “anthropological” dualism will most likely never disappear from economic theory and will forever remain one of its key methodological characteristics.

7. The phenomenon of “mainstream”

What is the mainstream of modern economics? What is its structure? How has it changed (if at all) over time? Various answers to these questions can be found in the literature, but I am inclined to follow the conceptualization proposed by De Vroey and Pensieroso (2016), who regard the mainstream as a relatively new phenomenon, less than fifty years old. They show that it was first heard of only at the turn of the 1970s–1980s. (Among other things, this implies that there was no “mainstream” at earlier stages of the history of economic thought.) From a sociological perspective, the phenomenon of the mainstream is none other than a closed intellectual club with very high entrance barriers. To be admitted, one must satisfy some strict methodological criteria: there is no way in for “outsiders.” These criteria were developed in the course of the methodological upheaval which took place simultaneously during the 1970s in several key research fields: macroeconomics, labor economics, development economics, industrial organization, financial economics and various branches of applied analysis.

De Vroey and Pensieroso (2016) identified three basic criteria for belonging to the mainstream which, at a certain moment, began to be perceived as mandatory: (1) mathematical formalization; (2) micro-foundations (any proposed explanation should be derived from the optimizing behavior of individuals); (3) the combination of theory with measurement (the susceptibility of any proposition to econometric testing). This alone shows that the mainstream is not the same thing as the “statistical” domination of a particular theoretical doctrine, when most active economists become its followers — a situation that has occurred more than once in the history of economic thought.

Indeed, in the past, a theory (classical, neoclassical, institutional) obtained the status of “orthodoxy” if it was generally thought to have the best explanatory power (something the advocates of competing theories naturally disagreed with). The situation with the concept of “mainstream” is, however, more complicated, because it carries strong normative connotations alien to the concept of “orthodoxy.” Here we are dealing not just with the contraposition of more and less productive research programs, but with the opposition of good science vs. bad science or, to be more precise, of science vs. under-science. The quality of any research is evaluated not only on the basis of its final results (although those are taken into account as well), but first of all on the basis of its initial, purely formal, methodological attributes.

In essence, the term “mainstream” signifies a certain stylistic, rather than content-related, unity, unlike the term “orthodoxy.” Only those who are willing (and able!) to follow the accepted methodological (but not necessarily conceptual!) strictures may be admitted as members of this closed club. However, since the set of mandatory criteria initially included, inter alia, a requirement for micro-foundations, many incorrectly supposed that the mainstream was merely another incarnation of neoclassical economics. In the course of subsequent evolution, when the original rigidity of this requirement was partially diluted, it became clear that this was not the case and that the mainstream is far from being purely neoclassical.

The mutation proceeded along several lines: (1) the requirement for actual mathematical formalization was softened and replaced with a requirement for potential mathematical formalization (in some cases it is enough for the sophisticated reader to understand that the “story” told in a paper could be formalized if needed); (2) the requirement for micro-foundations remained but lost its necessary association with optimizing behavior: appeals to alternative concepts of human behavior developed by behavioral economics began to be regarded as equally admissible; (3) in the combination “theory + measurement,” the first component ceased to be strictly mandatory, giving way to a purely factual, atheoretical approach (as mentioned above).

In a simplified form, the “anatomy” of modern economics is shown in Fig. 1. The “mainstream” cells are highlighted in grey. Two large blocks can be discerned within it: roughly, the “neoclassical” (microeconomics, neo-Walrasian analysis, business cycle theory, growth theory and various applied subfields) and, roughly, the “atheoretical” (laboratory experiments, field experiments, quasi-experimental research etc.). Perhaps I could have highlighted three additional cells: econometrics, game theory and cliometrics (economic history). However, even without any additions, one can see how internally heterogeneous what is called the economic mainstream is: today it is a conglomeration of heterogeneous research programs, united only by certain common normative methodological criteria of what “good science” is.

Fig. 1. The place of the mainstream in modern economics. Source: De Vroey and Pensieroso (2016).

Looking at Fig. 1, we can even say that there has been a substantial rise in conceptual pluralism inside the core of economic science (in comparison with the situation observed in the 1980s). For example, typical representatives of the mainstream are, on the one hand, Eugene Fama, author of the efficient market hypothesis, and, on the other hand, Robert Shiller, who is strongly opposed to it: both enjoy indisputable authority in the community of economists; there is a vast number of references to each of them in the literature; both are Nobel prize winners in economics (De Vroey and Pensieroso, 2016). Many other examples could be added of the co-existence of competing research programs within the economic mainstream.

It seems that, whereas conceptual pluralism has lost ground inside economic science as a whole (due to the sharp rise of barriers between mainstream and non-mainstream), it has gained ground inside its core! Today, the traditional opposition of Neoclassical vs. Anti-Neoclassical paradigms has lost much of its meaning, while the main dividing lines are now observed within the economic mainstream itself. Over the last few decades, the mainstream has ceased to be purely neoclassical, moving towards much greater conceptual tolerance and pluralism.

How long can such a state last? Some authors predict that the last remnants of neoclassical theory will be completely and irreversibly expelled from economic analysis in the near future (Davis, 2006, 2008). I do not believe this to be plausible. The point is that neoclassical theory still provides the foundation for economic education, and it is difficult to imagine a suitable replacement for it. As long as it continues to shape the way of thinking of each new generation of economists, it can hardly be expected to disappear or even to be squeezed out of the economic mainstream.

8. Ideology on the march

While discussing the sociological, epistemological and methodological aspects of modern economics, one should not omit the delicate issue of the prevailing political preferences of its practitioners. Their ideological attitudes manifest themselves, directly or indirectly, in the choice of problems economists prefer to study, in the normative conclusions at which they arrive and in the practical recommendations they provide to governments. (Although it should be admitted that identifying particular cases in which ideological beliefs intrude into scientific discourse is not an easy task.)

The U.S. seems to be the most convenient case for examining the political preferences of modern economists. Why? First of all, the vast majority of the best-known and most authoritative economists now work at American universities. Second, the U.S. political system makes it rather easy to identify an individual’s ideological orientation from which of the country’s two main political parties he or she supports. Finally, most current studies on the ideological attitudes of modern economists are based on U.S. data (Klein and Stern, 2007; Klein et al., 2013; Langbert et al., 2016).

In recent decades, economics, following other social disciplines, has become increasingly homogeneous in terms of prevailing ideological preferences. According to the most recent survey, the ratio of supporters of the Democrats to supporters of the Republicans among university economists is about 4.5:1.0 (Langbert et al., 2016). Only ten years earlier, the difference was much smaller, 2.7:1.0 (Klein and Stern, 2007). Moreover, among university economists younger than 35, the predominance of Democrat supporters is twice as great, 9:1 (Langbert et al., 2016). Of course, economists still have far to go to catch up with historians, whose ratio is 34:1 (journalists: 20:1; psychologists: 17:1; lawyers: 9:1). Nevertheless, the trend in the economic profession towards ever greater ideological homogeneity is clear. It might have a negative impact on the competition of ideas in the research community of economists and, in the longer run, even lead to serious restrictions on the freedom of thought. Ideological dictates in the academic world are a real possibility and promise nothing good for the future of economics. In more prosaic terms, the further increase in the proportion of economists with leftist or semi-leftist political views implies that, in the coming decades, we may witness more and more government intervention in the economy, in various and often unexpected forms.8

9. Macro in the aftermath of the Great Recession

I return to my original question: is it, after all, a triumph or a crisis? To answer it, it is useful to bear in mind that the claims about a deep crisis in modern economics voiced during and after the Great Recession were actually addressed to only one of its branches, macroeconomics. Even if the state of macroeconomics today is as poor as the critics describe it, this still does not mean that intellectual sterility has struck the entire economic science. As one author nicely put it, macroeconomics is the most “glamorous” subfield of economic analysis, as it is usually the only part visible to both politicians and the general public (Korinek, 2015). Hence the never-ceasing stream of exaggerated, politicized, emotional charges that flooded the mass media all over the world at the turn of the 2000s–2010s. However, such assessments, aimed at drawing public attention, are not necessarily correct.

In the aftermath of the global economic crisis of 2008–2009, which economists failed to predict, the main workhorse of modern macroeconomic analysis, the dynamic stochastic general equilibrium (DSGE) model, became the favorite target of criticism from both professionals and pundits. What are its main presumed deficiencies?

One of the most popular accusations is that DSGE models are unrealistic. However, taken literally, this charge is hardly correct. DSGE models are “computable” models calibrated with the help of empirical estimates derived from available microeconomic studies (such as estimates of the elasticity of labor supply etc.). In other words, they have been designed to be as realistic as possible.

Do the critics, then, propose replacing DSGE models with non-dynamic ones? No. Should we go back to non-stochastic models without uncertainty? No. Should we stop taking “general equilibrium” effects into account and turn to partial equilibrium models? No. Strange as it may seem, almost none of the critics attack the key structural characteristics of DSGE models, which even they perceive as serious progress (Reis, 2018). Apparently, they are more inclined to enrich, refine and correct these models than to give them up altogether.

The thrust of the criticism is the claim that DSGE models abstract from a number of key functional characteristics of the economic system. There is a long list of proposals in the literature as to how standard DSGE models might be improved, for instance (see Reis, 2018):

  • rejection of modelling based on the concept of a representative agent and recognition of the heterogeneity of households;
  • taking into account the specificity of consumer preferences over different classes of goods (for instance, durables, non-durables, housing);
  • transition from simplified models involving agents with an infinite life (and, accordingly, an infinite planning horizon) to more complex models involving agents with a finite life (and, accordingly, a finite planning horizon);
  • rejection of the assumption of rational expectations and recognition of bounded rationality of economic agents;
  • replacement (fully or partially) of exponential discounting with hyperbolic discounting;
  • taking into account not only productivity shocks but also other potential sources of uncertainty;
  • closer integration of the financial sector into macroeconomic models;
  • paying more attention to the effects of economic inequality;
  • incorporation into the analysis of various distortions associated with taxes and government expenditures;
  • recognition of the key role of money.9

However, as stressed by Reis (2018), if we speak not of macroeconomics textbooks but of research at the forefront of analysis, attempts to integrate all of the above factors into macroeconomic models were first made well before the Great Recession and continued actively after it.

In this regard, there is no conceptual gap between the pre-Recession and post-Recession periods. Indeed, the crisis resulted in a noticeable expansion of the set of problems studied by macroeconomists: the causes and mechanisms of the Great Recession, the consequences of quantitative easing and the specifics of the economy’s behavior under the ZLB became, along with many other issues, subjects of serious discussion. However, during the post-Recession period, improvement of the analytical apparatus of macroeconomic theory followed the same lines as in the pre-Recession period. In comparison with the profound effects of the Great Depression or of stagflation, the “intellectual” impact of the Great Recession seems negligible.10 There has been no transition to another, completely different theoretical paradigm, as was the case in the 1930s and in the 1970s: accents have shifted, the set of problems has widened, the atmosphere of discussion has changed (with less self-confidence), but the general conceptual framework and the basic analytical apparatus have remained virtually the same. The continuation of trends that started long before the Great Recession means that economists themselves (sometimes contrary to their own declarations) do not appear to believe that macroeconomic theory is now in “crisis.”

More serious problems may be associated not with the conceptual limitations inherent in DSGE models, but with the existence of implicit rules regarding their practical application (Korinek, 2015). The standard procedure of DSGE modeling consists of three stages: (1) first, the main stylized facts are identified, i.e. certain statistical characteristics (such as standard deviations, autocorrelations, covariances etc.) are estimated for the time series of selected macroeconomic variables; (2) second, a DSGE model is constructed for the same variables and exposed to a series of stochastic shocks; (3) third, the statistical characteristics of the actual and simulated time series are compared to assess their closeness and, if the fit is good, the model is recognized as successful (it “explains” the stylized facts).
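
A toy sketch may help fix the three stages in mind. In the hypothetical Python snippet below, a calibrated AR(1) process stands in for the DSGE model (all parameters and data are made up): moments of the “actual” series are computed, the model is simulated under stochastic shocks, and the two sets of moments are put side by side, with the judgment of whether the fit is “good” left, tellingly, to the reader.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 200

def moments(x):
    """Stages 1 and 3: the statistical characteristics to be matched."""
    return {"std": x.std(ddof=1),
            "autocorr": np.corrcoef(x[:-1], x[1:])[0, 1]}

# Stage 1: "stylized facts" of an observed macro series (here: fake data)
actual = 0.1 * rng.normal(size=T).cumsum() + rng.normal(size=T)

# Stage 2: a calibrated model exposed to a series of stochastic shocks
rho, sigma = 0.9, 0.5                  # calibrated parameters
sim = np.zeros(T)
for t in range(1, T):
    sim[t] = rho * sim[t - 1] + rng.normal(scale=sigma)

# Stage 3: compare the moments of actual and simulated series;
# no formal test decides whether the fit is "good": that is pure convention
print("actual:   ", moments(actual))
print("simulated:", moments(sim))
```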

However, upon closer examination, this research strategy turns out to be arbitrary and conventionalist, i.e. based on an implicit agreement within the macroeconomic community as to what constitutes “good” and “bad” research practice, what is a “norm” and what is not, what should and what should not be recognized as “science.” For example, there are no objective criteria for establishing which statistical characteristics of the macroeconomic variables selected by the researcher should be considered in the analysis and which should not. This is a zone of pure convention (in other words, of the researcher’s arbitrary choice). Similarly, there are no generally accepted statistical tests that would permit strict, objective assessments of the “goodness of fit” between actual and simulated time series. More often than not, this is done with no precision at all: two curves are drawn on a graph, and the reader is invited to decide for himself or herself whether the fit is good or bad. The difficulties do not end here: by adding a new variable to those already included in the model, we can improve its fit while impairing the fit for the variables left outside the model, and it remains unclear how to act in this situation. Finally, by continuously increasing the number of variables incorporated into a model, one can always reach a threshold beyond which a “good” fit becomes “bad” (Korinek, 2015).

Economists using DSGE models are very reluctant to talk about these implicit conventions. However, neglecting them may lead to serious disorientation, both in interpreting the results obtained and in formulating recommendations for economic policy. From a methodological perspective, such non-trivial research practice may even make one wonder whether DSGE macroeconomics is more of a science or an art.

10. Conclusion

To conclude, I return once more to my initial question. The observations presented in these notes suggest a rather banal conclusion, far less dramatic and sensational than those heard today. What we are witnessing in modern economics can hardly be called either a triumph or a crisis: it is rather an ordinary working state. Admittedly, this situation is not a very inspiring one, promising no great conceptual breakthroughs in the near future. Of course, this holds only if I am correct in saying that the era of large innovative theoretical ideas has passed for economic science, that its drift towards purely atheoretical analysis will grow stronger, and that it will become more and more interventionist as time goes by.

References

  • Almond, D., & Mazumder, B. (2011). Health capital and the prenatal environment: The effect of Ramadan observance during pregnancy. American Economic Journal: Applied Economics, 3(4), 56–85. https://doi.org/10.1257/app.3.4.56
  • Angrist, J., & Pischke, J.-S. (2010). The credibility revolution in empirical economics: How better research design is taking the con out of econometrics. NBER Working Papers, No. 15794. Cambridge, MA: National Bureau of Economic Research.
  • Backhouse, R., & Cherrier, B. (2017). The age of the applied economist: The transformation of economics since the 1970s. History of Political Economy, 49(5, Supplement), 1–33. https://doi.org/10.1215/00182702-4166239
  • Buchanan, J. (1987). The constitution of economic policy. The American Economic Review, 77(3), 243–250.
  • Card, D., & Krueger, A. B. (1994). Minimum wages and employment: A case study of the fast food industry in New Jersey and Pennsylvania. The American Economic Review, 84(4), 772–793.
  • Das, T., & Polachek, S. W. (2017). Micro foundations of earnings differences. IZA Discussion Paper Series, No. 10922. Bonn: Institute for the Study of Labor (IZA).
  • Davis, J. B. (2008). The turn in recent economics and return of orthodoxy. Cambridge Journal of Economics, 32(4), 349–366.
  • De Vroey, M., & Pensieroso, L. (2016). The rise of a mainstream in economics. IRES Discussion Paper, No. 2016-26. Louvain: Université Catholique de Louvain.
  • Duflo, E., & Hanna, R. (2005). Monitoring works: Getting teachers to come to school. NBER Working Papers, No. 11880. Cambridge, MA: National Bureau of Economic Research.
  • Hands, D. W. (2014). Normative ecological rationality: Normative rationality in the fast-and-frugal-heuristics research program. Journal of Economic Methodology, 21(4), 396–410. https://doi.org/10.1080/1350178X.2014.965907
  • Kapeliushnikov, R. (2015). Strategies of behavioral economics. In A. N. Dmitriev & I. M. Savelieva (Eds.), Human sciences. History of disciplines. Moscow: HSE Publishing (in Russian).
  • Kapeliushnikov, R. (2017). The status of the rationality principle in economic theory: Past and present. Zhurnal Novoi Ekonomicheskoi Assotsiatsii, 2, 162–166 (in Russian).
  • Klein, D. B., & Stern, C. (2007). Is there a free-market economist in the house? The policy views of American Economic Association members. American Journal of Economics and Sociology, 66(2), 309–344. https://doi.org/10.1111/j.1536-7150.2007.00513.x
  • Klein, D. B., Davis, W. L., & Hedengren, D. (2013). Economics professors’ voting, policy views, favorite economists, and frequent lack of consensus. Econ Journal Watch, 10(1), 116–125.
  • Krugman, P. (2018). Good enough for government work? Macroeconomics since the crisis. Oxford Review of Economic Policy, 34(1–2), 156–168. https://doi.org/10.1093/oxrep/grx052
  • Langbert, M., Quain, A. J., & Klein, D. B. (2016). Faculty voter registration in economics, history, journalism, law, and psychology. Econ Journal Watch, 13(3), 422–451.
  • Pekkala Kerr, S., & Kerr, W. R. (2011). Economic impacts of immigration: A survey. Finnish Economic Papers, 24(1), 1–32.
  • Samuelson, P. A., & Modigliani, F. (1966). The Pasinetti paradox in neoclassical and more general models. The Review of Economic Studies, 33(4), 269–301. https://doi.org/10.2307/2974425

2 As noted above, by no means do all economists share this atheoretical perspective. Many still believe that the theoretical interpretation of empirical results is an essential and critically important part of analysis. They therefore interpret a zero or positive change in the employment of low-skilled workers following a rise in the minimum wage as an indication that the labor market for such workers is monopsonistic. (Under monopsony, a higher wage is known to have an upward effect on employment.) However, two things should be pointed out here. First, when asked on what grounds they conclude that monopsony exists for low-skilled workers, they would answer: on the grounds that, according to our econometric estimates, the elasticity of demand for those workers is zero or positive. Here we have an example of circular reasoning: zero/positive demand elasticity is due to monopsony; monopsony is due to zero/positive demand elasticity. Second, starting from the famous paper by Card and Krueger (1994), the majority of modern studies on the minimum wage are based on data for the fast food industry. However, the very suggestion that employees of those ubiquitous McDonald’s, Burger Kings, Pizza Huts and other similar outlets suffer from monopsony looks heroic, to say the least: if it proves anything, it is probably only the rich imagination of those voicing it.
3 Incidentally, the first paper ever to conclude that a rise in the minimum wage has no effect on the employment of low-skilled workers was based on quasi-experimental data (Card and Krueger, 1994).
4 Growth theory has also drifted in the same direction in recent decades. Most recent growth studies focus primarily on the fundamental causes of economic development, such as geography, institutions or culture. They use quasi-experimental methods without attempting to construct any general theory capable of explaining how these factors interact with each other and why some might be more important than others. Finding any theory behind the statement “institutions are more important than culture” is as impossible as finding one behind the directly opposite statement, “culture is more important than institutions.” In fact, the whole “theory” here boils down to discussions and estimates in terms of more or less.
5 In a sense, the whole edifice of modern econometrics can be viewed as a giant superstructure over Marshall’s ceteris paribus requirement.
6 This example is also interesting because it illustrates how goals have drifted in development economics in recent decades: whereas previously it focused on discussing general questions of principle, such as the choice between the market and central planning, between private and public property, between investments in physical and human capital etc., now it is almost exclusively concerned with narrow applied topics, such as reducing school teacher absenteeism, increasing child vaccination rates, encouraging farmers to use modern fertilizers etc. It should be noted, however, that Esther Duflo sees this “humbling” of research objectives more as an advantage than as a drawback. From her point of view, developing countries would be better off adhering to a philosophy of small deeds, exemplified by the experimental approach, than toying with various ideas for large-scale societal reforms, which have not helped them escape the poverty trap over many decades.
7 For more on the “empirical turn” in modern economics, see: Backhouse and Cherrier (2017).
8 The prevailing interventionist stance of modern economics is evidenced, among other things, by the simple fact that any author publishing a paper is all but obliged to include a section on policy implications. In his day, the Nobel prize winner in economics James Buchanan called on economists to stop producing policy recommendations for governments as if they had been hired for this purpose by a benevolent dictator (Buchanan, 1987). Buchanan failed to make himself heard, and the majority of modern economists still perceive themselves as virtually serving a benevolent dictator, i.e. the government.
9 Some more radical critics also propose to reject the methodological requirement of microeconomic foundations and to return to the earlier (Keynesian) practice of relying upon purely empirical observable regularities that are not derived from the optimizing behavior of individuals (Wren-Lewis, 2018).
10 Krugman (2018) recently came to the same conclusion.
✩ The paper is a revised version of the address delivered at the 17th annual Leontief Readings conference “Economic Theory: Triumph or Crisis?”, Saint Petersburg, February 16–17, 2018.