Austrian economists have traditionally argued against central planning on the grounds that much of the economically relevant knowledge in society could never be made available to a single planning authority. But today, with an unprecedented and ever-increasing volume and variety of data potentially accessible to the planner, it seems that an omniscient government may be possible after all. Has the big data revolution rendered the promarket arguments of Ludwig von Mises and Friedrich von Hayek obsolete?

In chapter 3 of their 1920 bestseller, The ABC of Communism, Soviet theorists Nikolai Bukharin and Yevgeni Preobrazhensky claimed that in the communist society of the future the state would “know in advance how much labor to assign to the various branches of industry; what products are required and how much of each it is necessary to produce; how and where machines must be provided.” In reality, of course, the Soviet Union never came close to realizing this vision. Given the impossibility of setting targets for the millions of individual items required by a modern economy, planning at the highest level had to be limited to some sixty thousand aggregate categories, which were then disaggregated at lower tiers of the bureaucracy (see chapter 7 of János Kornai’s The Socialist System for more on planning in the USSR). Contrary to Bukharin and Preobrazhensky’s expectations, the result was a chronic failure to allocate resources efficiently. Shortages of essential industrial and consumer goods became the norm.

Could this failure have been avoided if only more advanced computational capabilities had been available? Nowadays, problems involving millions of variables are no longer insoluble. Might the day have at last arrived when, as Oskar Lange wrote in 1967, the market process “may be considered as a computing device of the pre-electronic age”?
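To make the computational claim concrete: a toy version of the planner’s allocation problem can be posed and solved as a linear program in a few lines. The numbers below are entirely invented, and the sketch uses SciPy’s linprog purely for illustration; as the argument that follows suggests, the hard part is not running the solver but knowing the coefficients in the first place.

```python
# Illustrative only: a toy "plan" with hypothetical coefficients,
# solved as a linear program with SciPy.
from scipy.optimize import linprog

# Maximize the plan value of two goods, A and B (linprog minimizes,
# so the objective is negated). Per-unit values are invented.
c = [-3.0, -5.0]

# Resource constraints available to the planner (also invented):
#   2*A + 1*B <= 100  (labor-hours)
#   1*A + 3*B <= 90   (tons of steel)
A_ub = [[2.0, 1.0],
        [1.0, 3.0]]
b_ub = [100.0, 90.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x)     # "optimal" output quantities for A and B
print(-res.fun)  # plan value at the optimum
```

Modern solvers handle versions of this problem with millions of variables; the question is where the objective and the constraints are supposed to come from.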

As several authors have recently argued (here and here, for example), in the absence of markets planning would have to proceed without the information on supply and demand conditions revealed by actual transactions. In the short term it might be possible to make decisions about “what products are required and how much of each it is necessary to produce” based on the supply-demand equilibria prevailing in a preexisting market economy, but as the situation changed the plan would quickly lose its relevance to the real world. Sooner or later, the planner would end up “floundering in the ocean of possible and conceivable economic combinations without the compass of economic calculation,” as Mises put it in Economic Calculation in the Socialist Commonwealth.

But in fact an even more fundamental objection can be raised: the market process is nothing like a computing device. As Austrian economists have long emphasized, competition in markets is not simply a mechanism for converging on preexisting equilibrium outcomes. It is rather an engine of knowledge creation and entrepreneurial discovery. Running a business is not just a matter of resolving uncertainty about “known unknowns” through an orderly learning procedure; it requires flashes of insight about “unknown unknowns” that initially played no role in decision-making at all.

Consider, for example, the famous case of Walmart’s use of data analytics to predict a jump in demand for strawberry Pop-Tarts in areas about to be hit by Hurricane Frances in 2004. As a series of zeros and ones in computer memory, the big data behind this prediction was not in itself information. It had first to be interpreted by a human being with an incentive to answer a particular question and a hypothesis about which variables might be significant. Someone had to have an intuition that an adverse weather event might create a profit opportunity at some particular time and place. Big data and artificial intelligence are tools to enhance the entrepreneurial discovery process, not a substitute for the inspiration of the profit-seeking market participant.
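The division of labor between analyst and algorithm can be made concrete with a small sketch. This is not Walmart’s actual method; the data are simulated, and the point is only that the model recovers the hurricane-related jump in sales because a human chose to include a weather variable in the first place.

```python
# Illustrative sketch with simulated data (not Walmart's actual method).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Hypothetical daily store data: baseline Pop-Tart sales, plus a jump
# on days when a hurricane warning is in effect.
hurricane_warning = rng.integers(0, 2, size=200)   # the human-chosen feature
sales = 40 + 25 * hurricane_warning + rng.normal(0, 5, size=200)

model = LinearRegression().fit(hurricane_warning.reshape(-1, 1), sales)
print(model.coef_[0])  # recovers the ~25-unit jump, but only because
                       # someone hypothesized that weather mattered
```

In this toy setup, leave out the weather column and the model has nothing to find: the hypothesis precedes the analytics.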

The existence of big databases does not make it any easier to centralize society’s stock of useful knowledge, because local knowledge is necessary to make productive use of data. “Planners,” as Israel Kirzner points out in chapter 2 of The Meaning of Market Process, “simply do not know what to look for: they do not know where or of what kind the knowledge gaps are.” Even if provided with links to every network node in existence, they would still be incapable of replicating the insights of countless individual decision-makers, each with his or her own unique viewpoint and distinct motivation to generate data-driven ideas.

Big data analytics is a means of strengthening the market process by reducing search costs, not a means of replacing it. This technology undoubtedly has important operational implications for individual companies. But it does not make the private firm any less necessary as an institution for efficient resource allocation. Indeed, big data is entirely irrelevant to Hayek’s local knowledge problem, because it does not provide any new means of aggregating the understandings of different individuals. Big data, while covering a wealth of different local situations, is not knowledge. Artificial intelligence software does not “know” anything.

There is thus no reason to think central planning could work any better with bigger datasets and faster processing power than it did during Soviet times, when the planning had to be done with slide rules and primitive mainframes. Smarter devices will not make socialism smarter.

via Mises Institute
