After seven years at the zero lower bound, the target range for the federal funds rate was raised by 25 basis points in December 2015. At the time, the data were exhibiting improvements similar to today's, and yet the increase triggered global market volatility earlier this year that could have been predicted.
In fact, last September we wrote:
The Fed indeed has created a catch-22 situation: higher rates are badly needed, but any action toward raising rates would be extremely destabilizing.
The reason higher rates are needed, as we have stated in the past, is that:
[T]he low rates have been distorting the economy and have created dangerous imbalances, particularly unsustainable levels of debt.
We have argued that:
Unfortunately, under today’s “currency wars” conditions, with the slowdown in China, Europe’s debt crisis, and the huge debt build-up by consumers and states, the normalization of the money supply in any single country, as an isolated and uncoordinated action, would be a recipe for disaster,
and in particular we have emphasized that:
Policy makers should realize how important the role of capital formation is on the supply side. They should realize that for the successful working of international trade, currency values, like any other price signals, must be informative about relative purchasing power, and these can only be discovered in transparent markets, where the fundamental relationships between financial assets and the real sectors are respected -- where the banks are healthy and taxpayers are not on the hook for the rescue of Too-Big-to-Fail zombie banks.
Last September we argued that:
[T]o raise the policy rate by 25 basis points at this time would not send any useful signal and (...) could be a wrong move. A policy normalization would only make sense when the markets know what the normal level is and how fast the adjustment toward that level will be.
On May 19, the Federal Reserve Bank of New York introduced a data product entitled U.S. Economy in a Snapshot, which, in the words of its president William Dudley, is supposed "to provide information that helps households and businesses follow the data along with the Fed." Unfortunately, the package is silent about how to organize these data so that one can follow the Fed. One may argue that the Fed itself is still struggling with the challenge of calculating the unobservable neutral rate. Mr. Dudley has explained that:
Conventional U.S. monetary policy is conducted by targeting the level of the federal funds rate—an overnight interest rate on bank reserves. Few participants in our economy have any direct interaction with this interest rate. How, then, is controlling this interest rate such an important part of setting monetary policy and steering an 18 trillion dollar economy toward the Federal Reserve’s dual mandate objectives of maximum sustainable employment and price stability? (...)
When asked about the trajectory for the monetary policy stance, I always point out that it is data dependent. The FOMC calibrates the stance of monetary policy to best achieve our twin objectives of price stability and maximum sustainable employment, taking into account our forecast for how the economy is evolving. This forecast reflects the ongoing flow of the data. Data releases that are close to our expectations have little additional impact on the forecast, while data releases that deviate significantly from our expectations can lead to more significant revisions of the forecast. It is, therefore, important for market participants and households to be able to follow the data along with the FOMC and to understand how we are likely to interpret and react to incoming data.
What does it mean to say that the monetary policy stance is data dependent? Simply put, it means that the federal funds overnight interest rate, as the main policy instrument of the monetary authority, is determined in relation to the interactions among a whole set of data. It is the movement of the key macroeconomic variables, and their impacts on each other, that determines the equilibrium neutral rate.
In fact, a quick glance at the New York Fed's aforementioned and very useful publication reveals that it contains about 60 time series depicting the movements of various macroeconomic variables. How, then, should one look at these data, and how can one interpret their seemingly inconsistent movements at certain times in order to project the likely direction of the federal funds rate?
How to project the likely direction of the Fed funds rate?
There are a number of ways that one could organize the key macroeconomic data. The most familiar way is to specify and estimate a small structural model. However, as we have argued before, in the aftermath of the financial crisis:
the macro models are not very well specified, simply because there have not been enough observations that would allow for control of the impacts of various QEs, zero and negative interest rates, global shocks, and behavioral and policy changes – just to name a few (and assuming that we have the right theoretical model, which is a big assumption). Furthermore, we do not know what the distribution shapes of the various arguments in our risk functions are, and so on.
Thus, an alternative way would be to use Bayesian priors and specify a calibrated structural model that can be updated in a Bayesian learning process. Finally, one may choose a small subset of perhaps about 10 variables in a Bayesian vector autoregressive (BVAR) model, or one of its variants, that would determine with some probability how much the policy instrument needs to change in order for the model to stabilize at a certain inflation target range and output growth level.
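As a rough illustration of the last approach, the sketch below fits a one-lag VAR with Minnesota-style shrinkage toward a random-walk prior on three synthetic series standing in for inflation, output growth, and the policy rate. Everything here is an assumption for exposition: the data are simulated, the shrinkage is a simple ridge-style posterior mean rather than a full Bayesian posterior, and a real BVAR would use more variables and lags.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate T observations of 3 macro series (inflation, output growth,
# policy rate) from a known stable VAR(1): y_t = A y_{t-1} + noise.
T, k = 200, 3
A_true = np.array([[0.5, 0.1, -0.2],
                   [0.0, 0.6,  0.1],
                   [0.2, 0.3,  0.7]])
Y = np.zeros((T, k))
for t in range(1, T):
    Y[t] = A_true @ Y[t - 1] + 0.1 * rng.standard_normal(k)

X, Z = Y[:-1], Y[1:]            # lagged regressors and one-step-ahead targets

# Minnesota-style prior: shrink each equation toward a random walk (A = I).
lam = 5.0                        # shrinkage strength (tighter prior as lam grows)
prior_mean = np.eye(k)

# Ridge-style posterior mean: (X'X + lam I)^{-1} (X'Z + lam * prior).
# With row-stacked data this estimates the transpose of A_true.
A_post = np.linalg.solve(X.T @ X + lam * np.eye(k),
                         X.T @ Z + lam * prior_mean)

# One-step-ahead forecast of all three series, including the policy rate.
forecast = Y[-1] @ A_post
print(A_post.shape, forecast.shape)
```

The point of the exercise is not the numbers but the structure: the likely path of the policy instrument falls out of the joint dynamics of the whole variable set, rather than from watching any one indicator in isolation.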
With this in mind, it is surprising that the likely trajectory of the Fed's interest-rate setting is most often discussed through various types of partial-equilibrium analysis. Typically, many analysts employ the Wicksellian paradigm of a natural rate of interest, which, as we will argue later, is totally irrelevant for analyzing a severely imbalanced economy, such as those of many advanced countries in North America, Europe and Asia. Moreover, at times the monitoring of various variables is presented outside any structural or time-series model, in an ad hoc fashion. For example, Esther George, the president of the Federal Reserve Bank of Kansas City, in a May 12th speech in Albuquerque, N.M., reporting a decline in the unemployment rate to 5%, from 10% in the aftermath of the financial crisis in 2009, stated:
Of course, the unemployment rate is an imperfect measure of the labor market, so I also pay close attention to other data. For example, one development I find promising is that many individuals who had dropped out of the workforce are finding jobs. After the crisis, the percentage of people participating in the labor market fell sharply. Some of this is because our population is aging, so people naturally work less as they get older. However, some of the decline in labor force participation was due to workers being discouraged about their job prospects. More recently, however, we have seen an upswing in people finding jobs who had previously stopped looking for one. For example, close to 2 million workers returned to the labor force over the past six months. This pace of re-entry is close to the fastest pace in more than 15 years. Despite these positive developments, wage growth has remained sluggish and many people still feel like they have limited options in the labor market.
She then goes on to describe two business perspectives on the labour market:
One perspective is of a booming labor market, rising wages and an abundance of opportunity. The other perspective is of stagnant wage growth, limited upward mobility and job insecurity.
Relating the second perspective to a sharp decline in the share of workers in middle-skill jobs, she argues that:
As the Federal Reserve considers these and other economic trends, it must weigh a number of crosscurrents to determine the appropriate interest rate policy. In the short run, I continue to monitor how the energy, agricultural and manufacturing sectors are adjusting relative to the national economy. And over the longer run, I evaluate what trends like job polarization mean for monetary policy.
Nevertheless, without specifying what the prevailing equilibrium long-run federal funds rate would be, she concludes by stating that:
The current setting for the federal funds rate is well below what the FOMC expects will prevail in the longer term. The plan is to move gradually and in a way that is responsive to economic developments. I support a gradual adjustment of short-term interest rates toward a more normal level, but I view the current level as too low for today’s economic conditions.
The problem with this type of analysis is that if businesses are shifting toward an intensive-margin mode of production due to uncertainty, using more labour-intensive techniques in their short-term capacity planning (for example, by adding labour shifts or hiring contingent workers instead of investing in irreversible fixed capital), then the fact that many workers who had dropped out of the labour force are finding jobs would not be so promising.
Ideally, of course, the impacts of the labour participation rate, discouraged workers, or wage growth could be incorporated in a structural model; alternatively, the Fed could run small satellite models to estimate and inform the market of their likely impacts. These variables would affect other key macroeconomic variables such as capital formation, capacity utilization, productivity, and the terms of trade, to name just a few. A partial-equilibrium analysis ignores many of these impacts, even though the trajectory of these omitted variables under various scenarios can drastically alter the nature of the analysis and the outlook.
By ignoring the prevailing uncertainty, the businesses' shift to an intensive-margin mode of production, and the delays in investment plans, President George, in her speech in Oklahoma City on July 9th, 2015, was too optimistic about the capital-expenditure outlook, stating that:
Moreover, as the economy continues to heal and domestic demand continues to strengthen, businesses should have more incentives to increase capital expenditures.
As we know, this prediction has not come to pass; according to the New York Fed's most recent snapshot of the US economy, in July:
Over the four quarters ending in 2016 Q1, real business investment in new equipment was down 0.3%, continuing a slowing trend in place since 2010. (...) A key reason for the overall slow pace of growth of investment in new equipment is the relatively low level of the manufacturing capacity utilization rate. This rate, which had been slightly above 75% for over a year, dipped below 75% in May. Historically, robust growth of investment in new equipment is associated with a capacity utilization rate of 80% or higher.
As we argued last September:
The Fed's estimate of the longer-run normal rate of unemployment is not consistent with US investment in capital formation. The appearance of a gradual decline in the US economy's slack is attributable to a greater use of contingent labour and contingent capital, due to the prevailing global uncertainty.
This structural approach would reveal that the question of the capacity utilization rate needs to be carefully reevaluated. The nature of full capacity under contingent capital and the intensive margin would result in a leftward shift of the full-capacity potential along the economy's long-term average cost curve. The resulting short-run equilibrium would be different from the long-term equilibrium capacity. Furthermore, we specifically stated that:
The economy is being distorted by the zero-interest rate policy and is not getting closer to its long-term equilibrium. The use of contingent production factors has generated a quasi-closing of the gap in reference to a quasi-potential output growth, which corresponds to Klein (1960) and Berndt and Morrison (1981) definitions of capacity. This is why this quasi-closing of the gap has not exerted an upward pressure on the US inflation rate.
How can the labour markets move to equilibrium with such weak capital formation? It is quite clear that this fragile capital formation is due to the prolonged period in which businesses have postponed investment as a result of the prevailing global uncertainties, which have been exacerbated by the authorities' suppression of equilibrating market dynamics. Investment spending has grown more slowly than is usual for a business-cycle expansion, and this is the main reason for the observed decline in US productivity.
The global uncertainty and ultra-loose monetary policies have encouraged businesses to follow strategies of incremental cost reductions that are not accompanied by investment in new technology. This has undermined the longer-term growth of potential output, which appears to have caused a distorted and artificial decline in the real interest rate, by which the authorities hope to encourage entrepreneurs to assume more risk. Economic theory suggests that a lack of capital formation would shrink the production possibilities frontier, resulting in a decline in labour productivity growth, as we have observed in the US.
Unfortunately, despite recognizing that the long-run level of the neutral rate is highly uncertain, many analysts, including some FOMC members, still focus on it. It appears that some even focus on the highly volatile short-term rate. For example, the minutes of the FOMC's June 14-15 meeting report that:
Many participants commented that the level of the federal funds rate consistent with maintaining trend economic growth—the so-called neutral rate—appeared to be lower currently or was likely to be lower in the longer run than they had estimated earlier. While recognizing that the longer-run neutral rate was highly uncertain, many judged that it would likely remain low relative to historical standards, held down by factors such as slow productivity growth and demographic trends.
It should be clear that it is not slow productivity growth that is holding back the (long-run) neutral rate. The direction of causality is the other way around. The prevailing low interest rates give the impression that the neutral rate has declined, and at the same time they delay capital formation via the rise in uncertainty that low rates themselves cause. Low capital formation reduces trend productivity, although due to intensive-margin operation we may observe some transitory short-term productivity increase. As we have argued in the past, the neutral or natural rate of interest derived from Wicksellian theory is only valid under long-run general-equilibrium conditions. We stated that:
[T]he Wicksellian theory is a general equilibrium theory in which the financial rate of interest that borrowers actually pay must be equalized to the natural rate of interest that is determined by the marginal return on the fully employed real capital. If the financial rate is below the natural rate the demand for investment will rise as businesses can borrow at the lower financial rates and invest the funds into high-returning projects. However, the information signals that a Wicksellian paradigm could emit are not meant for a disequilibrium environment in which the real capital is underutilized and businesses are postponing investment in irreversible fixed capital and opt for waiting.

-- The introduction to this piece is slightly modified to take into account the Fed's inaction on July 27th.
When, due to the prevailing uncertainty, businesses refrain from investment, and when in their capacity planning they resort to contingent labour and capital instead of moving toward their long-term minimum-average-cost capacity, the Wicksellian equilibrium theory is an inappropriate analytical framework. In fact, the concept of a natural rate of interest in a disequilibrium environment would be an oxymoron.
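The contrast can be put in stylized toy form: in the textbook Wicksellian world a financial rate below the natural rate raises investment demand, but once an option value of waiting under uncertainty is added, the same gap can fail to stimulate. The function below is purely illustrative; its functional form and all parameter values are our assumptions for exposition, not an estimated model.

```python
# Stylized sketch of the Wicksellian mechanism and its failure under
# uncertainty; the linear form and the numbers are illustrative assumptions.
def investment_demand(natural_rate, financial_rate, uncertainty=0.0):
    """Investment responds to the Wicksellian gap (natural - financial),
    but the option value of waiting under uncertainty suppresses it."""
    gap = natural_rate - financial_rate
    # If the gap does not exceed the value of waiting, firms postpone
    # irreversible investment and demand stays at zero.
    if gap <= uncertainty:
        return 0.0
    return gap - uncertainty

# In equilibrium (financial rate = natural rate) there is no excess demand:
print(investment_demand(0.03, 0.03))                    # 0.0
# Textbook disequilibrium: a low financial rate spurs investment:
print(investment_demand(0.03, 0.00))                    # 0.03
# But with high enough uncertainty the same low rate fails to stimulate:
print(investment_demand(0.03, 0.00, uncertainty=0.05))  # 0.0
```

The last case is the disequilibrium environment the text describes: the low financial rate emits a signal that the Wicksellian framework would read as stimulative, yet firms keep waiting, which is why the "natural rate" inferred from such an economy is uninformative.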