The Fed’s lowflation dilemma

A few months ago, the Federal Reserve seemed determined, at long last, to normalise monetary policy in the US. In fact, it still seems to be set on that course. The FOMC has indicated that it intends to raise rates in June by a further 0.25 percentage points, and it also seems ready to announce a plan to shrink the central bank’s balance sheet in September.

But there is now a fly in the Fed’s ointment. The last two monthly releases for consumer price inflation have been much weaker than anyone expected. Although the FOMC was fairly dismissive of the first of these releases, saying in the minutes of its May meeting that the weakness was probably caused by temporary or idiosyncratic factors, it is not yet known whether it has been equally dismissive of the second set of weak CPI data, for April. The two months together have left core CPI inflation 0.4 percentage points lower than expected.

When the PCE deflator is released next Tuesday, it will probably show the 12-month core inflation rate at 1.5 per cent in April, the lowest figure since the end of 2015. The FOMC’s reaction to this incoming news will depend on its reading of the underlying causes of low inflation, which are highly uncertain. But the markets have already concluded that the committee will take the evidence seriously enough to abort its programme of rate rises after the planned June increase.

It may seem surprising that such a small amount of new evidence could cause a rethink of a monetary normalisation strategy that has been so long in the making. But the Fed’s decisions are supposed to be data dependent, and the latest inflation readings have not supported its prior beliefs.

The Fulcrum inflation models are designed, among other objectives, to produce short-term projections for inflation based on methods that extract underlying price increases from noisy monthly data. The models’ near-term inflation projections have dropped sharply as a result of the March and April CPI releases, and the inflation rate for the rest of 2017 is now projected to run well below the path forecast by the Fed in March.
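To illustrate the general idea (this is not the Fulcrum methodology, just a generic sketch), the example below filters an underlying inflation trend out of noisy monthly readings using a textbook local-level, random-walk-plus-noise model; the variance parameters and the monthly figures are assumptions chosen purely for illustration.

```python
# A minimal, illustrative local-level filter: observed monthly inflation is
# treated as an unobserved trend (a random walk) plus measurement noise, and
# a simple Kalman filter recovers the trend. The variances and the monthly
# readings below are made-up values for illustration only.
import numpy as np

def local_level_filter(y, obs_var=0.04, trend_var=0.005):
    """Filtered estimate of the underlying trend in noisy readings y."""
    trend = np.zeros(len(y))
    m, p = y[0], 1.0                        # initial state mean and variance
    for k, obs in enumerate(y):
        p_pred = p + trend_var              # predict: trend drifts as a random walk
        gain = p_pred / (p_pred + obs_var)  # weight given to the new reading
        m = m + gain * (obs - m)            # update the trend estimate
        p = (1.0 - gain) * p_pred
        trend[k] = m
    return trend

# Hypothetical annualised monthly core inflation prints (per cent), ending
# with two soft readings in the spirit of the March and April surprises.
monthly = np.array([2.2, 2.1, 2.3, 2.0, 2.2, 2.1, 1.9, 2.0, 0.9, 1.2])
print(local_level_filter(monthly).round(2))
```

The only point of the exercise is that a trend extracted this way responds to soft prints gradually, which is why two weak months can pull the near-term projection down materially without collapsing it.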

The May FOMC minutes suggest that the staff’s projections were already marked down after the first monthly surprise. The question now is whether the FOMC can continue to claim that inflation will stabilise at around the 2 per cent target (measured by the PCE deflator, which usually differs very little from the CPI) in the medium term.

The FOMC will probably try to cling to this assessment for as long as possible. The March and April inflation reports may have been largely driven by random fluctuations, in which case they are likely to be offset by unusually large monthly increases before very long. If so, it will be back to business as usual.

However, if there is no substantial rebound in inflation data during the third quarter, the FOMC will start to give greater weight to the possibility that a more fundamental change has taken place in the US inflation mechanism. There are two possibilities that will come into focus:

1. Lower inflation due to measurement changes

It is well known that US inflation rates tend to be overstated in the official data series, because the BLS and BEA methods for measuring price changes probably make insufficient allowance for improvements in the quality of goods and services, especially in information technology and health care. In 1996, the Boskin Commission concluded that these problems probably led to an overstatement of US inflation of about 1.1 percentage points per annum, and adjustments were subsequently made to the official statistical methodology to correct part of the problem.

None of this would matter if the measurement issues remained similar over successive time periods, since the Fed’s inflation target already makes allowance for the likelihood of constant errors in the data. Recently, however, there has been a great deal of discussion among official statisticians and others (including Martin Feldstein) suggesting that the data methodology may be changing in ways that measure product quality, and the introduction of new products, more accurately than before. This matters because it would lead to a lower official estimate of inflation even if nothing had changed in the economy.

The possibility of measurement changes has come to the fore in the past couple of months, as a single item – telecommunications services – has reduced the index by 0.2 per cent, reflecting a new methodological treatment of quality changes in IT and the adoption of unlimited cell phone data packages by Verizon. It is unclear whether changes in methodology will have further effects on digital services and health care products in the future. In an extreme case, this could reduce the official measure of inflation by 0.25-0.5 percentage points per annum.

If that happens, the Fed will be faced with a recurring tendency for the official inflation data to surprise to the downside. The “correct” policy response would probably be to reduce the inflation target from 2 per cent to (say) 1.5 per cent while keeping monetary policy on an unchanged path, but the FOMC will be extremely reluctant to do this. Instead, it would be likely to hold interest rates lower than otherwise until inflation has risen to 2 per cent on the new measurement (or 2.5 per cent on the old methodology).

2. Lower inflation due to a drop in the natural rate of unemployment

Another possible explanation for persistently lower inflation, if it materialises, is that the natural rate of unemployment has fallen, implying that the labour market has more slack than the FOMC currently believes. At present, the FOMC thinks that the natural rate is 4.7 per cent, while the actual unemployment rate is 4.4 per cent. According to mainstream macro theory, this shortfall of unemployment relative to the natural rate should be leading to rising wage and price inflation.
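The arithmetic behind that statement can be made concrete with a textbook expectations-augmented Phillips curve. The sketch below is purely illustrative: the slope values and the assumed inflation expectations are placeholders, not estimates from the Fed or from the research cited further down, and serve only to show how the unemployment gap maps into wage inflation.

```python
# Illustrative only: a textbook expectations-augmented Phillips curve,
#   wage_inflation = expected_inflation - slope * (u - u_star).
# The slope and expected-inflation values are assumptions for illustration.

def phillips_wage_inflation(u, u_star, expected_inflation, slope):
    """Wage inflation implied by the unemployment gap (all in per cent)."""
    return expected_inflation - slope * (u - u_star)

u, u_star, expected = 4.4, 4.7, 2.0        # unemployment figures cited above

for slope in (0.5, 0.15):                  # hypothetical steep vs flattened slopes
    implied = phillips_wage_inflation(u, u_star, expected, slope)
    print(f"slope {slope}: implied wage inflation {implied:.2f}%")

# If the natural rate were instead revised down to, say, 4.3 per cent, the
# same 4.4 per cent unemployment rate would imply slack, not overheating:
print(phillips_wage_inflation(u, 4.3, expected, 0.5))
```

A flatter slope mutes the inflationary signal from a given gap, while a lower natural-rate estimate removes the gap altogether; that is essentially the choice the FOMC faces.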

The FOMC could interpret low price inflation as an indication that its estimate of the natural rate should be reduced, creating more room for the labour market to tighten further. It has done this several times in the past few years, but the evidence for such a revision is thin this time.

According to recent econometric estimates by Jan Hatzius et al at Goldman Sachs, and by Bruce Kasman et al at J.P. Morgan, the Phillips Curve that links unemployment to wage inflation may have flattened in the past decade, but it has not disappeared entirely. Furthermore, the GS measure of underlying wage inflation, the US wage tracker, is rising broadly in line with the path indicated by the Phillips Curve, so the Fed has little reason to shift its models of the inflation mechanism in the labour market at present.

Where does that leave the FOMC? I agree with Tim Duy that its present stance is still biased towards gradual tightening, because it is not yet placing much weight on changes in data methodology or in the Phillips Curve. The committee has now dug itself into a position where it will be extremely reluctant to drop the intended 25 basis point increase in the fed funds rate on 14 June, or the start of balance sheet shrinkage, which will probably be announced in September.

Lowflation will probably become less evident in the coming months. Only if that fails to happen will monetary policy normalisation be placed on hold.

This post originally appeared in the Financial Times.
