How to predict the present

Q: What was your motivation behind creating Now-Casting?

A: Now-Casting is based on the idea that forecasting the future is almost impossible using economic models. In fact, there is a result in the literature that says that if you want to forecast GDP beyond one quarter ahead, the best model is simply to look at the previous quarter.

Yet forecasting the present, in terms of current economic conditions as summarised by current-quarter GDP, is an area where you can learn more. Typically in institutions and markets this has been done through judgemental procedures, where people just look through a load of data and try to find some regularity. However, these procedures have never really been tested, and there was basically no model around to do now-casting.

About 10 years ago we created a model to do this as part of a project that Ben Bernanke of the Federal Reserve Board initiated. He asked me to look into it because I had been working on models that could handle a large amount of data, which is what you need for this type of project.

Since GDP is published with a delay of around a month and a half after the end of a quarter, you can use more timely data to now-cast it. There is a lot of data around, and in principle everything is useful, so you need a model that is able to handle a large volume of data.

The model had to be adapted because the data differ in timeliness: surveys, for example, are more timely than hard data like industrial production, but they may also be noisier. The model solves these technical problems by reading the flow of data in real time and weighting the information appropriately, so that each release has an immediate implication for the estimate of current-quarter GDP.

It is based on an algorithm, so it is not affected by judgement or moods. The model produces a series of GDP estimates that become more and more accurate, and it also flags any changes in these estimates: whenever there is a surprise in the data, the GDP estimate moves, and the model produces a table to illustrate why it has changed.
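The update step described here, reading each release, computing the surprise, and weighting it by its informativeness, can be sketched as a scalar Kalman-style update. This is a heavily simplified illustration, not the actual Now-Casting model (which is a large dynamic factor model); the release names, numbers, and the mapping of releases into GDP-growth units are all assumptions for illustration.

```python
# A minimal sketch of the now-cast update step described above:
# when a release arrives, the "news" is the surprise relative to what
# the model expected, and the estimate moves by a weighted amount.
# This is a scalar Kalman-style update for illustration only; in the
# real model, mapping raw releases (e.g. a PMI reading) into
# GDP-growth units is done by the model's loadings, omitted here.

def nowcast_update(estimate, estimate_var, release, expected, release_var):
    """Revise a GDP now-cast on a new data release.

    A noisy release (large release_var) gets a small weight; a precise
    one moves the estimate more. Uncertainty shrinks with each update.
    """
    news = release - expected                      # the data surprise
    gain = estimate_var / (estimate_var + release_var)
    return (estimate + gain * news,                # revised now-cast
            (1.0 - gain) * estimate_var,           # reduced uncertainty
            news, gain)

# Hypothetical flow of releases within a quarter, already expressed in
# GDP-growth-equivalent units: a timely but noisy survey first, then
# harder data such as industrial production.
estimate, var = 2.0, 0.5
releases = [
    ("survey",                1.2, 1.8, 1.0),  # noisy -> small weight
    ("industrial production", 1.5, 1.7, 0.2),  # precise -> larger weight
]
for name, value, expected, noise_var in releases:
    estimate, var, news, gain = nowcast_update(estimate, var,
                                               value, expected, noise_var)
    print(f"{name}: news {news:+.2f}, weight {gain:.2f}, "
          f"now-cast {estimate:.2f}")
```

Each printed line mirrors the table the model produces: which release surprised, how heavily it was weighted, and the revised estimate.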

Q: Given their sensitivity to real-time data, how volatile are these estimates?

A: At times they are very volatile, particularly when there is a lot of uncertainty around. For example, when we started in June 2011 it was a volatile period in the Euro area. In July the ECB actually increased rates after looking at hard data from the region that were all pointing to positive signals. However, the surveys were all pointing to negative signals, so our model was showing a significant slowdown that you wouldn’t have picked up by looking at typical measures like industrial production. In contrast, for the US things have been quite stable through the recovery, so the estimates have not moved much.

Q: Can the model factor in less quantifiable risks such as the impact of policy shifts by governments?

A: The model is stupid. It only reads what is already in the data. Insofar as such risks are reflected in the surveys, such as consumer confidence and this kind of thing, we will pick them up. Yet we will not pick up a sudden earthquake that is totally unexpected.

If you look at the big 2008 crisis, we were late in picking it up but earlier than others in picking up on the recovery. We were also first to see signs of a slowdown in the Euro area.

Q: You have written quite extensively on the impacts of extraordinary monetary policy. Could a model be built to forecast the effects of these measures?

A: We recently published an article in The Economic Journal in which we tried to see what the effect of the action taken by the European Central Bank in the interbank market was. There we looked at the action not only in terms of bank funding but also in terms of the macro effect. For that we used methods that also handle high-dimensional data, as we looked at lots of different variables, both macro and financial, within the same model. I think ours is actually the only paper to have come up with estimates of the macro effect of these policies.

Q: In the 2007 Federal Reserve Open Market Committee transcripts David Stockton acknowledged that the Fed’s models for financial transmission mechanisms were “rudimentary”. Is that a problem that you have come across in approaching this subject?

A: Yes. What he was probably referring to was the structural model.

In my experience as the head of research at the ECB you want to have a combination of different tools. Structural models are highly parameterised. They are based on a lot of ad hoc assumptions that allow you to make structural statements, but they are very naïve in terms of capturing the features of the financial sector.

So my idea has always been that you can use this kind of model for telling some stories, but you need to have some more empirically based tools that give you an idea of the features of the data.

Q: In your 2010 paper “Monetary policy in exceptional times” you noted the convergence of policy by major central banks in the early years of the crisis. Has there been a growing divergence in policy responses in subsequent years?

A: In the first phase of the crisis the Federal Reserve and the ECB emphasised credit easing rather than quantitative easing. The approach was broadly the same, except that the ECB only acted through banks, because the financial sector in Europe is mostly bank-based, unlike in the US.

Yet in the second phase the Fed moved closer to the Bank of England in using quantitative easing to try to control the long-term interest rate. The ECB, however, stayed where it was by using LTRO to provide liquidity to the banking sector.

In terms of what this means for central banks, both measures increase the size of the balance sheets but the asset composition is quite different.

Q: Do you think that this divergence in policy is making it even harder to gauge the impact of monetary policy?

A: Before the crisis there was a lot of work looking at the Federal Reserve’s interest rate policy. Now they are using different tools, as they are trying to deal not only with inflation but also with a lot of other problems, including dysfunction in financial markets. It’s much harder to quantify the effect because it’s indirect, through the financial system.

In fact this is exactly where our models are not really performing. We don’t know exactly how it works as we have very little empirical knowledge of the relationship between financial markets and the real economy. The relationship is certainly very unstable and we have few hard facts.

Q: Are central banks still concerned with money supply targets?

A: The ECB still keeps this rhetoric about targeting M3, but in fact it has given up targeting any quantity of money because money is completely endogenous. What is important is not so much the money supply but the interbank market, which is not reflected in the monetary statistics.

Over the last 20 years the interbank market has become more and more important, and I think it is very important to monitor. It is money, although in traditional thinking this was overlooked.

Q: What do you make of the recent discussion by members of the Bank of England’s Monetary Policy Committee over the possibility of negative interest rates?

A: It’s definitely a tool and it has also been discussed by the ECB. Of course, you have to be careful that it’s not going to distort market activity.

