Are interest rates “normal” or “log-normal”?
by Raphael Douady, 2013
Traditional fixed-income risk models are based on the assumption that bond risk is directly proportional to the interest rate, i.e. that the interest-rate distribution is “log-normal.” Two corollaries would then follow. Firstly, nominal interest rates could never be negative. Secondly, bond volatility would vanish as interest rates approach zero. Both conclusions are plainly contradicted by the debt situation in a country such as Germany. Should we then infer that the traditional way of modelling fixed income is no longer valid? Our study, based on bond-data back tests covering 40 years that encompass both periods of low rates and periods of high inflation, demonstrates that the traditional approach still applies, provided it is slightly adjusted.
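As an illustration (our notation, not part of the original study), the proportional-volatility assumption can be written as a log-normal diffusion, with $\sigma$ a constant relative volatility and $W_t$ a Brownian motion:
\[
dr_t = \sigma\, r_t\, dW_t ,
\qquad\text{hence}\qquad
r_0 > 0 \;\Rightarrow\; r_t > 0 \ \text{for all } t,
\quad\text{and}\quad
\mathrm{vol}(dr_t) = \sigma\, r_t \;\longrightarrow\; 0 \ \text{as } r_t \to 0 .
\]
Both corollaries above, the positivity of rates and the vanishing of volatility near zero, are built into this specification.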
The first conclusion is that, when interest rates are low, bond risk is proportional to the interest rate increased by 1%. As an example, when interest rates shift from 1% to 0%, bond risk is halved instead of becoming null. The second conclusion is that interest-rate volatility behaves as if the absolute minimum level for a rate were not zero, but -1%.
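A minimal sketch of this adjustment (again in our notation): shifting the rate by 1% in the proportional-volatility rule gives
\[
\mathrm{vol}(dr_t) = \sigma\,(r_t + 1\%),
\qquad
\frac{\mathrm{vol}(r = 0\%)}{\mathrm{vol}(r = 1\%)} = \frac{0\% + 1\%}{1\% + 1\%} = \frac{1}{2},
\]
so the volatility only vanishes at $r = -1\%$, which then plays the role of the effective lower bound for the rate.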